Location: Charlotte | Bay Area | Dallas | Iselin | Chandler, Arizona (AZ), California (CA), New Jersey (NJ), North Carolina (NC), Texas (TX)
Contract Type: C2C
Posted: 1 month ago
Closed Date: 11/14/2025
Skills: BigQuery, Dataproc, Dataflow, Cloud Storage
Visa Type: Any Visa

Role: GCP Data Engineer

Location: Charlotte, NC | Bay Area, CA | Dallas, TX | Iselin, NJ | Chandler, AZ (100% onsite)

Hire Type: Contract

Experience: 13+ years

 

Background:

 

As tenants transition to Google Cloud Platform (GCP) to comply with data center exit mandates, they encounter challenges that require extensive support. This initiative focuses on creating a structured tenant engagement model, including education on platform capabilities, migration best practices, and hands-on guidance throughout onboarding. Key activities include acting as a concierge service for queries, providing practical demonstrations and reusable artifacts, and maintaining knowledge resources. The model combines helpdesk support with high-touch consulting to deliver a comprehensive and consistent migration experience.


Technical Skills

- Deep expertise in Google Cloud Platform (GCP) services: BigQuery, Dataproc, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, Dataplex

- Experience with Teradata and Hadoop ecosystems (Hive, HDFS, MapReduce)

- Proficiency in SQL (BTEQ, Teradata SQL, and BigQuery SQL)

- Scripting in Python

- CI/CD pipeline setup using GitHub Actions, Jenkins, Harness

- Experience with the GCP BigQuery Migration Service

- Building automated data validation and reconciliation frameworks

 

Soft Skills

- Effective communication and documentation abilities

- Experience working in Agile/Scrum environments

- Ability to collaborate with cross-functional teams (data architects, cloud engineers, and business analysts)