Location: Concord, California (CA)
Contract Type: C2C
Posted: 4 hours ago
Closed Date: 03/17/2026
Skills: Design and implement robust, scalable, and cost-effective data solutions on Google Cloud
Visa Type: Any Visa

Role: GCP Data Engineer

Location: Concord, CA (Onsite)

Hire Type: Contract

H-1B candidates accepted. Non-local candidates must provide a relocation email along with I-94 and travel history.


Job Description

What You'll Do (Responsibilities):

  • Architect & Design: Design and implement robust, scalable, and cost-effective data solutions on Google Cloud, serving as the target architecture for migrated workloads.
  • Develop Reusable Frameworks & Accelerators: Design, build, and maintain reusable frameworks, templates, and code libraries to standardize and accelerate data engineering work. This includes creating boilerplate pipeline structures, generic data validation modules, and automated deployment patterns that other engineers will leverage.
  • Migrate & Modernize: Lead the hands-on migration of data and processes from on-premises systems like Teradata and Hadoop to Google Cloud services, with a primary focus on BigQuery, Google Cloud Storage (GCS), Dataflow, and Dataproc.
  • ETL/ELT Transformation: Analyze, deconstruct, and translate complex legacy ETL logic from tools like Informatica and Teradata BTEQ/Stored Procedures into modern, cloud-native pipelines, leveraging the frameworks/tooling you help create.
  • Pipeline Development: Build and automate new data pipelines for batch and streaming data using Python, SQL, and GCP's core services, ensuring all new development contributes to and benefits from our shared engineering frameworks.
  • Performance & Cost Optimization: Proactively optimize BigQuery performance through effective partitioning, clustering, and query tuning.
  • Data Validation & Governance: Develop and implement rigorous data validation frameworks to ensure data integrity and accuracy post-migration. Collaborate with governance teams to apply data security, lineage, and cataloging using tools like Google Cloud Data Catalog and Dataplex.
  • Collaboration & Mentorship: Work closely with on-premises data experts, business analysts, and other engineers to understand requirements, ensure a smooth transition, and act as a subject matter expert and mentor for GCP and internal framework best practices.
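
For candidates gauging fit, the partitioning and clustering practices mentioned above can be sketched in a few lines. This is a minimal illustration only; the dataset, table, and column names are hypothetical and not part of the role description.

```python
# Hedged sketch of BigQuery cost-optimization DDL: partition by a DATE
# column and cluster by a high-cardinality filter column so queries scan
# fewer bytes. All identifiers below (analytics.transactions, event_date,
# customer_id) are made up for illustration.

def build_partitioned_table_ddl(
    table: str,
    partition_column: str,
    cluster_columns: list[str],
) -> str:
    """Return a BigQuery CREATE TABLE statement with partitioning
    and clustering clauses appended."""
    clusters = ", ".join(cluster_columns)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  event_date DATE,\n"
        f"  customer_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY {partition_column}\n"
        f"CLUSTER BY {clusters}"
    )

ddl = build_partitioned_table_ddl(
    "analytics.transactions",   # hypothetical dataset.table
    "event_date",               # daily partitions prune scanned data
    ["customer_id"],            # clustering narrows scans within a partition
)
print(ddl)
```

Partition pruning plus clustering is the standard lever for both query latency and cost in BigQuery, which is why the posting lists it alongside query tuning.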