Location: Pennsylvania (PA)
Contract Type: C2C
Posted: 4 hours ago
Closed Date: 04/10/2026
Skills: Data Engineer, ETL/ELT data pipelines using PySpark
Visa Type: USC, Any Visa

Job Title: Data Engineer (PySpark + Redshift)

Location: Philadelphia, PA (Onsite)

Duration: 12+ months / Long-term Contract

Experience: Candidates with 14+ years of experience required

Job Description:

We are looking for a skilled Data Engineer with strong experience in PySpark and Amazon Redshift to design, develop, and optimize scalable data pipelines. The ideal candidate will have hands-on experience in big data processing, cloud platforms, and data warehousing solutions.

Key Responsibilities:

  • Design, build, and maintain scalable ETL/ELT data pipelines using PySpark
  • Develop and optimize data workflows on Amazon Redshift
  • Work with large-scale structured and unstructured datasets
  • Perform data modeling, data cleansing, and transformation
  • Optimize query performance and data storage strategies in Redshift
  • Collaborate with data scientists, analysts, and business teams
  • Ensure data quality, integrity, and governance standards
  • Monitor and troubleshoot data pipeline issues