Location: Bellevue, Washington
Contract Type: C2C
Posted: 3 days ago
Closing Date: 12/15/2025
Skills: Redshift, Databricks, AWS Lambda, Python, Java, Node.js, Apache Spark, Delta Lake
Visa Type: Any Visa

Job Role: AWS Architect

Location: Bellevue, WA (onsite)

Type: Long-Term Contract

Must-Have Skills: AWS services including S3, Lambda, Step Functions, Aurora, Redis, and Redshift; NestJS services; Databricks; Python, Java, and Node.js; Apache Spark; Delta Lake

Experience: 12+ years.

Job Summary:

  • We are seeking a highly skilled and experienced AWS Architect who can guide the team and advise on the design of new enhancements.
  • The architect should be able to design, develop, and implement scalable cloud-based solutions.
  • The ideal candidate will have hands-on expertise in AWS services including S3, Lambda, Step Functions, Aurora, Redis, NestJS services, and Redshift, with a strong understanding of cloud architecture.
  • Experience with Databricks is a plus.

Key Responsibilities:

• Lead the design and development of cloud-native applications and data pipelines using AWS services.

• Architect scalable and secure solutions leveraging AWS best practices.

• Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.

• Implement serverless workflows using AWS Lambda and Step Functions.

• Manage and optimize data storage and retrieval using Aurora DB, Redis, and Redshift.

• Ensure high availability, fault tolerance, and performance of cloud applications.

• Provide technical leadership and mentorship to development teams.

• Stay current with AWS innovations and recommend improvements to existing systems.

• Document architecture decisions, technical designs, and implementation details.

Required Skills:

• 7+ years of experience in software development with at least 3 years in AWS cloud architecture.

• Strong proficiency in AWS services: S3, Lambda, Step Functions, Aurora, Redis, Redshift.

• Experience with workflow orchestration tools like Conductor.

• Solid understanding of data modeling, ETL pipelines, and data warehousing.

• Proficiency in programming languages such as Python, Java, and Node.js.

• Familiarity with CI/CD pipelines and Infrastructure as Code (IaC) using tools like CloudFormation or Terraform.

• Excellent problem-solving and communication skills.

• AWS certifications (e.g., Solutions Architect, Developer) are a plus.

• Experience with Databricks for big data processing and analytics.

• Knowledge of Apache Spark, Delta Lake, or other modern data frameworks.