Location: Palo Alto, CA 94304
Contract Type: C2C
Posted: 3 months ago
Closed Date: 01/31/2025
Skills: Tableau/Power BI
Visa Type: H1B, H4 EAD, USC

Role: AWS Data Engineer with Tableau/Power BI

Location: Palo Alto, CA 94304 (100% Onsite from Day 1)

 

1) The candidate should have very good experience writing complex SQL queries.

2) The focus is on SQL and hands-on knowledge of various AWS services (a minimal illustrative sketch follows these two points).
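
For illustration only, a minimal sketch of the kind of complex SQL plus hands-on AWS usage these points refer to: a window-function query submitted to Amazon Athena via boto3. The database, table, column, and S3 bucket names are hypothetical placeholders, not details taken from this posting.

    # Hypothetical sketch only: database, table, column, and bucket names are placeholders.
    import time

    import boto3

    # Window-function query: rank each account's charges by date (complex-SQL flavor).
    SQL = """
    SELECT account_id,
           charge_amount,
           charge_date,
           ROW_NUMBER() OVER (PARTITION BY account_id
                              ORDER BY charge_date DESC) AS rn
    FROM billing.charges
    """

    athena = boto3.client("athena")

    def run_query(sql, database, output_s3):
        """Submit a query to Athena and poll until it finishes."""
        qid = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": output_s3},
        )["QueryExecutionId"]
        while True:
            state = athena.get_query_execution(QueryExecutionId=qid)[
                "QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(2)
        if state != "SUCCEEDED":
            raise RuntimeError(f"Athena query ended in state {state}")
        return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

    rows = run_query(SQL, database="billing", output_s3="s3://example-athena-results/")
    print(f"Fetched {len(rows)} result rows")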

 

Job Qualifications:

• Proficient in data analysis/analytics, BI dashboard design, data warehousing/ETL, BI architecture, and Spark architecture; well-versed in tools/languages such as Python, PySpark, Databricks, AWS, Tableau/Power BI, insight generation, and scaling & optimization.

• Hands-on experience building custom ETL with a focus on design, data modeling, implementation, and maintenance to cater to the reporting needs of customers.

• Strong experience in AWS and other cloud environments: AWS S3, AWS Glue/EMR, Apache Spark, Redshift, Athena.

• Demonstrated success in leading and scaling Data Analyst & BI solutions across diverse organizations, from start-ups to large enterprises.

• Good understanding of analytics-ready data formats such as Parquet, ORC, and JSON, and open table formats such as Apache Iceberg and Apache Hudi.

• Experience working in a fast-paced environment leveraging an agile development framework, with an understanding of test automation and continuous integration.

• Bachelor's degree or higher in software engineering, computer science, or a related field.

• Experience in the healthcare and finance domains is highly desired.

 

Job Responsibilities:

 

• Collaborate with stakeholders to define business requirements and translate them into effective analytical solutions. Oversee end-to-end deliverables, ensuring successful closure of solutions.

• Design and implement highly efficient data pipelines using AWS, PySpark, and Airflow. Automate data completeness, data quality, and data dependency checks (a minimal illustrative sketch follows this list).

• Work with Data Engineers to build data ingestion pipelines to gather data from various data sources, e.g., Salesforce Health Cloud, lab systems, and financial and billing applications.

• Establish a robust and scalable platform for centralized data storage, processing, and analysis in AWS.

• Establish and maintain data governance using the AWS Data Catalog.

• Coordinate closely with Finance, Ops, and Compliance teams for regulatory reporting, auditing, and financial closures.

• Initiate and lead technical design discussions within and across technical teams.

• Create artifacts, such as design and implementation documents, to guide development, implementation, and support.

• Work with DevOps to develop and maintain automated deployments for a regular release cadence.

• Provide second-tier production support.
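
For illustration only, a minimal sketch of the automated data completeness, quality, and dependency checks mentioned in the pipeline bullet above, written in plain PySpark. The S3 paths, column names, and run date are hypothetical placeholders; in practice such checks would typically run as tasks within the Airflow pipelines this posting describes.

    # Hypothetical sketch only: S3 paths, columns, and run date are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    # Curated dataset written by an upstream ingestion task.
    charges = spark.read.parquet("s3://example-curated/billing/charges/")

    # Completeness: the partition for the run date must not be empty.
    run_date_rows = charges.filter(F.col("charge_date") == "2025-01-31").count()
    assert run_date_rows > 0, "Completeness check failed: no rows for run date"

    # Data quality: key columns must not contain nulls.
    null_keys = charges.filter(F.col("account_id").isNull()).count()
    assert null_keys == 0, f"Quality check failed: {null_keys} rows missing account_id"

    # Dependency: every charge must reference an account in the accounts dimension.
    accounts = spark.read.parquet("s3://example-curated/crm/accounts/")
    orphans = charges.join(accounts, "account_id", "left_anti").count()
    assert orphans == 0, f"Dependency check failed: {orphans} orphan charges"

    print("Completeness, quality, and dependency checks passed")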

 

Role: Ping Federate / HYPR Engineer

Location: Southlake, TX (100% Onsite from Day 1); no flexibility

 

Job Requirements:

Must have Skills: Ping Federate, HYPR

Good to Have Skills: Other identity management products

 

Key Responsibilities:

  • Good hands-on experience implementing identity products, specifically PingFederate and HYPR.
  • Excellent troubleshooting skills with PingFederate and HYPR.
  • Strong organizational, problem-solving, and analytical skills.
  • Provide recommendations for problem management to support proactive incident management.
  • Experience with ID verification products such as Ping ID Verify, HYPR Affirm, or LexisNexis is a big bonus.

 

Technical Experience:

  • Ability to develop technical documentation.
  • Experience with Token Generators and Token Processors in PingFederate.

 

Professional Attributes:

  • Good communication and presentation skills.
  • Implementation experience and the ability to work independently without supervision.
  • Experience installing and maintaining PingFederate across different environments.

 

Role: Java Integration Expert

Location: New York, NY 10020 (100% Onsite from Day 1)

 

Job Description:

You will be responsible for designing, developing, and maintaining integration solutions using MuleSoft and Java. The ideal candidate will have a strong background in DevOps CI/CD pipelines, Java, the Spring Framework, JavaScript, React, and SQL/PL-SQL.

 

Key Responsibilities:

  • Design and develop integration solutions using MuleSoft and Java, ensuring seamless data flow between systems.
  • Implement and manage CI/CD pipelines to automate the deployment and testing of integration solutions.
  • Collaborate with cross-functional teams to gather requirements and design integration solutions that meet business needs.
  • Develop and maintain Java-based applications and services as part of the integration solutions.
  • Optimize and manage Oracle Databases to ensure high performance and reliability.
  • Troubleshoot and resolve integration issues, ensuring minimal disruption to business operations.
  • Mentor junior engineers and provide technical guidance on best practices and standards.
  • Stay updated with the latest industry trends and technologies to continuously improve integration solutions.
  • Strong communication and collaboration skills to work effectively in a team environment.
  • Strong analytical and problem-solving skills to troubleshoot issues and implement solutions.
  • Implement unit tests, conduct code reviews, and adhere to best practices to maintain code quality and reliability.
  • Willingness to learn new technologies and stay updated with industry trends.

 

Qualifications:

 

Education: Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent work experience).

 

Technical Skills:

  • Minimum 3 years of experience in integration development using MuleSoft and Java.
  • Minimum 3 years of experience with DevOps tools and CI/CD pipelines (Azure DevOps, Jenkins, GitLab).
  • Experience with web application development using frameworks like Spring & Hibernate.
  • Experience with cloud platforms (e.g., Azure) is a plus.
  • Familiarity with front-end development technologies such as HTML, CSS, and JavaScript.

Soft Skills:

  • Strong problem-solving and analytical abilities.
  • Good communication and collaboration skills.
  • Ability to work effectively in a team environment.