Location: Charlotte, North Carolina (NC)
Contract Type: C2C
Posted: 1 month ago
Closed Date: 11/13/2025
Skills: Data Science, AI Engineering, Snowflake
Visa Type: Any Visa

Title: Data Engineer (must have elite Snowflake and Informatica skills)


Location: Charlotte, NC (Hybrid); local candidates required

Duration: 4+ months, with possible extension


Visa: Green Card holders / US Citizens

Full LinkedIn profile required. MUST HAVE ELITE SNOWFLAKE AND INFORMATICA. Charlotte locals preferred; remote will be considered for candidates who have worked at Grant Thornton before.

The client's vision and strategy consist of three pillars: Application Modernization, AI, and Data. In Data, our vision is to enable data-driven decision-making by implementing a firm-wide data strategy that ensures accurate, meaningful reporting and generates actionable insights and opportunities.

To realize this vision, IT is establishing a Data & Analytics function under the Chief Technology Officer. This organization's goal will be to drive technological innovation, ensure the alignment of technology strategies with business objectives, and enhance the firm's competitive edge.

The Data Engineer is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. This role plays a crucial part in building and managing the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise. You will collaborate with various groups within the Technology organization and across the firm to deliver high-value technology solutions and achieve organizational success.

• Designs and develops data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems

• Collaborates with architects, data scientists, AI engineers, and analysts to manage data as an asset and optimize models and algorithms for data quality, security, and governance

• Integrates data from different sources, including databases, data warehouses, APIs, and external systems

• Ensures data consistency and integrity during the integration process, performing data validation and cleaning as needed

• Transforms raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques

• Optimizes data pipelines and data processing workflows for performance, scalability, and efficiency

• Monitors and tunes data systems, identifies and resolves performance bottlenecks, and implements caching and indexing strategies to enhance query performance

• Implements data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data

• Takes authority, responsibility, and accountability for exploiting the value of enterprise information assets and the analytics used to render insights for decision-making, automated decisions, and augmentation of human performance