Title: Snowflake/Azure/Data Warehouse - Data Architect
Visa: USC, GC, GC-EAD, H4 EAD (genuine candidates only)
Duration: 12+ months
Location: Richmond, VA or Atlanta, GA (must be local to one of these cities) - Hybrid
Rate: $85/hr
Interview: 2 video rounds
Need an updated LinkedIn profile with a profile picture.
- Must be Senior
- Snowflake
- Must have strong Data Modeling
- Enterprise Data Model (EDM) Implementation
- Conceptual Data Modeling
Job Summary: We are seeking an experienced Snowflake Data Architect to lead the design and implementation of scalable enterprise data warehouses (EDW) while establishing strong ontology and taxonomy frameworks. The ideal candidate will have deep expertise in data modeling, metadata management, and data governance to ensure seamless data integration and accessibility across the organization.
Enterprise Data Warehouse (EDW) Design & Modeling
- Architect, develop, and optimize data warehouse solutions in Snowflake for large-scale enterprise applications.
- Design and implement dimensional and relational data models, ensuring scalability, efficiency, and maintainability.
- Define best practices for schema design, partitioning, clustering, and indexing in Snowflake.
- Develop ETL/ELT pipelines in collaboration with data engineers to streamline data ingestion and transformation processes.
Ontology and Taxonomy Development
- Define and implement ontology frameworks to establish relationships between data entities.
- Develop and manage data taxonomies to standardize data classification and metadata organization.
- Collaborate with business stakeholders to create a semantic layer that improves data discoverability and usability.
- Ensure alignment with industry standards (e.g., FIBO, CDISC, schema.org) for data classification.
Metadata Management & Data Governance
- Implement data cataloging solutions to enhance metadata documentation and lineage tracking.
- Define data governance policies, including data security, access controls, and compliance.
- Work closely with data stewards, analysts, and engineers to ensure consistent data definitions and quality across the organization.
- Establish automated workflows for data classification, tagging, and access provisioning.
Performance Optimization & Best Practices
- Optimize Snowflake compute and storage costs through clustering strategies, caching, and workload management.
- Implement monitoring tools to track query performance, storage consumption, and concurrency.
- Advocate for CI/CD practices, Infrastructure as Code (IaC), and automation in Snowflake deployments.
Required Qualifications
- 15+ years of experience in data architecture, data modeling, and data governance.
- 5+ years of hands-on experience designing and implementing Snowflake-based data solutions.
- Strong expertise in dimensional modeling (Kimball/Inmon).
- Proven experience in ontology and taxonomy design, preferably in enterprise environments.
- Proficiency in SQL, Python, and dbt (Data Build Tool) for data modeling and transformations.
- Experience with metadata management tools (e.g., Collibra, Alation, Atlan) and data catalogs.
- Knowledge of cloud-based architectures (Azure) and integration with Snowflake.
- Understanding of data privacy, security frameworks (GDPR, CCPA, HIPAA), and access controls.
- Strong problem-solving skills and the ability to communicate complex technical concepts to business users.
Important Notes
• Not someone who is only technical and tactical — a purely strategic hire who can't execute, or a purely tactical one who can't hold a business conversation with stakeholders, won't fit
• Must be highly technical, tactical, and strategic, and able to talk with stakeholders - CAF
• Needs to help drive impact
• Level of Director or AVP // working and communicating with executives
• Ideally in person - Richmond or ATL (CAF is in ATL) // one week per quarter on site, or 2 days per week
• Snowflake is a huge requirement -- must be recent experience
• Azure Data Factory experience is a huge plus - ideal to have
• Main platform and ecosystem is Snowflake -- strong Snowflake experience required
Prefer my architects to be hands-on: 70% execution, 30% strategic -- maybe even 60/40 -- but they have to be able to develop, prototype, and provide the first foundational component. Bring intentional thought leadership on how the data modeling is progressing. There's a journey there.