Job Title: Data Product Manager / Technical Program Manager
Location: Lewisville, TX / Onsite
Data Platform Migration: Snowflake → Databricks
About the Engagement
The client is undertaking a strategic re-platforming of its enterprise data infrastructure from Snowflake to Databricks. The initiative spans 30+ data sources across APIs, flat files, and SQL databases. Existing Snowflake pipelines must first be reverse-engineered and documented before migration begins.
This contractor will serve as the connective tissue between business stakeholders, data engineers, and legal leadership — owning the product vision, backlog management, story creation, delivery roadmap, and day-to-day execution.
Migration Phases You Will Own
Discovery & Reverse Engineering
Audit and document all existing Snowflake pipelines, transformations, and data flows across 30+ sources before any migration work begins.
Source Inventory & Data Profiling
Catalog all ingestion sources — REST/GraphQL APIs, structured file feeds, SQL databases — and assess data quality, volume, and criticality.
Architecture Translation & Backlog Creation
Translate legacy Snowflake patterns into Databricks-aligned designs; break these down into Jira epics, stories, and tasks for data engineering teams.
Delivery & Stakeholder Management
Drive sprint execution, manage dependencies, surface blockers, and keep firm leadership and business stakeholders aligned throughout migration.
Cutover & Validation
Coordinate pipeline cutover sequencing, define acceptance criteria, and ensure data quality validation sign-off before decommissioning Snowflake workloads.
Core Responsibilities
Product & Delivery
Technical Translation
Stakeholder Management
Program Management
REQUIRED
5+ years in data product management, technical program management, or a data-adjacent delivery role.
Prior experience on a data platform migration or cloud data warehouse project (Snowflake, Databricks, Redshift, BigQuery, or equivalent).
Proven ability to write high-quality Jira stories and manage an agile backlog.
Comfort reading and interpreting SQL; familiarity with data pipeline concepts (ingestion, transformation, orchestration).
Strong stakeholder communication skills, with the ability to present to both engineers and management.
TECHNICAL FAMILIARITY EXPECTED
Working knowledge of at least one of: Databricks, Snowflake, dbt.
Understanding of data ingestion patterns: REST API, SFTP/file-based, JDBC/ODBC.
Basic data modeling concepts (dimensional modeling, medallion architecture, or similar).
NICE TO HAVE
Experience in a professional services, legal, or regulated industry environment.
Familiarity with legal data systems (matter management, billing, DMS).
Hands-on pipeline development experience (even if not the primary role).
What Success Looks Like