Contract Type: C2C
Posted: 4 days ago
Closed Date: 04/21/2026
Skills: Snowflake → Databricks, BigQuery
Visa Type: Any Visa

Job Title: Data Product Manager / Technical Program Manager

Location: Lewisville, TX / Onsite


Data Product Manager / Technical Program Manager

Data Platform Migration: Snowflake → Databricks

About the Engagement

Client is undertaking a strategic re-platforming of its enterprise data infrastructure from Snowflake to Databricks. The initiative spans 30+ data sources across APIs, flat files, and SQL databases. Existing Snowflake pipelines will first need to be reverse engineered and documented before migration begins.

This contractor will serve as the connective tissue between business stakeholders, data engineers, and legal leadership — owning the product vision, backlog management, story creation, delivery roadmap, and day-to-day execution.

Migration Phases You Will Own

Discovery & Reverse Engineering

Audit and document all existing Snowflake pipelines, transformations, and data flows across 30+ sources before any migration work begins.

Source Inventory & Data Profiling

Catalog all ingestion sources — REST/GraphQL APIs, structured file feeds, SQL databases — and assess data quality, volume, and criticality.

Architecture Translation & Backlog Creation

Translate legacy Snowflake patterns into Databricks-aligned designs; break these down into Jira epics, stories, and tasks for data engineering teams.

Delivery & Stakeholder Management

Drive sprint execution, manage dependencies, surface blockers, and keep firm leadership and business stakeholders aligned throughout migration.

Cutover & Validation

Coordinate pipeline cutover sequencing, define acceptance criteria, and ensure data quality validation sign-off before decommissioning Snowflake workloads.

Core Responsibilities

Product & Delivery

  • Own and manage the full product backlog in Jira across all migration workstreams
  • Write clear, well-scoped epics and user stories for data engineering teams
  • Build and maintain a multi-phase product roadmap with milestone tracking
  • Define and monitor KPIs for migration progress and pipeline health
  • Facilitate sprint planning, backlog grooming, and retrospectives

Technical Translation

  • Reverse engineer existing Snowflake pipelines and produce documentation
  • Translate business and product requirements into actionable engineering specs
  • Apply basic data modeling skills to validate proposed Databricks designs
  • Understand ingestion patterns across APIs, file-based, and SQL source systems
  • Bridge the gap between non-technical legal stakeholders and data engineers

Stakeholder Management

  • Serve as primary point of contact for firm leadership on migration status
  • Manage expectations across legal, IT, compliance, and business stakeholders
  • Produce status reports, risk logs, and decision memos
  • Escalate blockers and facilitate cross-functional resolution

Program Management

  • Maintain project timelines, dependency maps, and critical path tracking
  • Manage vendor and contractor relationships where applicable
  • Identify and mitigate delivery risks across 30+ source migration tracks
  • Ensure alignment between engineering capacity and roadmap commitments

REQUIRED

  • 5+ years in data product management, technical program management, or a data-adjacent delivery role
  • Prior experience on a data platform migration or cloud data warehouse project (Snowflake, Databricks, Redshift, BigQuery, or equivalent)
  • Proven ability to write high-quality Jira stories and manage an agile backlog
  • Comfort reading and interpreting SQL; familiarity with data pipeline concepts (ingestion, transformation, orchestration)
  • Strong stakeholder communication skills, with the ability to present to both engineers and management

TECHNICAL FAMILIARITY EXPECTED

  • Working knowledge of at least one of: Databricks, Snowflake, dbt
  • Understanding of data ingestion patterns: REST API, SFTP/file-based, JDBC/ODBC
  • Basic data modeling concepts (dimensional modeling, medallion architecture, or similar)

NICE TO HAVE

  • Experience in a professional services, legal, or regulated industry environment (strong plus)
  • Familiarity with legal data systems (matter management, billing, DMS)
  • Hands-on pipeline development experience (even if not the primary role)

What Success Looks Like

  • All existing Snowflake pipelines across the 30+ sources are reverse engineered and fully documented within the first 60 days
  • A sequenced, risk-weighted migration roadmap is delivered and approved by firm leadership
  • Data engineering teams have a clean, well-groomed backlog with no ambiguous stories blocking sprints
  • Migration completes on schedule with zero critical data quality regressions at cutover
  • Stakeholders across legal, IT, and business functions are consistently informed and aligned