Staff Software Engineer, Data Platform

Zocdoc
New York, NY (Hybrid)

About The Position

We’re hiring a Staff Software Engineer (Data Platform) to define and lead the evolution of Zocdoc’s data platform at the intersection of data engineering and product engineering. This role owns how data moves into, across, and out of our lakehouse and warehouse ecosystems. You’ll shape the contracts, access patterns, APIs, governance controls, and interoperability standards that enable teams to reliably produce and consume data at scale.

Unlike a traditional data engineering role, this position is focused on platform experience and data product design: building the frameworks, contracts, and service interfaces that make data trustworthy, compliant, and easy to use for both technical and non-technical stakeholders.

You will define how:

  • Product teams publish high-quality, validated data into the platform
  • Analytics and ML teams consume governed datasets with confidence
  • Reverse ETL and activation patterns safely operationalize data back into product systems
  • Access control, compliance, and governance are embedded into platform design by default

This is a highly cross-functional leadership role requiring deep data engineering expertise, strong product thinking, and the ability to influence standards across the organization.

Requirements

  • 8+ years of experience in data engineering, platform engineering, or backend platform development.
  • Demonstrated experience designing data contracts, schema governance, or producer/consumer standards at scale.
  • Strong expertise in Python and SQL, with hands-on experience building scalable data frameworks.
  • Experience with distributed data systems such as Spark (Databricks or EMR) and modern lakehouse architectures (Delta Lake / Iceberg).
  • Experience with data warehouses such as Snowflake and strong understanding of performance and access patterns.
  • Familiarity with schema registry systems and schema evolution in streaming systems (e.g., Kafka).
  • Experience building APIs, shared libraries, or platform services adopted by multiple teams.
  • Strong understanding of access control, RBAC, and compliance constraints in regulated environments.
  • Proven ability to lead cross-functional architectural initiatives across product, analytics, and infrastructure teams.
  • Clear communication skills and a track record of influencing standards across an organization.
  • Experience working with AI-assisted development tools or cloud-based coding environments (e.g., Claude Code, Codex, Cursor, internal code generation frameworks, or similar systems).
  • Strong understanding of governance considerations for GenAI systems, including access control, prompt safety, sensitive data handling, and auditability.
  • Perspective on how structured data models and contracts improve AI reliability and downstream automation.
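To give a concrete flavor of the data-contract and schema-validation work called for above, here is a minimal sketch of producer-side contract enforcement. The event name, fields, and types are hypothetical examples, not Zocdoc's actual schemas, and a production platform would typically back this with a schema registry rather than an in-process check:

```python
from dataclasses import dataclass, fields

# Hypothetical v1 contract for an "appointment booked" event.
# Field names and types are illustrative only.
@dataclass(frozen=True)
class AppointmentBookedV1:
    schema_version: str
    event_id: str
    provider_id: str
    booked_at_epoch_ms: int

def validate_event(payload: dict) -> AppointmentBookedV1:
    """Enforce the contract at the producer boundary: reject unknown
    fields, missing fields, and wrong types before publishing."""
    declared = {f.name for f in fields(AppointmentBookedV1)}
    unknown = set(payload) - declared
    if unknown:
        raise ValueError(f"fields not in contract: {sorted(unknown)}")
    missing = declared - set(payload)
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    event = AppointmentBookedV1(**payload)
    for f in fields(AppointmentBookedV1):
        if not isinstance(getattr(event, f.name), f.type):
            raise TypeError(f"{f.name} must be {f.type.__name__}")
    return event

ok = validate_event({
    "schema_version": "1",
    "event_id": "evt-123",
    "provider_id": "prov-9",
    "booked_at_epoch_ms": 1700000000000,
})
```

The key design point is that validation happens where the data is produced, so consumers downstream can trust the curated layer instead of re-checking every payload.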

Nice To Haves

  • Experience designing reverse ETL frameworks or operational activation pipelines.
  • Familiarity with metadata and governance platforms (e.g., Unity Catalog, Collibra, OpenMetadata).
  • Experience building internal developer platforms for event logging or data publishing.
  • Experience working in regulated environments involving PHI/PII.
  • Experience integrating streaming systems (Kafka/Kinesis) with warehouse and lakehouse ecosystems.

Responsibilities

  • Defining and evolving data contract standards across the company, including schema enforcement, versioning, and validation patterns.
  • Designing interoperable ingestion and publishing frameworks that enable upstream producers (e.g., product engineering teams) to integrate seamlessly with the data platform.
  • Building and standardizing APIs, libraries, or SDKs that simplify event logging, schema validation, and contract compliance.
  • Establishing best practices for schema registry usage and distributed schema validation across streaming and batch systems (e.g., Kafka-based systems).
  • Designing clear patterns for:
      • When to use the data lake vs. the warehouse
      • How curated layers are exposed
      • How downstream consumers access data safely
  • Leading reverse ETL and activation architecture to support operational use cases.
  • Defining and enforcing access control, governance, and compliance standards (e.g., PHI/PII handling, DEID boundaries, RBAC).
  • Partnering with Product Engineering, Security, Compliance, Analytics Engineering, and Infrastructure to align on standards and long-term direction.
  • Mentoring engineers and influencing engineering culture around data quality, ownership, and contracts.
  • Driving adoption of AI-assisted development practices (e.g., cloud-based coding environments, internal AI tooling, or agentic workflows) to accelerate platform delivery.
  • Designing guardrails for AI access to data systems, including scoped permissions, auditing, and compliance-aware controls.
  • Partnering with product and AI teams to ensure our data contracts, schemas, and curated layers are AI-consumable and safe by default.
  • Evaluating how internal data platform assets can power AI use cases and intelligent automation across the company.
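As a rough sketch of the access-control and AI-guardrail responsibilities above (role names, dataset names, and the in-memory grant table are hypothetical; a real platform would delegate this to a governance service such as a catalog), scoped permissions plus auditing might look like:

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("data_access_audit")

# Hypothetical role-to-dataset grants; illustrative only.
GRANTS = {
    "analytics_reader": {"curated.appointments", "curated.providers"},
    "ml_training": {"curated.appointments"},
}

@dataclass
class Principal:
    name: str
    roles: list

def check_access(principal: Principal, dataset: str) -> bool:
    """Return True if any of the principal's roles grants the dataset,
    writing an audit record for every decision (allow or deny)."""
    allowed = any(dataset in GRANTS.get(r, set()) for r in principal.roles)
    audit_log.info("principal=%s dataset=%s allowed=%s",
                   principal.name, dataset, allowed)
    return allowed

agent = Principal("ai-agent-1", ["ml_training"])
check_access(agent, "curated.appointments")  # True
check_access(agent, "curated.providers")     # False
```

The point is that an AI agent is just another principal: it gets the narrowest role that covers its task, and every access attempt is auditable.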

Benefits

  • Flexible, hybrid work environment at our convenient Soho location
  • Unlimited Vacation
  • 100% paid employee health benefit options (including medical, dental, and vision)
  • Commuter Benefits
  • 401(k) with employer-funded match
  • Corporate wellness program with Wellhub
  • Sabbatical leave (for employees with 5+ years of service)
  • Competitive paid parental leave and fertility/family planning reimbursement
  • Cell phone reimbursement
  • Catered lunch every day, along with beverages and snacks
  • Employee Resource Groups and ZocClubs to promote shared community and belonging
© 2024 Teal Labs, Inc