DataOps Engineer (26-27)

IDEA Public Schools
El Paso, TX

About The Position

The DataOps Engineer designs and maintains the automation, testing, and deployment infrastructure that allows IDEA’s data platform to operate reliably at scale. This role applies DevOps principles to data systems: eliminating manual processes, improving deployment safety, and enabling the Platform and Analytics Engineering teams to ship faster without sacrificing reliability. Reporting to the Managing Director of Data Platform & Engineering, this engineer serves as the organization’s automation and reliability expert, partnering closely with Platform Engineering (infrastructure and ingestion) and Analytics Engineering (dbt transformations and data quality). This role is ideal for someone who enjoys building systems, tooling, and automation that make other engineers dramatically more effective.

Requirements

  • Hands-on experience building and maintaining CI/CD pipelines in production environments.
  • Strong proficiency with infrastructure-as-code tools (Terraform or equivalent).
  • Solid programming and scripting skills (Python, shell) for automation and tooling.
  • Experience operating production systems with monitoring, alerting, and incident response.
  • Familiarity with modern data platforms and analytics engineering workflows (e.g., Snowflake, dbt).
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field, or equivalent practical experience.
  • 5+ years of experience in DevOps, platform engineering, SRE, or similar technical roles.
  • Demonstrated experience automating deployments, infrastructure, and operational workflows with measurable impact.

Nice To Haves

  • Experience supporting data platforms, analytics engineering workflows, or large-scale data pipelines in production environments.
  • Hands-on experience with Snowflake administration, performance tuning, or environment management.
  • Experience deploying or supporting dbt projects, including CI/CD integration and testing automation.
  • Familiarity with data ingestion tools and APIs (e.g., Fivetran, Airbyte) and automating their configuration or deployment.
  • Exposure to containerization or site reliability engineering (SRE) practices.
  • Experience working in regulated or privacy-sensitive environments (education, healthcare, or public sector).
  • Relevant certifications (e.g., Terraform Associate, cloud DevOps certifications) or demonstrated equivalent expertise.
  • Contributions to open-source tooling, internal developer platforms, or shared automation frameworks.

Responsibilities

  • Design, build, and maintain CI/CD pipelines for data workflows, enabling automated testing, validation, and safe deployment to production.
  • Implement deployment automation for dbt projects, Snowflake infrastructure, and data ingestion tools with proper environment promotion and approval gates.
  • Develop safe deployment strategies (rollback, canary, blue/green) that reduce risk and downtime.
  • Maintain CI/CD tooling, documentation, and runbooks to ensure reliability and team adoption.
  • Own infrastructure-as-code for the data platform, using Terraform to provision and manage Snowflake environments and related resources.
  • Automate operational tasks such as environment setup, access provisioning, configuration management, and resource monitoring.
  • Build reusable automation tools, scripts, and templates that enable self-service provisioning and reduce manual toil.
  • Partner with Platform Engineering to align on IaC standards, patterns, and shared modules.
  • Design and implement monitoring and observability for data pipelines, dbt models, and platform health.
  • Build dashboards and alerts tracking pipeline success, data freshness, test results, and system performance.
  • Implement intelligent alerting with clear escalation paths and minimal noise.
  • Establish incident response practices, runbooks, and post-incident learning loops.
  • Implement automated testing frameworks for data pipelines, including schema validation, regression testing, and data contract checks.
  • Enable automated execution and reporting of data quality tests in CI/CD workflows.
  • Partner with Analytics Engineering to standardize dbt test patterns and quality enforcement strategies.
  • Maintain test environments, fixtures, and safe testing workflows for production-bound changes.
  • Serve as a DataOps advisor to Platform and Analytics Engineering teams, identifying automation opportunities and reducing friction.
  • Review pull requests with a focus on deployment safety, testing coverage, and operational risk.
  • Document DataOps standards and best practices, enabling scalable self-service adoption.
  • Partner with the Managing Director on DataOps strategy, tooling decisions, and technical roadmap.

Benefits

  • Medical, dental, and vision plans; disability and life insurance; parenting benefits; flexible spending account options; generous vacation time; referral bonuses; professional development; and a 403(b) plan.