Senior Data Engineer – Upstream Oil & Gas

Crescent Energy – Houston, TX
Onsite

About The Position

Crescent is a differentiated U.S. energy company committed to delivering value through a disciplined, returns-driven growth-through-acquisition strategy and consistent return of capital. Our long-life, balanced portfolio combines stable cash flows from low-decline production with a deep, high-quality development inventory. Crescent is a top-three producer (by gross operated production) in the Eagle Ford basin. Crescent’s leadership is an experienced team of investment, financial, and industry professionals with proven investment and operating expertise. For more than a decade, Crescent and our predecessors have executed on a consistent strategy focused on cash flow, risk management, and returns. Through disciplined and accretive investments, we have tripled the size of our company since going public in December 2021 while maintaining a strong balance sheet.

The Sr. Data Engineer – Upstream Oil & Gas is responsible for designing, building, and operating scalable, reliable, and secure data pipelines that support enterprise analytics, reporting, and advanced data use cases. This role is critical to enabling data-driven decision-making by ensuring trusted, high-quality data is available across operational, engineering, financial, and HSE domains. The position requires hands-on experience in upstream oil and gas and proven expertise with enterprise-scale cloud data platforms, with Snowflake (or a comparable platform) as a core prerequisite. The Data Engineer will play a key role in advancing the company’s cloud-based data platform and accelerating its journey toward becoming a data-driven, technology-enabled enterprise.

Requirements

  • Education: Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field.
  • Technical Skills & Experience: 5+ years of data engineering/analytics experience.
  • Experience with Power BI and Spotfire.
  • Advanced proficiency in SQL, dbt, and Snowflake.
  • Familiarity with medallion architecture, CI/CD, and Git.
  • Industry experience in upstream oil and gas is required, ideally with exposure to drilling, production operations, reserves, and regulatory reporting.
  • Proven experience deploying and managing cloud data platforms; Snowflake expertise is essential.
  • Willingness to work on-site in our downtown office Monday through Friday.

Nice To Haves

  • Strong understanding of oil and gas data domains (e.g., wells, completions, SCADA, production volumes, reserves, HSE).
  • Expertise in data modeling, data integration (ETL/ELT), cataloging, lineage tracking, and security best practices.

Responsibilities

Data Engineering & Pipeline Development

  • Design, build, and maintain scalable, fault-tolerant ETL/ELT pipelines supporting structured and semi-structured data across upstream operational, engineering, financial, and ESG domains.
  • Develop and operate data ingestion pipelines using Fivetran (or similar tools), including connector configuration, schema management, and change data capture (CDC) patterns.
  • Engineer and maintain data pipelines within the Snowflake environment, ensuring reliability, performance, and cost efficiency.
  • Monitor, troubleshoot, and remediate data pipeline failures, latency issues, and data inconsistencies in production environments.
  • Implement logging, alerting, and basic observability for data pipelines to support operational reliability.

Data Transformation & Medallion Architecture

  • Implement data transformations using dbt, applying modular, testable, and version-controlled transformation logic.
  • Develop and maintain datasets aligned to a medallion architecture, ensuring clear separation between raw, refined, and analytics-ready data layers.
  • Apply data modeling best practices (dimensional, star schema, domain-oriented models) to support analytics and reporting use cases.
  • Maintain documentation and tests within dbt to improve data transparency, lineage, and maintainability.

Analytics & Reporting Enablement

  • Design and deliver standardized, production-grade dashboards and reports using Power BI and Spotfire.
  • Work with business users to define KPIs, metrics, and reporting requirements, translating them into governed, scalable data models.
  • Transition ad-hoc and spreadsheet-based reporting into certified, reusable analytics assets.
  • Support semantic models and curated datasets that enable consistent, performant reporting across teams.

Data Quality, Governance & Reliability

  • Implement automated data quality checks, validations, and reconciliation processes within data pipelines.
  • Publish and maintain certified datasets and analytics-ready tables to support self-service consumption.
  • Enforce metric consistency by aligning transformations and reporting logic to approved definitions and standards.
  • Apply CI/CD, version control, and testing practices to data pipelines and dbt projects.

Business Partnership & Enablement

  • Work directly with business analysts, operations users, and reporting consumers to clarify data requirements and troubleshoot issues.
  • Support day-to-day data needs by delivering reliable datasets and reports aligned to defined standards and priorities.
  • Provide technical input into data solutions while following established architecture, governance, and platform guidelines.
  • Assist in onboarding users to certified datasets and reports, helping reduce dependency on ad-hoc data extracts.
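
The automated data quality and reconciliation duties described above can be sketched in a few lines. The example below is a minimal, hypothetical illustration of a source-to-target reconciliation check; the field names (`well_id`, `oil_bbl`) and the 0.1% volume tolerance are illustrative assumptions, not Crescent’s actual standards or tooling.

```python
# Hypothetical sketch of an automated reconciliation check: compare a
# source extract against its loaded target and flag drift in row counts
# or summed production volumes. All names and thresholds are illustrative.

def reconcile(source_rows, target_rows, volume_key="oil_bbl", tolerance=0.001):
    """Return check results for row count and summed-volume agreement."""
    src_total = sum(r[volume_key] for r in source_rows)
    tgt_total = sum(r[volume_key] for r in target_rows)
    drift = abs(src_total - tgt_total) / src_total if src_total else 0.0
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "volume_drift": drift,
        "passed": len(source_rows) == len(target_rows) and drift <= tolerance,
    }

# A dropped row surfaces as a failed check.
source = [{"well_id": "W1", "oil_bbl": 120.0}, {"well_id": "W2", "oil_bbl": 80.0}]
target = [{"well_id": "W1", "oil_bbl": 120.0}]
result = reconcile(source, target)
```

In practice, checks like this would typically live as dbt tests or pipeline tasks that alert on failure rather than as standalone scripts.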