Commercial IT Sr. Data Solution Developer

ConocoPhillips, Houston, TX

About The Position

We are seeking a highly skilled Commercial Sr. Data Solution Developer with commercial/energy trading experience to join our Commercial IT team. Reporting to the IT Supervisor, Commercial Data, this role bridges the gap between raw energy market data and actionable commercial insights by designing high-performance data solutions, data models, and automated data pipelines. You will work closely with data architects, Commercial analysts, and model developers to create scalable data assets supporting forecasting, optimization, risk modeling, and real-time decision making.

This role is part of a team comprising a data architect, a data governance analyst, and data solution developers responsible for implementing Commercial data solutions and data pipelines while ensuring a high level of data quality based on corporate and industry standards. The team also develops and enforces standards for data management, implements data technology solutions for structured and unstructured datasets, and builds data catalogs and data lineage to enhance visibility and access to data for Commercial groups.

Additionally, this role works closely with Commercial data/quant analysts, data scientists, model owners, project/program managers, the Corporate IT operational support managed-services team, vendors, data architects, and the data governance analyst to provide clean, reliable data and high-quality data solutions. The role requires a demonstrated focus on current technologies and practical analytical experience.

Requirements

  • Legally authorized to work in the United States
  • Bachelor's degree or higher in Computer Science, Information Systems, Information Science, or related field
  • 5 or more years of experience in full-stack software development using Python
  • 5 or more years of experience with Commercial/Energy Trading
  • 1 or more years of experience with modern cloud-based data platform technologies (e.g., Snowflake, ADLS, Databricks)
  • Advanced level of knowledge of building and orchestrating pipelines using Airflow, Databricks Jobs, or ADF
  • Advanced level of knowledge of data visualization/analytics tools (Spotfire, Power BI, Sigma) plus strong Excel skills
  • Advanced level of knowledge of SQL and relational data modeling experience across traditional RDBMS and cloud data platforms (Snowflake, SQL Server, PostgreSQL)
  • Advanced level of knowledge of GitHub, CI/CD workflows, Docker, and Infrastructure as Code (Terraform)
  • Intermediate level of knowledge of developing REST APIs (Flask, FastAPI, or similar)
  • Intermediate level of knowledge of web-scraping frameworks (Selenium, Beautiful Soup, or similar)
  • Intermediate level of knowledge of Cloud DevOps (Azure, AWS)
  • Intermediate knowledge of project processes and methodology to support project management initiatives and delivery
  • Basic level of knowledge of Plotly Dash or other Python-based dashboard frameworks
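To make the pipeline skills above concrete, here is a minimal, hypothetical ETL step chain in plain Python. The function names and the trade-record schema are invented for illustration; in practice each step would run as an orchestrated task (Airflow, Databricks Jobs, or ADF) writing to a platform such as Snowflake or ADLS rather than an in-memory list:

```python
import json

# Hypothetical sketch of an extract -> transform -> load chain for
# settlement-price records; all names and fields are illustrative.
def extract(raw: str) -> list[dict]:
    """Parse a raw JSON payload of trade records."""
    return json.loads(raw)

def transform(records: list[dict]) -> list[dict]:
    """Normalize fields and drop records missing a settlement price."""
    out = []
    for r in records:
        if r.get("settle") is None:
            continue
        out.append({
            "trade_date": r["date"],
            "hub": r["hub"].upper(),
            "settle_usd_mmbtu": round(float(r["settle"]), 4),
        })
    return out

def load(records: list[dict], sink: list) -> int:
    """Append cleaned records to a sink (stand-in for a warehouse table)."""
    sink.extend(records)
    return len(records)

raw = ('[{"date": "2024-05-01", "hub": "henry", "settle": "1.9850"},'
       ' {"date": "2024-05-01", "hub": "waha", "settle": null}]')
table: list[dict] = []
loaded = load(transform(extract(raw)), table)
# Only the record with a settlement price survives the transform step.
```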

Nice To Haves

  • Excellent verbal and written presentation skills, with the ability to communicate clearly and persuasively
  • Team player and self-driven individual who can multi-task, work independently under minimal supervision, and deliver on commitments
  • Ability to work across structured, semi-structured, and unstructured data, with strong technical experience in large distributed systems, data warehousing, and data lakes at scale
  • Hands-on experience and proficiency in data management tools: SQL, SDE, JSON, XML, and scripting languages such as Python
  • Experience with the Big Data technology stack, including NoSQL, Spark, Hive, Kafka, StreamSets, IICS, ADF, etc.
  • Intermediate knowledge of application development capabilities and ability to swiftly grasp new concepts and technologies
  • Experience coding with agentic AI tools (Claude Code, GitHub Copilot)
  • Knowledge of Data Management principles, processes and data lifecycle
  • Knowledge of master data management, real-time streaming data and cloud technologies
  • Understanding of Data Science and related technologies
  • Familiarity with best practices for coding, testing, version control, peer review, and CI/CD automation
  • Ability to take ownership, engage, lead change, achieve results, adapt, solve problems, manage risk and drive tasks to completion
  • Demonstrated ability to deal with ambiguity and maintain effective performance under stressful and uncertain conditions
  • Excellent analytical mind and proven problem-solving skills
  • Project Management skills
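The JSON/XML data-management bullet above can be sketched with the Python standard library alone. The payloads below are invented examples showing how the same settlement record might arrive in either format and be normalized to one in-memory shape:

```python
import json
import xml.etree.ElementTree as ET

# Invented example payloads carrying the same settlement record.
json_payload = '{"hub": "Henry", "settle": 1.985}'
xml_payload = "<quote><hub>Henry</hub><settle>1.985</settle></quote>"

rec_from_json = json.loads(json_payload)

root = ET.fromstring(xml_payload)
rec_from_xml = {"hub": root.findtext("hub"),
                "settle": float(root.findtext("settle"))}

# Both paths normalize to the same record, regardless of wire format.
```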

Responsibilities

  • Design and develop robust data pipelines, ETL/ELT processes, and workflow orchestration systems to automate data ingestion, transformation, and distribution across multiple platforms for Commercial
  • Collaborate with data architects and analysts to translate business requirements into technical specifications to develop end-to-end data solutions—from database objects to security models—that support business modeling, forecasting, and reporting requirements
  • Build and maintain scalable data infrastructure including data warehouses, data lakes, and streaming systems that support high-volume data processing and analytics workloads
  • Implement data quality validation, monitoring frameworks, and error handling mechanisms to ensure reliable data delivery and maintain system performance standards
  • Monitor system health, troubleshoot data pipeline failures, and implement automated recovery procedures to maintain operational reliability and minimize downtime
  • Monitor the performance of Commercial trading analytics models
  • Adhere to ConocoPhillips’ governance and compliance policies for data and processes
  • Collaborate with Data Architect to develop and evaluate design options to ingest and catalogue real-time, structured, and unstructured datasets into Data Management environments per Commercial business priorities and use cases
  • Cultivate relationships with peer Data Management groups to ensure alignment, compliance and long-term sustainability of our Commercial digital assets
  • Proactively identify opportunities for automation and continuous improvements
  • Stay current with the latest trends, technologies, and best practices in data management, application development, and data engineering
  • Participate in and support Agile quarterly planning for the team
  • Optimize queries and system performance by establishing standards that minimize workload and costs
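The data-quality validation and monitoring responsibilities above can be sketched as a small check registry in plain Python. The check names and sample rows are hypothetical, and a production framework (e.g., dbt tests or Great Expectations) would replace this, but the pattern of running named checks and collecting failures is the same:

```python
from dataclasses import dataclass

# Illustrative data-quality check runner; all names are hypothetical.
@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str = ""

def run_checks(rows, checks):
    """Run each named check, capturing failures instead of raising."""
    results = []
    for name, fn in checks:
        try:
            results.append(CheckResult(name, bool(fn(rows))))
        except Exception as exc:
            results.append(CheckResult(name, False, str(exc)))
    return results

rows = [
    {"hub": "HENRY", "settle": 1.985},
    {"hub": "WAHA", "settle": -0.12},
]
checks = [
    ("non_empty", lambda rs: len(rs) > 0),
    ("no_null_settle", lambda rs: all(r["settle"] is not None for r in rs)),
    ("hub_uppercase", lambda rs: all(r["hub"].isupper() for r in rs)),
]
results = run_checks(rows, checks)
failed = [r.name for r in results if not r.passed]
# This sample passes all three checks; a failing check would land in `failed`
# and could feed an alerting or automated-recovery path.
```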

Benefits

  • Medical, dental, vision, mental health support, and wellness programs.
  • Competitive base pay, annual performance bonuses, and retirement savings plans.
  • Paid time off, paid parental leave, and family support resources.
  • Access to training, mentoring, and internal career mobility tools.
  • Peer-nominated awards, inclusive culture, and employee resource groups.

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Number of Employees: 5,001-10,000 employees

© 2024 Teal Labs, Inc