Data Solutions Integrator

ConocoPhillips
Anchorage, AK
Onsite

About The Position

Welcome to ConocoPhillips, where innovation and excellence create a platform for opportunity and growth. Come realize your full potential here.

Who We Are

We are one of the world’s largest independent exploration and production companies, based on proved reserves and production of liquids and natural gas. With operations and activities in 13 countries, we explore for, develop, and produce crude oil and natural gas globally. We are challenged with an important job: to safely find and deliver energy to the world. Our employees are critical to our success, and with them we power civilization. We’re grounded by our SPIRIT Values: safety, people, integrity, responsibility, innovation, and teamwork. These values position us to deliver strong performance in a dynamic business, but not at all costs. We believe it’s not just what we do, it’s how we do it, that sets us apart.

Fostering an Inclusive Work Environment

To deliver superior performance, we create an environment that respects the contributions and differences of every individual. Wherever possible, we use these differences to drive competitive business advantage, personal growth and, ultimately, business success.

Job Summary

Alaska Overview

The Alaska segment primarily explores for, produces, transports, and markets crude oil, natural gas, and NGLs. We are the largest crude oil producer in Alaska and have major ownership interests in the Prudhoe Bay, Kuparuk, and Western North Slope asset areas. Additionally, we are one of Alaska’s largest owners of state, federal, and fee exploration leases, with approximately one million net undeveloped acres at year-end 2024. Alaska operations contributed 14 percent of our consolidated liquids production and two percent of our consolidated natural gas production.

Position Overview

The Alaska Digital Technologies (AKDT) organization is seeking an experienced Data Solutions Integrator. This role is pivotal in adopting and executing digital strategies to drive innovation, enhance user experiences, and streamline operations. The role is responsible for building data pipelines from source systems into the analytics environment, with validation, to support the creation and operation of analytical solutions and computational models. The candidate should be proficient at integrating and preparing large, varied datasets, including time-series data, and at designing specialized database tables and information links within the environment. The person will work closely with engineers, data scientists, geoscientists, project/program managers, and IT teams to provide clean and reliable data across our diverse portfolio of on-prem and cloud platforms and applications. The role also requires familiarity with basic cloud data engineering concepts and tasks, such as cloud (Azure and AWS) data architectures, managing cloud data storage (including ingress/egress considerations), cloud data security, and troubleshooting cloud data issues. This position is based in Anchorage, AK.
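
For a flavor of the pipeline work described above, a minimal sketch follows: pulling operational time-series data from a source system, applying basic validation, and landing it in the analytics environment. This is illustrative only, not ConocoPhillips code; the connection strings, table names, and column names are hypothetical, and it assumes a Python environment with pandas, SQLAlchemy, and the relevant database drivers installed.

# Minimal sketch: extract operational time-series data from a source RDBMS,
# validate it, and land it in an analytics schema. All identifiers below
# (connection strings, tables, columns) are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "oracle+oracledb://user:pass@source-host/opsdb"   # hypothetical
TARGET_URL = "snowflake://user:pass@account/analytics/public"  # hypothetical

def extract(engine) -> pd.DataFrame:
    # Pull the last day of sensor readings from a hypothetical source table.
    query = """
        SELECT tag_id, reading_ts, value
        FROM sensor_readings
        WHERE reading_ts >= CURRENT_DATE - 1
    """
    return pd.read_sql(query, engine)

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Basic quality gates: required columns, no null timestamps, de-duplication.
    required = {"tag_id", "reading_ts", "value"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")
    df = df.dropna(subset=["reading_ts"])
    return df.drop_duplicates(subset=["tag_id", "reading_ts"])

def load(df: pd.DataFrame, engine) -> None:
    # Append validated rows to the analytics environment.
    df.to_sql("sensor_readings_clean", engine, if_exists="append", index=False)

if __name__ == "__main__":
    src = create_engine(SOURCE_URL)
    tgt = create_engine(TARGET_URL)
    load(validate(extract(src)), tgt)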

Requirements

  • Legally authorized to work in the United States
  • Bachelor’s degree or higher in Information Technology, Computer Science, Math, Engineering, Statistics, Information Systems, Information Science, or a related technical field or foreign equivalent
  • 7 or more years’ experience in Information Technology
  • 1 or more years’ experience with data visualization/analytics tools
  • 1 or more years’ experience with an RDBMS, including SQL
  • 1 or more years’ experience handling large amounts of operational data
  • Beginner knowledge of Cloud DevOps and Infrastructure as Code (IaC)
  • Beginner knowledge of data integration leveraging API Management Platforms

Nice To Haves

  • Master’s degree or higher in Information Technology, Computer Science, Math, Engineering, Statistics, Information Systems, Information Science, or a related technical field or foreign equivalent
  • 5 or more years’ experience managing data and data pipelines
  • 3 or more years’ practical programming experience
  • 1 or more years’ experience with Oil & Gas time series data feeds in combination with historical and unstructured data
  • 1 or more years’ experience with the following: Spotfire or Power BI; Snowflake, Oracle hosted on Unix, or Teradata; connecting to enterprise systems; Azure or AWS (CloudFormation, Terraform); Azure APIM or the Snowflake REST API; Java, Scala, or similar
  • Advanced knowledge of one or more programming languages (R, C++, Java, SQL, Python)
  • Advanced knowledge of the differences between integrating directly with source databases and integrating through vendor-provided APIs (a short sketch contrasting the two follows this list)
  • Advanced MS Excel skills
  • Intermediate understanding of project processes and methodology to support Project Management initiatives and delivery
  • Ability to think strategically and translate business goals into actionable digital strategies
  • Builds effective solutions based on available information and makes timely decisions that are safe and ethical
  • Listens actively and invites new ideas and exchanges of opinion, then influences and acts to drive positive performance and achieve results
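
As a concrete illustration of the database-versus-API distinction mentioned above, the following sketch contrasts the two integration routes. The vendor database URL, API endpoint, and field names are hypothetical; it assumes Python with pandas, SQLAlchemy, and requests available.

# Two routes to the same vendor data; all identifiers are hypothetical.
import pandas as pd
import requests
from sqlalchemy import create_engine

def fetch_via_database() -> pd.DataFrame:
    # Route 1: connect directly to the vendor's source database.
    # Fast and flexible, but couples the pipeline to the vendor's schema,
    # which can change without notice.
    engine = create_engine("postgresql://user:pass@vendor-db/wells")  # hypothetical
    return pd.read_sql("SELECT well_id, status, updated_at FROM well_status", engine)

def fetch_via_api() -> pd.DataFrame:
    # Route 2: call the vendor-provided API.
    # The contract is versioned and documented, but rate limits, paging,
    # and authentication become the integrator's problem.
    resp = requests.get(
        "https://api.vendor.example/v1/well-status",  # hypothetical endpoint
        headers={"Authorization": "Bearer <token>"},
        timeout=30,
    )
    resp.raise_for_status()
    return pd.DataFrame(resp.json()["results"])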

Responsibilities

  • Manage the daily operations of the Data Modernization and Operations team, ensuring timely resolution of issues and incidents
  • Develop and execute a comprehensive solution deployment strategy for seamless access and integration of infrastructure, application and data that aligns with business objectives and drives better decision making
  • Partner with vendors and cross-functional internal IT and business groups to design and implement interfaces for existing and/or new applications that meet business and IT requirements
  • Focus on creating scalable, efficient systems for data ingestion, transformation, and storage, ensuring that clean, accessible data is available for analytics teams, business intelligence, and operational systems
  • Ensure that all digital solutions comply with relevant enterprise architecture and security standards and regulations to protect the organization's data and assets
  • Understand robust data pipelines, ETL (Extract, Transform, Load)/ELT (Extract, Load, Transform) processes, and workflow orchestration systems to automate data ingestion, transformation, and distribution across multiple platforms
  • Convert functional specifications, design patterns, and architecture diagrams into technical requirements; create test plans; and triage complex technical problems with multiple teams
  • Build and maintain scalable data infrastructure including data warehouses, data lakes, and streaming systems that support high-volume data processing and analytics workloads
  • Implement data quality validation, monitoring frameworks, and error-handling mechanisms to ensure reliable data delivery and maintain system performance standards (see the retry sketch after this list)
  • Collaborate with data architects and analysts to translate business requirements into technical specifications for data processing solutions and system enhancements
  • Optimize data processing performance through query tuning, indexing strategies, and resource management to support efficient analytics and operational reporting
  • Establish data security protocols, access controls, and backup procedures to protect sensitive information and ensure business continuity for critical data systems
  • Monitor system health, troubleshoot data pipeline failures, and implement automated recovery procedures to maintain operational reliability and minimize downtime
  • Integrate diverse data sources including databases, APIs, streaming platforms, and external feeds to create unified data repositories for organizational use
  • Document technical specifications, data lineage, and operational procedures to support system maintenance, knowledge transfer, and compliance requirements
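
To illustrate the automated-recovery and monitoring responsibilities above, here is a minimal sketch of a retry wrapper with exponential backoff and logging around a pipeline step. The step itself is a hypothetical placeholder; production code would narrow the exception types and wire alerts into a monitoring framework.

# Minimal sketch of automated recovery for a pipeline step: retry transient
# failures with exponential backoff and log each attempt.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retry(step, retries: int = 3, base_delay: float = 2.0):
    # Run a pipeline step, retrying on failure with exponential backoff.
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:  # production code would catch narrower types
            log.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                raise  # surface the failure for incident handling
            time.sleep(base_delay * 2 ** (attempt - 1))

def ingest_daily_batch():
    # Hypothetical step: in practice this would call an extract/validate/load
    # chain like the one sketched earlier in the posting.
    log.info("batch ingested")

if __name__ == "__main__":
    run_with_retry(ingest_daily_batch)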