GCP Data Engineer

CrackaJack Digital Solutions LLC
Houston, TX

About The Position

GCP Data Engineer | Houston, TX | Long-Term Contract

Requirements

  • 4+ years of hands-on experience with Google Cloud Platform (GCP) services.
  • Strong experience with BigQuery, Dataflow, and Airflow.
  • Proficiency in Python or Java for data processing and pipeline development.
  • Deep understanding of big data processing principles and distributed systems.
  • Hands-on experience working with millions of rows/records in large-scale environments.
  • Experience with data lake and data warehouse architectures.
  • Strong problem-solving skills and ability to debug complex systems.

Responsibilities

  • Design, develop, and manage robust, scalable, and efficient data pipelines on GCP.
  • Build and maintain data workflows using Apache Airflow for orchestration.
  • Work with Google BigQuery for large-scale analytics and data storage.
  • Leverage Google Cloud Dataflow (Apache Beam) for real-time and batch data processing.
  • Ingest, transform, and analyze massive datasets (millions of records or more).
  • Collaborate with data scientists, analysts, and other engineering teams to deliver reliable data solutions.
  • Optimize data performance and troubleshoot data pipeline issues in production.
  • Ensure data quality, integrity, and security across all data platforms.
  • Automate routine system operations to improve efficiency and speed.
  • Test designs to identify and fix defects and to drive system improvements.
  • Evaluate and identify the best cloud solutions in collaboration with engineering and development teams.
  • Develop, establish, and implement modular cloud-based applications.
  • Locate, evaluate, and remediate infrastructure risks and deployment problems.
  • Periodically evaluate systems and offer suggestions for performance enhancements.
  • Offer support and guidance to meet customer requirements.