DataOps Engineer

Shamrock Trading Corporation
Overland Park, KS

About The Position

Shamrock Trading Corporation is seeking a DataOps Engineer who is passionate about data automation and lifecycle management. As a pivotal member of our Data Services team, you will help implement cutting-edge DataOps methodologies to optimize our data engineering and analytics frameworks. Your work will ensure high-quality data solutions and contribute to the overall robustness and reliability of our data systems. What you'll do and what you'll bring are detailed in the Responsibilities and Requirements sections below.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Analytics or a related field; or equivalent practical experience
  • Experience with Databricks, including Unity Catalog, Delta Lake, and Spark Streaming
  • Experience with build and deployment technologies such as Docker, AWS CodeBuild, CodeDeploy, ECR, and ECS
  • Experience with star-schema data warehouse design (see the sketch after this list)
  • Proficient in automated testing frameworks and tools
  • Skilled in cloud data technologies and modern programming languages and frameworks, including Python, SQL, Java, and Spark
  • Experience using CLI tools and familiarity with Linux
  • Demonstrated ability to work collaboratively in a dynamic team environment with a proactive, positive attitude
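
As a rough illustration of the star-schema requirement above, the sketch below joins a hypothetical fact table to two dimension tables and aggregates by a dimension attribute, which is the typical query shape against a star schema. All table and column names are invented for the example.

```python
import pandas as pd

# Hypothetical star schema: one fact table keyed to two dimension tables.
dim_customer = pd.DataFrame({
    "customer_key": [1, 2],
    "customer_name": ["Acme Freight", "Blue Line Logistics"],
})
dim_date = pd.DataFrame({
    "date_key": [20240101, 20240102],
    "calendar_date": pd.to_datetime(["2024-01-01", "2024-01-02"]),
})
fact_shipments = pd.DataFrame({
    "customer_key": [1, 2, 1],
    "date_key": [20240101, 20240101, 20240102],
    "revenue": [1200.0, 850.0, 400.0],
})

# A typical analytical query: join the fact to its dimensions, then aggregate.
report = (
    fact_shipments
    .merge(dim_customer, on="customer_key")
    .merge(dim_date, on="date_key")
    .groupby("customer_name", as_index=False)["revenue"]
    .sum()
)
print(report)
```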

Responsibilities

  • Develop and enhance DataOps platforms and processes to boost data quality and shorten cycle times
  • Develop and maintain CI/CD pipelines for data workflows using GitHub Actions, AWS CodeBuild/CodeDeploy, or similar tools
  • Build automated testing frameworks for data validation, schema checks, and regression testing (a minimal validation sketch follows this list)
  • Architect and enforce data quality checks to guarantee data accuracy and reliability
  • Innovate and maintain architectures supporting both data lake and data warehousing environments
  • Implement and improve monitoring and alerting using Prometheus, CloudWatch, and Grafana (see the metric-publishing sketch after this list)
  • Enhance metadata management and data cataloging using Databricks Unity Catalog
  • Implement robust access controls and governance using AWS IAM, Unity Catalog, and Terraform, ensuring compliance with SOC 2 standards
  • Deploy and manage infrastructure as code using Terraform and AWS services (S3, Glue, Lambda, ECS, IAM)
  • Collaborate closely with engineering and analytics teams to ensure performance, scalability, and cost-efficiency, and to identify operational risk in projects
  • Provide operational support for internal reporting and data requests, ensuring timely and secure data delivery
  • Identify data anomalies and perform forensic analysis to reach a conclusive root cause analysis (RCA) of data issues (a simple anomaly-flagging sketch follows this list)
  • Communicate clearly during incidents and coordinate cross-team problem solving
  • Participate in week-long on-call rotations every 5–6 weeks
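
To make the automated-testing bullet concrete, here is a minimal data-validation sketch in Python with pandas. The expected schema, the quality rules, and all column names are hypothetical stand-ins for what a real framework would enforce.

```python
import pandas as pd

# Hypothetical expected schema for a shipments extract.
EXPECTED_SCHEMA = {
    "shipment_id": "int64",
    "customer_key": "int64",
    "revenue": "float64",
}

def check_schema(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable schema violations (empty = pass)."""
    errors = []
    for column, dtype in EXPECTED_SCHEMA.items():
        if column not in df.columns:
            errors.append(f"missing column: {column}")
        elif str(df[column].dtype) != dtype:
            errors.append(f"{column}: expected {dtype}, got {df[column].dtype}")
    return errors

def check_quality(df: pd.DataFrame) -> list[str]:
    """Basic data-quality rules: unique keys, no nulls, non-negative revenue."""
    errors = []
    if df["shipment_id"].duplicated().any():
        errors.append("duplicate shipment_id values")
    if df[list(EXPECTED_SCHEMA)].isna().any().any():
        errors.append("null values in required columns")
    if (df["revenue"] < 0).any():
        errors.append("negative revenue values")
    return errors

if __name__ == "__main__":
    df = pd.DataFrame({
        "shipment_id": [1, 2, 3],
        "customer_key": [10, 11, 10],
        "revenue": [1200.0, 850.0, 400.0],
    })
    problems = check_schema(df) + check_quality(df)
    assert not problems, problems
    print("all checks passed")
```

In practice, checks like these would run as a test suite inside the CI/CD pipeline so that a failing extract blocks deployment.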
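
For the monitoring and alerting bullet, one common pattern is publishing custom pipeline metrics to CloudWatch with boto3 so that alarms (or Grafana panels reading from CloudWatch) can fire on them. The namespace, metric, and dimension names below are hypothetical, and the call assumes AWS credentials are already configured.

```python
import boto3

# Hypothetical metric: rows rejected by a validation step, published per pipeline.
cloudwatch = boto3.client("cloudwatch")

def publish_rejected_rows(pipeline: str, rejected: int) -> None:
    """Push a custom metric that an alarm or dashboard can monitor."""
    cloudwatch.put_metric_data(
        Namespace="DataOps/Pipelines",  # hypothetical namespace
        MetricData=[{
            "MetricName": "RejectedRows",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
            "Value": float(rejected),
            "Unit": "Count",
        }],
    )

publish_rejected_rows("shipments_daily", rejected=12)
```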
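
For the anomaly-identification bullet, a minimal sketch of one simple technique: flag a daily row count that falls more than three standard deviations from its trailing mean, which is often the first signal that an upstream feed broke and an RCA is needed. The counts are invented example data.

```python
import statistics

# Hypothetical daily row counts for a feed; the final day drops sharply.
daily_rows = [10210, 10180, 10335, 10290, 10402, 10315, 6120]

trailing = daily_rows[:-1]
mean = statistics.mean(trailing)
stdev = statistics.stdev(trailing)
latest = daily_rows[-1]
z = (latest - mean) / stdev

# Flag anything more than 3 standard deviations from the trailing mean.
if abs(z) > 3:
    print(f"anomaly: latest count {latest} is {z:.1f} sigma from mean {mean:.0f}")
```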