Software Data Engineer / Apache Airflow

GIGATEC Engineering, Annapolis Junction, MD

About The Position

In this key Software Data Engineering role, you'll build the workflow engines behind the curtain: designing reliable Apache Airflow pipelines that keep data moving cleanly, efficiently, and on schedule for analytics and operational needs.

Requirements

  • Experience using the Linux CLI and Linux tools
  • Experience developing Bash scripts to automate manual processes
  • Recent software development experience using Python and Java
  • Experience using Apache Airflow (DAG design, scheduling, operators, sensors) to orchestrate, schedule, and monitor complex workflows
  • Experience using distributed big data processing engines, including Apache Spark
  • Experience with containerization technologies such as Docker, containerd, and Podman
  • Experience with the Git source control system

Nice To Haves

  • Experience using the Atlassian Tool Suite (JIRA, Confluence)
  • Familiarity with AWS cloud services and infrastructure
  • Appreciates a sense of humor and the occasional well-timed joke.

Benefits

  • 100% Paid Healthcare
  • 10% 401k in every paycheck
  • 100% Fully Vested