About The Position

As a Full-Stack Software Engineer with a focus on Data Pipelines & Analytics, you will build and maintain systems that ingest, transform, enrich, and present large-scale data sets in support of mission objectives. Your work will span backend data processing, API development, and user-facing analytics interfaces that enable stakeholders to explore and act on complex information.

This role blends software engineering with data engineering. You will help design scalable ingestion workflows, implement ETL and streaming pipelines, and ensure data integrity across multiple storage technologies. At the same time, you'll contribute to application layers that expose this data through intuitive visualizations and mission-aligned workflows.

You will work in a collaborative environment alongside analysts, data scientists, DevOps engineers, and other software developers to ensure solutions are performant, reliable, and adaptable to evolving requirements.

Requirements

  • Active TS/SCI with Polygraph
  • Strong experience in backend development using Python, Java, or similar languages
  • Experience building or maintaining data pipelines (ETL and/or streaming)
  • Familiarity with JSON and schema-driven data modeling
  • Experience working with NoSQL and/or relational databases
  • Familiarity with RESTful API design
  • Experience with Git-based development workflows
  • Comfortable working in Linux development environments
  • Ability to manage complex data sets and translate analytical requirements into technical implementations

Nice To Haves

  • Experience with dataflow tools such as Apache NiFi, Kafka, Airflow, or similar
  • Experience with Elasticsearch, MongoDB, Redis, graph databases, or similar technologies
  • Familiarity with containerization technologies (Docker, Kubernetes)
  • Experience with CI/CD pipelines and automated testing
  • Experience supporting production CNO capabilities and operations
  • Excellent written and verbal communication skills

Responsibilities

  • Maintain an active TS/SCI with Polygraph (candidates without a current clearance will not be considered)
  • Design and implement data ingestion and transformation pipelines
  • Develop backend services to process and normalize structured and unstructured data
  • Build APIs that expose processed data for mission applications
  • Implement ETL and/or streaming workflows using modern dataflow tools
  • Develop and maintain data models and schema frameworks
  • Integrate with relational, document, graph, and search databases
  • Develop user-facing interfaces and dashboards to support data exploration
  • Optimize performance for large-scale queries and high-volume data environments
  • Ensure data integrity, validation, and monitoring across the pipeline lifecycle
  • Collaborate with mission stakeholders to refine data requirements and analytics use cases

Benefits

  • Top salaries because we're top performers
  • Pick your PTO – Everyone values time and money differently, so we give you the flexibility to choose between 3 and 5 weeks of PTO with a corresponding adjustment to your pay. Your choice, your balance.
  • All 11 federal holidays, paid!
  • Up to 2 snow days, paid!
  • We’ll quadruple (4x!) the first 6% you contribute to your 401(k), giving you up to a 24% company match. Contributing less than 6%? Unclaimed matches come right back to you as extra income, giving you a guaranteed 24% that goes to your retirement, to your paycheck, or both. C’mon now! 🚀
  • 100% employer-paid medical, dental, vision, life, and disability insurance. That’s a lot. Already covered on health insurance? No problem – we’ll trade you this benefit for a boost to your salary instead.
  • $5,250 annual education assistance for training, certifications, tuition, and even student loan repayments.
  • Spot bonuses for obtained certifications, customer recognition, and just about anything else that makes us go "Hot damn!" We hope to say that many times about you. 🔥