Sr. Data Engineer

INDEX ANALYTICS LLC
$118,750 - $147,000 · Remote

About The Position

Index Analytics is seeking a Sr. Data Engineer to support Government clients by designing, building, and optimizing scalable data pipelines and cloud-based solutions. The Sr. Data Engineer plays a key role in modernizing the organization’s data ecosystem by helping transition legacy solutions to a contemporary infrastructure. This role blends advanced data engineering with hands‑on cloud solutions engineering, leveraging AWS, Snowflake, and modern DevOps practices. The ideal candidate has experience delivering high‑quality solutions in an Agile environment.

Requirements

  • US citizen or have resided in the US for 3 of the last 5 years; must be able to obtain a U.S. Federal government client badge and pass a government background investigation
  • Bachelor’s degree or equivalent with six (6) or more years of experience as a Data Engineer or in a similar role
  • Hands-on experience in SQL and Python is required
  • Strong proficiency in AWS services and Snowflake
  • Strong proficiency in GitHub for source control, branching strategies, and code reviews
  • Knowledge of data integration patterns, data warehousing, and modern data architecture
  • Strong analytical and communication skills, including the ability to analyze data, create meaningful insights, and present information clearly to stakeholders
  • Experience with object‑oriented programming and building with software engineering principles
  • DevOps exposure, including CI/CD tools like Jenkins
  • Strong written and verbal communication skills, including the ability to collaborate with, present to, and report findings to stakeholders
  • Knowledge of Agile frameworks, Scrum methodologies, and the tools used to support them

Nice To Haves

  • Prior working knowledge of CMS enterprise repositories such as the Integrated Data Repository (IDR)
  • Prior experience with government agencies is a plus

Responsibilities

  • Collaborate closely with stakeholders and with cross‑functional and internal technical teams to gather requirements, document business rules, and develop a thorough understanding of the business context and objectives
  • Design, build, and maintain scalable, reliable ETL/ELT data pipelines using AWS and Snowflake
  • Develop and optimize data models, both conceptual and physical, to support analytics, reporting, and operational consumption
  • Implement data quality, validation, and monitoring frameworks to ensure accuracy and reliability
  • Improve end-to-end performance of data workflows
  • Develop and maintain cloud-based data infrastructure using AWS services such as S3, Lambda, Glue, Step Functions, CloudWatch, SQS, DynamoDB, SageMaker, and EventBridge
  • Collaborate on the design of secure, scalable, and cost‑optimized data solutions
  • Contribute to cloud governance, security, and best practices
  • Build and maintain CI/CD pipelines using Jenkins to support automated testing, deployments, and continuous integration
  • Ensure data workflows are modular, testable, and properly version‑controlled
  • Operationalize pipelines with monitoring, alerting, and automated recovery mechanisms
  • Meet schedule deadlines and commitments with high-quality deliverables
  • Collaborate with a team of cross-functional resources in an Agile delivery environment to deliver iterative value