Senior Data Engineer

Varda Space Industries · El Segundo, CA
$133,000 - $170,000 · Onsite

About The Position

Varda is seeking a Senior Data Engineer to join our team. In this role, you will lead efforts to develop the essential data pipelines, storage, security, and quality controls that power the organization. You will build a deep understanding of the current application ecosystem and data architecture, and use that knowledge to engage with functions across the enterprise to identify opportunities for process or application improvements. You will manage the implementation of these solutions through the entire product life cycle, from opportunity identification through support and sustainment. You will work cross-functionally with functional team focals across Varda to lead initiatives that enhance company processes, execution insight, and infrastructure robustness. Projects will target a wide range of outcomes, including cycle time reduction, cost reduction, improved decision making, risk reduction, and other key operational efficiencies. Your contributions will immediately impact functional operations and offer an opportunity to contribute to Varda's overall growth and success. This role reports to the Director of Enterprise Applications. This is a full-time, exempt position located at our El Segundo headquarters.

Requirements

  • Bachelor's degree in Computer Science, Information Systems, or related fields.
  • 7+ years of experience in enterprise integration or data engineering roles (advanced degrees count towards years of experience).
  • Deep hands-on experience with cloud-scale data warehouse or Lakehouse platforms (ex: Snowflake, Databricks). Includes performance tuning, cost governance, and data sharing patterns at scale.
  • Strong experience building and managing a modern transformation layer in production. Includes project structure, testing strategy, and metrics/semantic layer (ex: dbt).
  • Fluency in SQL and a general-purpose programming language for pipeline development, scripting, and data transformation logic (ex: Python, R, Java).
  • Experience integrating data from enterprise source systems such as ERP and CRM. Includes handling schema complexity, API connectivity, and change management (ex: Deltek, SAP, Salesforce).
  • Strong data modeling fundamentals with the ability to evaluate and apply the right approach based on the use case.
  • Solid source control and deployment discipline. Branching strategies, peer review workflows, and CI/CD tooling treated as non-negotiable engineering standards.
  • Proven ability to write efficient, well-documented, testable code and implement data quality testing strategies at the model, pipeline, and source level. Includes familiarity with data observability tooling (ex: Metaplane, Monte Carlo).
  • Deep knowledge of software architecture principles, including event-driven and service-oriented architectures (EDA/SOA) and ELT workflows.
  • Strong understanding of data warehousing, ETL processes, connecting enterprise systems, and data modeling.
  • Capability to gather business requirements from stakeholders and take a project from initial concept to finished product.
  • Ability to work with end users to rapidly iterate on prototype applications, resolving issues as they arise.
  • Ability to translate technical concepts to non-technical audiences for stakeholders internal and external to the organization.
  • Strong interpersonal and communication skills.
  • Ability to lead and/or work well in cross-functional teams.
  • Ability to lawfully access information and technology that is subject to US export controls.

Nice To Haves

  • Graduate degree in Computer Science, Information Systems, or related fields.
  • Experience interfacing with engineering and manufacturing groups to understand system designs and translate them into data needs.
  • Experience conducting or contributing to platform trade studies or build-vs-buy evaluations for data infrastructure.
  • Experience supporting AI/ML pipelines including feature stores, training data pipelines, or model monitoring infrastructure.
  • Experience in a start-up or similar high growth environment.
  • Familiarity with workflow orchestration tools (ex: Airflow).
  • Exposure to aerospace, defense, or pharmaceutical data environments.
  • Experience contributing to or leading a data platform migration or modernization effort.
  • Experience working with ERP systems deployed in a design development environment.
  • Exposure to Infrastructure as Code for cloud resource provisioning (e.g., Terraform).
  • Relevant platform or tooling certifications (ex: dbt Certified Developer, cloud data platform certifications).

Responsibilities

  • Design and maintain robust ELT pipelines ingesting data from ERP, CRM, PLM, QMS, and other enterprise systems into a cloud-native Lakehouse environment.
  • Contribute to enterprise data platform and enterprise application selection by assessing multiple external solutions and determining the appropriate build-vs-buy decision.
  • Own the physical data storage layer by defining open table format standards, partitioning strategies, and ensuring interoperability across the broader tool ecosystem (ex: Apache Iceberg).
  • Establish and enforce data quality frameworks, lineage tracking, and pipeline observability including SLA monitoring, proactive alerting, and production-grade logging.
  • Support AI/ML workflows by designing feature pipelines, clean data products, and a governed knowledge layer that keeps data semantically aligned for reliable model and agent consumption.
  • Partner with application engineers and functional stakeholders across the business to translate requirements and business logic into technical specifications and production-ready data solutions.
  • Operate effectively within regulated environments where your pipelines will be subject to compliance and auditability requirements enforced by FDA GMP, ITAR, and DCAA.
  • Implement and enforce modern development practices including source control, peer review, CI/CD pipelines, and automated build/test/deploy workflows (ex: GitHub Actions).
  • Leverage AI-assisted development tooling to accelerate delivery, reduce technical debt, and elevate the team’s overall engineering output.
  • Apply layered data architecture patterns to organize data assets for reliability and reuse (ex: medallion).
  • Design, build, and maintain robust, scalable ELT pipelines from diverse sources (APIs, databases, SaaS).
  • Provide technical leadership and mentoring to application engineers and business end users.

Benefits

  • Exciting team of professionals at the top of their field working by your side
  • Equity in a fully funded space startup with potential for significant growth (interns excluded)
  • 401(k) matching (interns excluded)
  • Unlimited PTO (interns excluded)
  • Health insurance, including Vision and Dental
  • Lunch and snacks provided on site every day. Dinners provided twice a week.
  • Maternity / Paternity leave (interns excluded)
  • Varda Space Industries is an Equal Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Candidates and employees are always evaluated based on merit, qualifications, and performance. We will never discriminate on the basis of race, color, gender, national origin, ethnicity, veteran status, disability status, age, sexual orientation, gender identity, marital status, mental or physical disability, or any other legally protected status.
© 2024 Teal Labs, Inc