Data Platform Engineer - Python - Senior

Montcure, LLC · Washington, DC
Posted 4d · $90 - $115 · Remote

About The Position

The Senior Data Platform Engineer will support the development and operation of a modern data environment for Navy financial management and audit artifact traceability initiatives. The role focuses on building and maintaining data pipelines, ingestion workflows, and transformation processes within Palantir Foundry and the Jupiter data platform environment. The engineer will work alongside ontology engineers and the platform lead to ingest, transform, and manage datasets that support artifact traceability, audit response reporting, and operational data applications.

Requirements

  • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or related technical field (or equivalent experience).
  • 4–7 years of experience in data engineering or platform engineering roles.
  • Experience developing Python-based data pipelines.
  • Experience building ETL/ELT pipelines and data transformation workflows.
  • Experience working with enterprise data platforms such as Palantir Foundry, Databricks, AWS data platforms, or similar distributed data environments.
  • Experience integrating datasets from enterprise systems (ERP, financial systems, logistics systems, etc.) preferred.
  • Strong Python development skills for data processing and pipeline development.
  • Experience with SQL and data transformation frameworks.
  • Familiarity with distributed data platforms and large-scale data environments.
  • Strong troubleshooting and problem-solving skills in data pipeline operations.
  • Ability to collaborate with platform engineers, ontology engineers, and functional stakeholders.
  • Candidates must have the above clearance level and, at a minimum, be able to maintain this clearance during their employment with Montcure.

Responsibilities

  • Data Pipeline Development:
      • Develop and maintain Python-based data pipelines and transformation workflows within Palantir Foundry and the Jupiter platform.
      • Build ingestion pipelines integrating financial management, logistics, property, and other enterprise datasets.
      • Implement transformation logic to prepare raw datasets for curated data layers and ontology population.
  • Data Platform Operations:
      • Support dataset lifecycle management, including refresh schedules, validation checks, and pipeline monitoring.
      • Troubleshoot pipeline failures and assist in maintaining platform data reliability and stability.
      • Maintain dataset lineage awareness and support platform data integrity practices.
  • Data Integration and Support:
      • Assist with integrating enterprise system data into the platform environment.
      • Collaborate with ontology engineers to ensure pipelines populate ontology objects and platform data structures correctly.
      • Document pipelines and transformation logic to support maintainability of the platform.