Senior Application Developer

Pennymac · Cary, NC
14h · $90,000 - $150,000 · Onsite

About The Position

Pennymac (NYSE: PFSI) is a specialty financial services firm with a comprehensive mortgage platform and integrated business focused on the production and servicing of U.S. mortgage loans and the management of investments related to the U.S. mortgage market. At Pennymac, our people are the foundation of our success and at the heart of our dynamic work culture. Together, we work toward a unified goal of helping millions of Americans achieve their aspirations of homeownership through the complete mortgage journey.

The Sr. Data Platform Engineer - Python/AWS Specialist leads the design, development, and management of our enterprise data pipeline infrastructure, with a primary focus on Python-based solutions and AWS cloud services. This role supports critical business functions, including pricing analytics, trading systems, hedging models, and pooling operations, through sophisticated data engineering, ensuring scalable, performant, and reliable data solutions across the organization.

Requirements

  • Degree in Computer Science, Data Engineering, Engineering, or similar technical major
  • 5+ years of software development experience with 4+ years of production Python development
  • Expert-level Python skills including OOP, design patterns, async programming, and building maintainable, testable code at enterprise scale
  • 3+ years of hands-on AWS experience with data-focused services (Lambda, Glue, Step Functions, S3, Kinesis, EMR)
  • Deep understanding of data engineering concepts, ETL/ELT patterns, and modern data architecture principles
  • Extensive experience building and maintaining production data pipelines that process high volumes of data reliably
  • Demonstrated ability to master cloud technologies including serverless architectures, event-driven systems, and Infrastructure as Code
  • Proven track record of architecting and implementing business-critical data solutions that improve stability, security, performance, and scalability
  • Demonstrated ability to effectively communicate complex technical concepts to engineers, product owners, project managers, and business stakeholders
  • Demonstrated experience in multi-team collaboration and agile development practices, particularly in data-focused environments
  • Ability to collaborate across teams and design data systems that address architectural gaps and scalability challenges
  • Experience with Python data frameworks (e.g., Pandas, NumPy, SQLAlchemy, PySpark) for data transformation and analysis
  • Strong experience with SQL and database technologies for data pipeline development and optimization
  • Experience with containerization (Docker) and container orchestration (ECS, Kubernetes) for deploying Python services
  • Experience with Git, CI/CD pipelines, and collaborative development workflows
  • Experience with comprehensive testing strategies including unit testing, integration testing, and data validation frameworks (pytest, Great Expectations)
  • Knowledge of DataOps practices (CI/CD for data pipelines, automated testing, monitoring)
  • Knowledge of Agile and Scrum methodologies and tools such as Jira

Nice To Haves

  • Experience with Snowflake, SQL Server, or other data warehouse platforms for pipeline target systems
  • Financial Services and mortgage industry experience, particularly with regulatory reporting and risk management data requirements

Responsibilities

  • Advanced Python Development: Architect, develop, and maintain production-grade Python applications using Object-Oriented Programming, design patterns, and software engineering best practices for enterprise data pipelines
  • Expert AWS Cloud Services: Design and implement cloud-native data solutions using AWS services including Lambda, Glue, Step Functions, S3, EventBridge, SQS/SNS, and Kinesis
  • Data Pipeline Architecture: Lead the design of scalable ETL/ELT pipelines using Python frameworks such as Apache Airflow, Prefect, or AWS Step Functions for orchestration
  • API Development & Integration: Build and maintain RESTful APIs using FastAPI or Flask for data services, microservices, and system integrations
  • Serverless & Event-Driven Architecture: Design event-driven data pipelines leveraging AWS Lambda, EventBridge, and serverless patterns for real-time and batch processing
  • Infrastructure as Code: Implement and manage cloud infrastructure using CloudFormation, CDK, or Terraform for reproducible and version-controlled deployments

Benefits

  • Comprehensive Medical, Dental, and Vision
  • Paid Time Off Programs including vacation, holidays, illness, and parental leave
  • Wellness Programs, Employee Recognition Programs, onsite gyms, and café-style dining (select locations)
  • Retirement benefits, life insurance, 401k match, and tuition reimbursement
  • Philanthropy Programs including matching gifts, volunteer grants, charitable grants and corporate sponsorships