Software Engineer III - AWS, Python, Spark

JPMorgan Chase & Co., Columbus, OH

About The Position

As a Software Engineer III at JPMorgan Chase within Enterprise Technology, you serve as a seasoned member of an agile team, designing and delivering trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Requirements

  • Formal training or certification on software engineering concepts and 3+ years of applied experience.
  • Strong understanding of Agile methodologies, with hands-on experience in at least one common framework (e.g., Scrum or Kanban).
  • Experience optimizing performance on both cloud and non‑cloud platforms, with emphasis on data pipeline throughput, latency, and cost efficiency.
  • Practical, hands-on experience in system design, application development, testing, and ensuring operational stability.
  • Proficiency in Python, including building scalable data pipelines using PySpark.
  • Experience developing, debugging, and maintaining code at enterprise scale using one or more modern programming languages and database/query languages.
  • Hands-on experience with AWS services and infrastructure-as-code, including Terraform, SQS, SNS, Lambda, DynamoDB, S3, EC2, EMR, as well as Docker and Kubernetes.
  • Experience orchestrating workflows with Airflow or comparable cloud-native orchestration tools.
  • Working knowledge of the full Software Development Life Cycle for containerized and non-containerized applications, including CI/CD details (e.g., GitOps, artifact repositories, blue/green or canary releases).
  • Experience building microservices and REST APIs for a service-based architecture.
  • Familiarity with software applications and technical processes across one or more disciplines such as cloud, artificial intelligence/machine learning, and mobile.
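To give a flavor of the data-pipeline work the requirements above describe, here is a minimal pure-Python sketch of a staged extract/transform/load pipeline. In a real PySpark pipeline these stages would be DataFrame transformations running on a cluster (e.g., EMR); the `Trade` record schema and the `100.0` notional threshold are hypothetical, chosen only for illustration.

```python
# Minimal ETL pipeline sketch (stdlib only). In PySpark, the same stages would
# be DataFrame transformations; the "Trade" schema and 100.0 threshold are
# hypothetical examples, not part of any actual JPMorgan Chase system.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class Trade:
    symbol: str
    price: float
    qty: int


def extract(raw_rows: Iterable[dict]) -> Iterator[Trade]:
    """Parse raw rows, skipping malformed records (operational stability)."""
    for row in raw_rows:
        try:
            yield Trade(row["symbol"], float(row["price"]), int(row["qty"]))
        except (KeyError, ValueError):
            continue  # in production: route to a dead-letter queue (e.g., SQS)


def transform(trades: Iterable[Trade]) -> Iterator[Trade]:
    """Keep trades above a (hypothetical) notional threshold."""
    for t in trades:
        if t.price * t.qty > 100.0:
            yield t


def load(trades: Iterable[Trade]) -> list[Trade]:
    """Materialize results; in production this might write to S3 or DynamoDB."""
    return list(trades)


raw = [
    {"symbol": "AAPL", "price": "190.5", "qty": "2"},
    {"symbol": "MSFT", "price": "bad", "qty": "1"},   # malformed, dropped
    {"symbol": "JPM", "price": "10.0", "qty": "3"},   # below threshold
]
result = load(transform(extract(raw)))
```

The generator-based stages stream records instead of buffering them, which is the same throughput-and-cost pressure the performance-optimization requirement above points at.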

Nice To Haves

  • Familiarity with modern front-end and back-end technologies.
  • Exposure to cloud technologies, primarily AWS.
  • 5+ years of hands-on experience developing data pipelines.
  • Strong analytical skills for debugging issues in higher environments, especially on cloud platforms.
  • Proven problem-solving skills across diverse software platforms and environments.

Responsibilities

  • Design, develop, and troubleshoot software solutions, applying innovative thinking to build resilient systems and decompose complex technical problems.
  • Write secure, high-quality production code and maintain performant algorithms that integrate reliably with upstream and downstream systems.
  • Produce and review architecture and design artifacts for complex applications, ensuring code implementations meet design constraints and non-functional requirements.
  • Build scalable data pipeline frameworks using PySpark across multiple application and business scenarios.
  • Gather, analyze, and synthesize insights from large, diverse datasets; create visualizations and reporting to drive continuous improvement of applications and systems.
  • Proactively identify hidden issues and patterns in data, using findings to improve coding hygiene, data quality, and system architecture.
  • Containerize services by creating Docker images; package and run ETL/data processing jobs using Kubernetes-based methodologies.
  • Schedule and manage workflows with Airflow; orchestrate end-to-end application pipelines using internal and external toolsets.
  • Own the application DevOps lifecycle from development to production, including CI/CD, environment promotion, release management, and operational readiness.
  • Participate in data validations, production cutover, and go-live activities through General Availability (GA), ensuring reliability, observability, and supportability.
  • Foster an inclusive team culture grounded in diversity, opportunity, respect, and continuous learning.
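The Airflow scheduling responsibility above rests on the DAG model: each task runs only after all of its upstream dependencies succeed. A minimal stdlib sketch of that ordering idea (this illustrates the concept, not Airflow's API; the task names form a hypothetical ETL flow):

```python
# DAG-style task ordering sketch (stdlib only). Airflow's scheduler applies
# this idea to operators with retries, SLAs, and backfills; the graph below is
# a hypothetical flow: extract -> transform -> (validate, load) -> report.
from graphlib import TopologicalSorter

# Map each task to the set of tasks that must complete before it.
dag = {
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform"},
    "report": {"validate", "load"},
}

run_order = list(TopologicalSorter(dag).static_order())
# Every task appears after all of its upstream dependencies.
```

In Airflow the same dependencies would be declared with operators and `>>` (or the TaskFlow API), and the scheduler, rather than a single-process loop, decides when each task instance runs.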

Benefits

  • We offer a competitive total rewards package, including a base salary determined by the role, experience, skill set, and location. Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more. Additional details about total compensation and benefits will be provided during the hiring process.