Software Engineer [Multiple Positions Available]

JPMorgan Chase & Co., Plano, TX
Onsite

About The Position

Duties:

  • Participate in application development, testing, and operational stability, with a focus on Kubernetes cluster management and microservices architecture.
  • Develop, debug, and maintain code in a large corporate environment using modern programming languages including Python (with Poetry for dependency management and PySpark for distributed processing) and Java, plus query languages including SQL and GraphQL.
  • Create frameworks in Python using libraries such as pandas for data processing and Pytest for unit testing.
  • Create front ends using frameworks and libraries such as React and Node.js to develop user interfaces for Azure OpenAI LLMs, with Docker container images hosted on AWS ECS.
  • Develop, debug, and maintain code with a focus on RESTful API development and server-side logic hosted on AWS ECS.
  • Build data pipelines and cloud architecture with AWS services including EC2, ECS, S3, Lambda, RDS, Glue, Step Functions, OpenSearch, API Gateway, and Neptune.
  • Utilize Terraform and CloudFormation for infrastructure as code.
  • Provision and maintain Apache Airflow for workflow orchestration, and manage CI/CD pipelines with Jenkins, Liquibase, and Spinnaker.
  • Follow agile practices such as CI/CD, application resiliency, and security, with a focus on maintaining application code and infrastructure in Bitbucket repositories.
  • Build machine learning processing pipelines spanning cloud computing, artificial intelligence, and data analytics, with tools such as Databricks and Snowflake.
  • Work on anomaly detection using TensorFlow, neural networks, LangChain, and sklearn, with matplotlib for graphical analysis.
  • Build and monitor dashboards using AWS X-Ray, implementing logging solutions with CloudWatch, Dynatrace, and Splunk.
  • Build streaming pipelines using Kafka, Kinesis, and the Snowpipe Streaming API to ingest data into Snowflake.
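As a rough illustration of the "frameworks in Python with Pytest for unit testing" duty above, here is a minimal sketch; the function, field names, and test data are hypothetical (not from the posting), and real processing at this scale would typically use pandas rather than plain dicts:

```python
# Hypothetical sketch: a small data-processing helper plus a
# pytest-style unit test (pytest discovers plain test_* functions
# and runs their bare assert statements).

def normalize_amounts(rows):
    """Convert raw payment rows to dollar amounts, skipping malformed records."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "merchant": row["merchant"],
                "amount_usd": round(float(row["amount_cents"]) / 100, 2),
            })
        except (KeyError, ValueError):
            continue  # drop rows with missing or non-numeric fields
    return cleaned

def test_normalize_amounts():
    rows = [
        {"merchant": "A", "amount_cents": "1250"},
        {"merchant": "B", "amount_cents": "oops"},  # malformed, should be dropped
    ]
    assert normalize_amounts(rows) == [{"merchant": "A", "amount_usd": 12.5}]
```
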
QUALIFICATIONS: Minimum education and experience required: Master's degree in Computer Science or a related field of study, plus 2 years (24 months) of experience in the job offered or as a Software Engineer, Developer, Technical Systems Analyst, or related occupation. The employer will alternatively accept a Bachelor's degree in Computer Science or a related field of study, plus 4 years (48 months) of experience in the job offered or as a Software Engineer, Developer, Technical Systems Analyst, or related occupation.

Requirements

  • developing cloud infrastructure on AWS for modern merchant funding platforms using Terraform and CloudFormation
  • maintaining resiliency and security for cloud applications hosted on AWS EKS, EC2, and AWS ECS on Fargate
  • provisioning, maintaining, and upgrading Kubernetes clusters with multiple Service types, including ClusterIP, NodePort, LoadBalancer, and ExternalName
  • developing a Java-based, microservice-based file processing platform for processing CSV, fixed-length, and XML files and streaming Kafka data
  • developing software in Python, using Poetry to resolve application dependencies and Pytest for unit testing
  • maintaining MySQL and PostgreSQL databases hosted on AWS Aurora with DDL and DML SQL statements managed in Liquibase
  • developing a service hosted on AWS ECS for generating files via a REST API
  • provisioning and maintaining Amazon Managed Workflows for Apache Airflow (MWAA) for workflow management and orchestration
  • developing Databricks tasks and Kafka streaming pipelines using PySpark, the Python library for Apache Spark
  • developing ELT (Extract, Load, Transform) application using Snowpark for processing flat files into AWS-hosted Snowflake
  • developing RAG agents using the LangChain library, with documents stored in an OpenSearch vector database and a Neptune graph database queried via GraphQL
  • developing LLM solutions hosted in Azure for automatically identifying anomalies in payment systems and generating merchant reports from user input questions, using Azure OpenAI
  • developing UI for interfacing with LLMs using React with Docker container image hosted on AWS ECS
  • working on a feedforward neural network for anomaly detection using TensorFlow and sklearn, with matplotlib for graphical analysis
  • maintaining AWS Glue jobs written using PySpark
  • maintaining file processing Java applications that use Spring Boot frameworks with Maven for application dependencies and JUnit for unit testing
  • creating Control-M jobs for automating tasks and scripts
  • writing Perl and shell scripts for running SQL queries in Oracle database and file movement on Red Hat Linux (RHEL) servers
  • creating and managing CI/CD pipelines using Jenkins and Spinnaker
  • building and monitoring dashboards using AWS X-Ray and logging using CloudWatch, Dynatrace, and Splunk
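The file-processing requirement above (CSV, fixed-length, and XML inputs) can be sketched in a few lines of Python; the record layout, field names, and sample record below are hypothetical, invented purely for illustration:

```python
# Hypothetical fixed-length record layout: merchant id (6 chars),
# amount in cents (10 chars, zero-padded), currency code (3 chars).
FIELDS = [
    ("merchant_id", 0, 6),
    ("amount_cents", 6, 16),
    ("currency", 16, 19),
]

def parse_fixed_length(line):
    """Slice one fixed-length record into a dict of stripped field values."""
    return {name: line[start:end].strip() for name, start, end in FIELDS}

record = parse_fixed_length("MRC0010000012500USD")
# record["amount_cents"] == "0000012500"
```

In a real pipeline each layout would live in configuration alongside the CSV and XML parsers, so new file formats can be onboarded without code changes.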


Benefits

  • We offer a competitive total rewards package including base salary determined based on the role, experience, skill set and location.
  • Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions.
  • We also offer a range of benefits and programs to meet employee needs, based on eligibility.
  • These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more.