JPMorgan Chase · Posted 14 days ago
Full-time • Mid Level
Plano, TX
Credit Intermediation and Related Activities

About the position

The position involves developing and automating large-scale, high-performance data processing systems to enhance the product experience. Responsibilities include building scalable Spark data pipelines, solving business problems through innovative engineering practices, and participating in all aspects of the Software Development Lifecycle (SDLC): analyzing requirements, incorporating architectural standards into application design specifications, documenting those specifications, and developing or enhancing software application modules.

The role also requires designing impactful telemetry and usage-tracking solutions for BI tools, maturing modern data pipeline streams for analytics use cases, and collaborating with data scientists and product owners to understand data requirements. Additional duties include designing data models for optimal storage and retrieval, evaluating cloud enablement options for analytical tools, building audit-compliant cloud solutions, and defining advanced vendor tool deployment solutions. The candidate will also conduct POCs for scalable analytic solutions, monitor infrastructure, troubleshoot technical issues, and perform API/application performance testing.

Responsibilities

  • Develop and automate large scale, high performance data processing systems.
  • Build scalable Spark data pipelines leveraging a scheduler/executor framework.
  • Solve business problems through innovation and engineering practices.
  • Participate in all aspects of the Software Development Lifecycle (SDLC).
  • Design and build impactful telemetry and usage tracking solutions for BI tools.
  • Mature modern data pipeline streams for analytics use cases.
  • Work with data scientists, data analysts, and product owners to understand data requirements.
  • Design data models for optimal storage and retrieval.
  • Evaluate public and private cloud enablement options for analytical and reporting tools.
  • Build audit compliant cloud solutions and simplify access mechanisms.
  • Build REST services that follow API and governance standards.
  • Define advanced vendor tool deployment solutions in the cloud and on-premises.
  • Conduct POCs for scalable analytic solutions.
  • Monitor infrastructure, servers, databases, and distributed batch jobs.
  • Troubleshoot or escalate technical issues.
  • Perform API/Application Performance Testing.
  • Design and develop PL/SQL blocks and stored procedures in a SQL Server database.

Requirements

  • Bachelor's Degree in Electronic Engineering, Computer Engineering, Computer Science, Computer Information Systems, or related field.
  • Three (3) years of experience in the job offered or as Software Engineer, Java Developer, Program Analyst, or related occupation.
  • Experience with Linux, Unix, Windows, Agile SDLC, Application Architecture Disciplines, Data Architecture Disciplines.
  • Experience with microservices, Apache Kafka, Docker, J2EE, Jenkins, Node.js, Spring, CSS, Hibernate, HTML, Java, JavaScript, jQuery, Python, Selenium, shell scripting, SQL.
  • Experience with Apache Tomcat, REST, SOAP, JSON, Kubernetes, AWS Cloud Services, Dynatrace, Cassandra, Hadoop, Hive, MongoDB, Oracle, Apache Spark, Git, Cucumber, JUnit.
  • Experience in automated testing, functional testing, performance testing, regression testing, unit testing, user acceptance testing, RBAC, PBAC, Immuta, IAM, and Databricks.