Senior Data Engineer

Techstra Solutions, Pittsburgh, PA

About The Position

We are seeking a Senior Data Engineer with strong experience in Kafka, Java, and cloud-native development to help design and build scalable data platforms and streaming solutions. This role focuses on developing resilient, event-driven data pipelines and services within modern containerized environments. The ideal candidate is a strong engineer who enjoys solving complex data problems, understands domain-driven design principles, and thrives in a collaborative, fast-moving environment. Candidates who demonstrate curiosity, adaptability, and a desire to continuously learn and grow will be highly successful in this role.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or related field.
  • 5–10 years of experience in software engineering or data engineering roles.
  • Strong experience with Apache Kafka and event-driven architecture.
  • Strong programming experience in Java.
  • Experience building microservices using Spring Boot.
  • Experience working with containerized environments (Docker or similar).
  • Experience building cloud-native applications in modern cloud environments.
  • Strong understanding of Avro schemas and message serialization.
  • Experience with domain object modeling / domain-driven design principles.
  • Strong problem-solving skills and ability to work in a collaborative team environment.

Nice To Haves

  • Experience with Apache Spark, Big Data platforms, or stream processing frameworks.
  • Prior use of AI tools in development.
  • Experience building real-time streaming data pipelines.
  • Familiarity with financial services or banking platforms.
  • Experience working in Agile development environments.
  • Exposure to CI/CD pipelines and DevOps practices.

Responsibilities

  • Design, build, and maintain scalable data pipelines and event-driven architectures using Kafka and Java-based microservices.
  • Develop and maintain Spring Boot applications that integrate with distributed streaming platforms.
  • Implement domain object modeling to ensure clean, maintainable, and scalable service design.
  • Work with Avro schemas and schema management to ensure efficient and reliable message serialization.
  • Deploy and manage applications in containerized environments using Docker and modern orchestration platforms.
  • Build cloud-native data services that leverage modern infrastructure and DevOps practices.
  • Collaborate with architects, product teams, and other engineers to translate business requirements into technical solutions.
  • Ensure solutions meet performance, scalability, and reliability requirements.
  • Contribute to improving engineering standards, automation, and development best practices.