About The Position

As a Staff Systems Engineer, Enterprise Data Analytics, you will lead the design, development, and optimization of mission-critical integrations across our tech ecosystem. You will work across low-code platforms and custom cloud-native services, leveraging your deep software engineering background. This role requires strong problem-solving skills and the ability to partner with key stakeholders, driving strategic collaboration to shape the future of our enterprise data backbone. Your day-to-day work is outlined in the Responsibilities section below.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent professional experience
  • 8+ years of hands-on experience in software engineering, with expertise in backend or distributed systems in cloud environments
  • Advanced proficiency in Python or Node.js
  • Expert-level AWS experience: Lambda, Step Functions, S3, CloudWatch, VPCs, and IAM (a brief illustrative sketch follows this list)
  • Deep hands-on expertise with Terraform and modern DevOps practices, including CI/CD and infrastructure lifecycle automation
  • Mastery of Git and modern source control workflows (Gitflow) on GitHub
  • Strong communication, collaboration, and leadership skills, with a history of mentorship and technical ownership
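
To give a concrete flavor of the serverless work named above, here is a minimal sketch of an AWS Lambda handler in Python, one of the two languages listed in the requirements. It reacts to an S3 event notification, logs to CloudWatch, and copies each object to a curated prefix. The bucket layout, prefix name, and handler are illustrative assumptions, not part of any actual system described in this posting.

    import json
    import logging

    import boto3

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    s3 = boto3.client("s3")

    def handler(event, context):
        """Copy newly landed objects into a curated S3 prefix.

        The bucket and "curated/" prefix are placeholders for illustration only.
        """
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # Structured log lines like this surface in CloudWatch Logs.
            logger.info("Processing s3://%s/%s", bucket, key)
            s3.copy_object(
                Bucket=bucket,
                Key=f"curated/{key}",
                CopySource={"Bucket": bucket, "Key": key},
            )
        return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}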

Responsibilities

  • Architect, develop, and maintain automated data pipelines using a combination of low-code tools and custom AWS solutions (see the sketch after this list)
  • Own the provisioning and management of infrastructure for integrations by applying Infrastructure as Code principles with Terraform
  • Mentor engineers and champion best practices in software engineering, source control (Gitflow), and DevOps workflows within GitHub repositories
  • Collaborate closely with data scientists, product managers, and stakeholders to ensure integrations deliver transformative business value
  • Proactively monitor, troubleshoot, and ensure reliability and observability of integration solutions across distributed systems
  • Design and implement robust, scalable integrations between Databricks and key enterprise business systems (NetSuite, Workday, Salesforce)
  • Design & build integrations and automations using low-code/iPaaS platforms (such as Workato and Fivetran)
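
As a companion sketch for the pipeline responsibilities above, the following Python snippet uses boto3 to start an execution of an AWS Step Functions state machine, the kind of custom AWS orchestration this role involves. The state machine ARN, input fields, and source-system name are hypothetical placeholders; in practice the state machine would be provisioned with Terraform and its ARN supplied as configuration.

    import json

    import boto3

    # Placeholder ARN for illustration; a real one would come from
    # Terraform outputs or environment configuration.
    STATE_MACHINE_ARN = (
        "arn:aws:states:us-east-1:123456789012:stateMachine:example-pipeline"
    )

    sfn = boto3.client("stepfunctions")

    def trigger_pipeline(source_system: str, run_date: str) -> str:
        """Start one pipeline execution and return its execution ARN."""
        response = sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"source": source_system, "run_date": run_date}),
        )
        return response["executionArn"]

    if __name__ == "__main__":
        # Hypothetical invocation for a nightly NetSuite sync.
        print(trigger_pipeline("netsuite", "2024-01-01"))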