About The Position

We are looking for a Senior Analytics Engineer to play a foundational role in building Auto Integrate's internal analytics infrastructure. In this role, you will design and implement the systems that power reporting, operational visibility, and product analytics across the organization. You will focus primarily on building and scaling our internal data platform from the ground up. In close partnership with Product & Engineering, you will also help integrate analytics directly into the platform — enabling data-driven features, reporting, and insights for our customers. You will collaborate closely with leadership, engineering, and other internal stakeholders to ensure data aligns with business needs and supports strategic objectives. This is an opportunity to shape the foundation of Auto Integrate's analytics ecosystem, where data plays a central role in how we operate, how we grow, and how we deliver value to customers.

This position requires experience building a data platform from the ground up, enabling the team to access analytics in ways they haven't before. If you are more interested in building upon and improving an existing data platform, check out our other Senior Analytics Engineer job posting: https://job-boards.greenhouse.io/fleetio/jobs/5054939007.

The ideal candidate for this role is a business-savvy, senior-level analytics engineer with experience building and owning complex analytics platforms and working closely with stakeholders. You're comfortable starting from a relatively open landscape and defining scalable architecture, modeling standards, and best practices. You have a deep understanding of data modeling, data transformation, and data warehousing on modern data platforms. Your experience goes beyond data engineering alone; you have a strong foundation in analytics and have worked closely with customer-facing products.
You bring strong but adaptable opinions on design patterns, architecture, and data modeling, and you are passionate about solving business challenges by delivering scalable, data-driven solutions. You have a knack for translating ambiguous business questions into clear, actionable data insights and incrementally delivering impactful results. Product-minded and team-oriented, you actively listen to others' perspectives and educate them on best practices. Excellent communication, especially written, is one of your strengths.

Requirements

  • 5+ years of experience with a proven track record in data or analytics engineering.
  • Experience transforming raw data into clean models using standard tools of the modern data stack and a deep understanding of ELT and data modeling concepts.
  • Hands-on experience building and orchestrating ELT pipelines using Azure Data Factory.
  • Experience with Microsoft SQL Server.
  • Proficiency in Python and orchestration tooling like Prefect or Dagster.
  • Experience in designing, building, and administering modern data pipelines and data warehouses.
  • Experience with dbt.
  • Experience with semantic layers such as Snowflake's semantic views, Cube, or MetricFlow.
  • Experience with Snowflake, BigQuery, or Redshift.
  • Experience with version control tools such as GitHub or GitLab.
  • Experience with CI/CD and IaC tooling such as GitHub Actions and Terraform.
  • Experience with business intelligence solutions (Metabase, Looker, Tableau).
  • Excellent communication and project management skills with a customer service-focused mindset.

Nice To Haves

  • Experience with data marketplaces/private sharing technologies such as Snowflake Marketplace or AWS Data Exchange.
  • Experience with ELT tools such as Stitch or Fivetran.
  • Experience contributing to open-source projects.
  • Experience with streaming data and pipelines such as Kafka or Kinesis.
  • Experience with full-stack engineering.

Responsibilities

  • Enable self-service analytics for all team members by designing clean, intuitive data models and metrics through dbt, empowering employees to make informed, data-driven decisions.
  • Develop and refine custom data pipelines that ingest data from operational systems into our analytics platform, handling both streaming and batch data using third-party tooling and home-grown solutions.
  • Maintain and optimize the data platform infrastructure, focusing on data quality, ELT efficiency, and platform hygiene.
  • Architect and implement key components of the analytics infrastructure, such as BI, semantic layers, and foundational data warehouse.
  • Develop and maintain streaming data pipelines from a variety of databases and data sources.
  • Collaborate across business units to understand data needs and ensure required data is collected, modeled, and available to team members.
  • Document best practices and coach/advise other data analysts, product managers, engineers, etc. on data modeling, SQL query optimization, reusability, and more.
  • Keep our data platform tidy by managing roles and permissions and deprecating old projects.

Benefits

  • Multiple health/dental coverage options (100% coverage for employee, 50% for family)
  • Vision insurance
  • Incentive stock options
  • 401(k) match of 4%
  • PTO - 4 weeks (increases at year two!)
  • 12 company holidays + 2 floating holidays
  • Parental leave - birthing parent (16 weeks paid), non-birthing parent (4 weeks paid)
  • FSA & HSA options
  • Short and long term disability (short term 100% paid)
  • Community service funds
  • Professional development funds
  • Wellbeing fund - $150 quarterly
  • Business expense stipend - $125 quarterly
  • Mac laptop + new hire equipment stipend
  • Fully stocked kitchen with tons of drinks & snacks (BHM only)
  • Remote working friendly since 2012 #LI-REMOTE