Sr. Manager, Data Engineering & Architecture

LVT (LiveView Technologies), American Fork, UT
Hybrid

About The Position

As the Sr. Manager of Data Engineering and Architecture, you will be a hands-on leader responsible for defining and executing the data engineering strategy, architecture, and technology stack. You will own the foundational data infrastructure that powers analytics and decision-making across the organization. A critical part of this role will be building, mentoring, and managing a team of 3 data engineers.

You will guide the team's efforts while also contributing directly to the development of our modern data warehouse in Snowflake, using dbt and SQL to transform raw data into reliable, accessible datasets. Crucially, you will spearhead the data migration efforts for our ongoing Oracle Fusion Cloud deployment, designing a robust, cohesive data ecosystem that bridges our new Oracle environment with Snowflake.

Your leadership will ensure the successful design, build, and maintenance of robust data pipelines that ingest data from a variety of internal and external sources into Snowflake. Your team's work will support reporting, dashboarding, and analysis across the company, enabling teams to make informed decisions based on trusted data.

Given the green-field nature of this initiative, the role is expected to be approximately 30% technical leadership, strategy, and people management, and 70% direct, hands-on engineering and architecture. This position is based in a hybrid work environment and requires regular in-office collaboration. It offers an opportunity to lead with modern data tools, build a high-performing team, and make a direct, strategic impact on data quality and accessibility during a major phase of enterprise scaling.

Requirements

  • Experience: 8+ years in data engineering or related roles, with a strong focus on data modeling and pipeline development.
  • Education: Bachelor’s degree in Computer Science, Engineering, Data Analytics, or a related field.
  • Technical Skills:
      • Proven experience leading complex data migration or implementation projects.
      • Advanced proficiency in SQL and experience with data modeling (e.g., star/snowflake schemas).
      • Hands-on experience with dbt for building modular, testable data transformations.
      • Experience developing data pipelines using Python and workflow orchestration tools (e.g., Airflow, Prefect).
      • Deep understanding of Snowflake.
      • Demonstrated ability to design and implement a modern data architecture from scratch.
  • Infrastructure & Governance:
      • Understanding of ELT/ETL workflows, data governance, and monitoring practices.
      • Experience defining and enforcing organizational standards for data quality, metadata management, and cost optimization within a cloud data warehouse (Snowflake).
  • Communication & Problem Solving:
      • Ability to clearly explain technical details to both technical and non-technical stakeholders.
      • Strong analytical and debugging skills, with attention to detail in code and data quality.

Nice To Haves

  • BI & Analytics Tools: Familiarity with BI platforms such as Looker, Tableau, or Sigma.

Responsibilities

  • Data Strategy & Architecture: Define the long-term vision, strategy, and architecture for the company’s data platform. Design a cohesive hybrid architecture that maximizes the strengths of our full tech stack, ensuring it drives measurable business value and scales efficiently to support hyper-growth.
  • Team Leadership & Management: Build, mentor, and manage a team of 3 data engineers, fostering a culture of technical excellence, accountability, and continuous improvement.
  • Data Modeling & Transformation: Lead the team in building and maintaining robust data models using dbt and SQL that support complex analytics and reporting needs. Contribute directly as an individual contributor as needed.
  • Snowflake Development: Oversee the design and optimization of the Snowflake data warehouse to ensure performance, scalability, and usability. Participate directly in key development efforts.
  • Cross-Functional Collaboration: Act as the primary technical partner to analysts, business stakeholders, and data teams to deeply understand requirements and translate them into strategic engineering solutions and delivery plans.
  • Performance Tuning: Guide the optimization of SQL queries and data transformations to improve execution speed and resource efficiency across the platform.
  • Tooling & Automation: Identify, evaluate, and implement opportunities to automate data workflows, improve pipeline reliability, and establish a formal DataOps/MLOps framework using modern orchestration tools (e.g., Airflow, Prefect, cloud-native serverless functions).
  • BI Tool Support: Ensure the team provides clean, well-structured data models to enable effective use of BI tools like Looker, Sigma, Tableau, or similar platforms.
  • Pipeline Engineering: Direct the development and maintenance of scalable data ingestion pipelines that pull data from APIs and other sources into Snowflake, including exploring solutions for near real-time data feeds.
  • Data Governance & Quality: Champion best practices in data governance, data lifecycle management, and dimensional modeling. Implement data validation checks, documentation standards, and lineage tracking to maintain high data integrity across all of our systems.
  • Oracle Fusion Cloud Migration & Integration: Lead the complex data migration strategy for our active Oracle deployment. Architect, build, and maintain secure, high-performing data flows and syncs between Oracle and Snowflake to ensure operational continuity and analytical excellence.

Benefits

  • We believe you do your best work when your whole life is supported. We invest in our crew’s health, families, and financial futures with a benefits package designed to support you inside and outside the office.