Azure Data Engineer

InfoCentric

About The Position

We are looking for a highly skilled Mid to Senior Data Engineer to join our team and support a key project involving the migration of data from a legacy system to a Snowflake Data Warehouse. The ideal candidate has strong experience in data engineering, data warehousing, and building scalable data pipelines, along with hands-on expertise in dbt for data transformation within a modern ELT architecture. This role will play a critical part in designing, building, and optimising enterprise data pipelines and warehouse structures, ensuring data quality and enabling scalable, efficient data integration across our platforms.

Requirements

  • 5+ years of experience in Data Engineering or Data Warehousing.
  • Strong hands-on experience with Snowflake (data warehousing, performance optimisation, SQL, tasks, streams, etc.).
  • Hands-on experience with dbt for building and maintaining ELT data transformation pipelines.
  • Strong experience with Azure services, including Azure Data Factory, Azure Data Lake, and related components.
  • Strong SQL skills and experience working with large-scale datasets.
  • Solid understanding of data warehousing concepts (dimensional modelling, star/snowflake schemas, data layers).
  • Experience designing and operating production-grade data pipelines and data platforms.
  • Familiarity with CI/CD practices for data pipelines and data platforms.
  • Excellent problem-solving and communication skills.

Nice To Haves

  • Experience with Salesforce APIs or related integration tools.

Responsibilities

  • Design and implement scalable data pipelines and data warehouse solutions to support the migration to Snowflake.
  • Develop and maintain ELT pipelines using Azure Data Factory and dbt to transform and load data into the Snowflake data warehouse.
  • Build and manage dbt transformations and models to support structured, maintainable warehouse layers.
  • Implement data warehouse best practices, including schema design, data partitioning, and performance optimisation in Snowflake.
  • Work closely with engineering and business stakeholders to understand data requirements and deliver reliable data solutions.
  • Ensure data quality, reliability, and integrity through validation, testing, and monitoring frameworks.
  • Optimise Snowflake workloads, including query performance, storage management, and cost efficiency.
  • Troubleshoot and resolve pipeline failures, performance bottlenecks, and data inconsistencies.
  • Contribute to improving data platform architecture, pipeline standards, and engineering practices.

Benefits

  • We’ll cover the cost of your training and certifications in the latest tools and technologies, including AWS, Azure, Snowflake, and AI.
  • You'll work with industry leaders and gain experience in complex Data Platforms and AI projects from the start.
  • You'll gain mentorship and collaboration from industry leaders.
  • Receive genuine recognition for your work and valued behaviours.
  • Regular social events to connect. Previous events include private cinema screenings, bubble soccer, minigolf, bowling, and social drinks.
  • Be part of a team that embraces and celebrates diversity.
  • Work alongside and collaborate with experienced technical professionals in Data Platforms, AI, and GenAI.
  • Work flexibly to balance family, life, and career.
  • Create impact and positive change for the team and business.
  • Drive process improvement.