Azure Data Engineer

Inizio Partners Corp · Warren, NJ

About The Position

Role overview: Build and maintain Azure/Microsoft Fabric data pipelines and Lakehouse-based architectures supporting P&C insurance datasets (policy, claims, billing, actuarial, customer), with strong data modeling, governance, and security practices. Full responsibilities, requirements, and nice-to-haves are listed below.

Requirements

  • 7+ years of experience in data architecture, data engineering, or analytics platform design.
  • At least 5 years of strong hands-on experience with Azure services.
  • Deep understanding of P&C insurance data domains (policy, claims, underwriting, actuarial, billing).
  • Expertise in data modeling (dimensional, canonical, semantic models).
  • Strong knowledge of data governance, metadata management, and data quality practices.
  • Proficiency with SQL, Python, and modern data engineering frameworks.
  • Excellent communication and stakeholder management skills.

Nice To Haves

  • Experience with Microsoft Purview, Power BI, Databricks, or Azure Synapse.
  • Familiarity with modern data architecture concepts (data mesh, data fabric, Lakehouse).
  • Certifications such as Microsoft's Azure Data Engineer Associate.
  • Prior experience in P&C insurance data modernization programs.

Responsibilities

  • Build and maintain Azure/Microsoft Fabric data pipelines (batch, streaming, API) using Data Factory, Lakehouse, Warehouse, and Real‑Time Analytics.
  • Design scalable data models and ETL/ELT workflows supporting insurance datasets (policy, claims, billing, actuarial, customer).
  • Implement Lakehouse‑based architectures enabling analytics, reporting, and ML workloads.
  • Optimize pipeline performance, reliability, cost, and storage across Azure and Fabric environments.
  • Apply data governance, lineage, metadata, and data quality rules in collaboration with governance teams.
  • Implement security and compliance controls (RBAC, sensitivity labels, encryption) aligned with NAIC and data privacy regulations.
  • Collaborate with cloud engineering, security, and business stakeholders to deliver robust, standards‑aligned solutions.
  • Support platform monitoring, troubleshoot issues proactively, and contribute to continuous improvement and Fabric feature adoption.