About The Position

The Partner Engineering Platform (PEP) Reporting team within WECE powers high-scale telemetry insights that help Microsoft's OEM and silicon partners improve device reliability and enhance the Windows customer experience. As a Data Engineer 2, you will design, build, and optimize cloud-scale data systems that process high-volume Windows OEM telemetry, enabling engineering teams to make fast, accurate, data-driven decisions. You will work across distributed data platforms, near real-time ingestion flows, and large analytical datasets, contributing to next-generation reporting and partner experiences across the Windows ecosystem. This role is ideal for engineers who enjoy solving complex data problems, delivering reliable and efficient pipelines, and collaborating with Program Management, Technical Program Managers, Engineering, and Telemetry teams within a supportive, learning-oriented environment.

Requirements

  • Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 1+ year(s) experience in business analytics, data science, software development, data modeling, or data engineering OR Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 2+ years experience in business analytics, data science, software development, data modeling, or data engineering OR equivalent experience.
  • Ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Nice To Haves

  • 3+ years in data engineering, software engineering, or large-scale analytics.
  • Strong experience with KQL and Azure Data Explorer (Kusto).
  • Hands-on experience with Azure Cosmos DB, Azure Data Lake/Lakehouse, and Parquet datasets.
  • Strong SQL skills and proficiency in one programming language (Python, C#, or similar).
  • Demonstrated ability to optimize data pipelines for performance, scalability, and availability.
  • Familiarity with Azure Synapse, Fabric Semantic Models, PySpark, Scala, and distributed compute engines.
  • Experience building high-volume data pipelines (batch and streaming) using Cosmos Streams or equivalent.
  • Experience with device telemetry, reliability metrics, or OEM ecosystem data flows.
  • Knowledge of Azure access control models (RBAC/ABAC) and secure data governance.
  • Experience building observability dashboards and pipeline health metrics.

Responsibilities

  • Build and maintain scalable data pipelines using Azure Data Lake, Lakehouse patterns, Azure Datastores, and Parquet-based datasets.
  • Develop Kusto (ADX) queries, tables, and ingestion processes; optimize KQL for performance and cost efficiency.
  • Design and tune Cosmos DB collections, partitioning, indexing, and RU consumption.
  • Implement streaming and batch workflows using Cosmos Streams and large distributed data systems.
  • Improve pipeline reliability, latency, and availability through monitoring, diagnostics, and performance engineering.
  • Ensure secure and compliant data access using Azure RBAC, ACLs, managed identities, and fine-grained access patterns.
  • Collaborate with PM/TPM, platform, reliability, OEM engagement, and telemetry teams to deliver high-value insights.