About The Position

Join the team that powers Microsoft’s planetary‑scale geospatial intelligence. The Microsoft Planetary Computer provides open environmental and Earth‑observation data at petabyte (PB) scale (~70 PB), along with high‑performance application programming interfaces (APIs) and compute, so organizations can analyze raster, vector, and time‑series data and build AI‑driven experiences. Planetary Computer Pro is a first‑party Azure service that brings the same architecture to customers’ private geospatial catalogs, enabling secure ingestion, governance, and analysis side by side with the Planetary Computer’s open data. Together, these services let customers train models, run change detection, and operationalize spatial analytics across public and private datasets with enterprise‑grade reliability, security, and scale. If you enjoy designing resilient cloud services, shipping developer‑friendly APIs and software development kits (SDKs), and applying AI to geospatial problems that matter, this Senior Software Engineer role is for you.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees, we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Requirements

  • Bachelor's Degree in Computer Science or related technical field AND 4+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience.
  • Security Clearance Requirements: Candidates must be able to meet Microsoft, customer, and/or government security screening requirements for this role. These requirements include, but are not limited to, the following specialized security screenings:
  • The successful candidate must have an active U.S. Government Top Secret security clearance. The ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. Failure to obtain or maintain the appropriate clearance and/or to meet customer screening requirements may result in employment action up to and including termination.
  • Clearance Verification: This position requires successful verification of the stated security clearance to meet federal government customer requirements. You will be asked to provide clearance verification information prior to an offer of employment.
  • Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
  • Citizenship & Citizenship Verification: This position requires verification of U.S. citizenship due to citizenship-based legal restrictions. Specifically, this position supports a United States federal, state, and/or local government agency customer and is subject to certain citizenship-based restrictions where required or permitted by applicable law. To meet this legal requirement, citizenship will be verified via a valid passport, other approved documents, or a verified U.S. government clearance.

Nice To Haves

  • Proven experience building and operating cloud services and public APIs at scale.
  • Programming skills in one or more of: Python, C#, or similar languages, with emphasis on backend systems.
  • Hands‑on with containers and Kubernetes; familiarity with Azure compute, storage, and networking concepts.
  • Practical knowledge of CI/CD, automated testing, observability, and service reliability practices.
  • Solid understanding of distributed systems fundamentals, data modeling, concurrency, and performance optimization.
  • Geospatial expertise: working with raster and vector data, tiling, indexing, coordinate systems, and formats such as COG and vector tiles.
  • Experience building AI/ML pipelines for geospatial data (training, inference, evaluation, and monitoring) using Azure Machine Learning and data services.
  • Background with PB‑scale data lakes, Spark/Dask‑style processing, and scalable search/discovery over large catalogs.
  • Track record of delivering secure, multi‑tenant services and applying cloud governance and compliance controls.

Responsibilities

  • Design, build, and operate cloud‑native services that provide low‑latency, high‑throughput access to planetary‑scale geospatial data and private catalogs.
  • Develop Representational State Transfer (REST) and gRPC (Remote Procedure Call) APIs and SDKs that support discovery, access, and compute orchestration across raster, vector, and time‑series data.
  • Enable AI workflows on geospatial data, including model training, batch and streaming inference, change detection, and forecasting using Azure Machine Learning and data services.
  • Implement secure multi‑tenant patterns, authentication and authorization, throttling, and data isolation for enterprise and public scenarios.
  • Build automated continuous integration / continuous delivery (CI/CD) pipelines, tests, and release gates; instrument services with metrics, logs, and traces; define service level indicators (SLIs) and service level objectives (SLOs) and drive continuous reliability improvements.
  • Optimize large‑scale storage and compute (tiling, partitioning, indexing, caching) for cost, performance, and robustness at PB scale.
  • Diagnose and resolve production issues; lead incident reviews and long-term remediation; automate operations wherever possible.
  • Collaborate with product, design, and data science to translate customer scenarios into clear technical designs and incremental deliveries.
  • Write clear design docs and code reviews; contribute shared libraries, patterns, and documentation that elevate developer productivity.
  • Champion security, privacy, accessibility, and responsible AI throughout the engineering lifecycle.
  • Embody our Culture and Values.