3D Data Pipeline Engineer

World Labs
San Francisco, CA
Posted 7 days ago

About The Position

At World Labs, we’re building Large World Models—AI systems that understand, reason about, and interact with the physical world. Our work sits at the frontier of spatial intelligence, robotics, and multimodal AI, with the goal of enabling machines to perceive and operate in complex real-world environments. We’re assembling a global team of researchers, engineers, and builders to push beyond today’s limitations in artificial intelligence. If you’re excited to work on foundational technology that will redefine how machines understand the world—and how people interact with AI—this role is for you.

About World Labs

World Labs is an AI research and development company focused on creating spatially intelligent systems that can model, reason, and act in the real world. We believe the next generation of AI will not live only in text or pixels, but in three-dimensional, dynamic environments—and we are building the core models to make that possible. Our team brings together expertise across machine learning, robotics, computer vision, simulation, and systems engineering. We operate with the urgency of a startup and the ambition of a research lab, tackling long-horizon problems that demand creativity, rigor, and resilience. Everything we do is in service of building the most capable world models possible—and using them to empower people, industries, and society.

Role Overview

We’re looking for a 3D Data Pipeline Engineer to design, build, and operate the core systems that enable high-quality 3D data processing, synthetic data generation, and rendering across our products. This is a hands-on role for someone who is passionate about large-scale 3D data, system performance, and delivering reliable data pipelines to power our product features. You’ll work closely with product engineers, 3D artists, and research scientists to design efficient, robust, and scalable data pipeline capabilities—while keeping data integrity and performance high in a fast-moving startup environment.

Requirements

  • 6+ years of experience building and operating large-scale data pipelines, especially with a focus on 3D, graphics, or simulation data, with deep experience designing scalable, distributed services in production.
  • Strong programming skills in Python and/or C++, and a solid foundation in data engineering principles and distributed systems architecture.
  • Hands-on experience with 3D data processing libraries, game engines (e.g., Unity, Unreal), or rendering APIs (e.g., OpenGL, Vulkan).
  • Experience with cloud-based data storage and processing solutions (e.g., Kubernetes, distributed file systems, data warehouses).
  • Experience working in fast-moving or startup environments, ideally having led systems or products from early design through production and growth.
  • A high bar for ownership and execution: you’re comfortable with ambiguity, take responsibility for outcomes, and drive work forward without waiting for perfect clarity.
  • A product-first mindset: you care about data quality, pipeline reliability, and performance as core product features, not afterthoughts.
  • You enjoy collaborating with a small, high-ownership team and raising the quality bar through code, data design, and example.

Responsibilities

  • Design, build, and operate automated pipelines for 3D data ingestion, cleaning, processing, validation, and delivery that sit on the critical path for model training.
  • Own foundational capabilities for synthetic data generation, including developing tools, workflows, and quality metrics to produce high-fidelity training data at scale.
  • Develop and optimize high-performance rendering systems and services for real-time visualization and asset generation.
  • Architect and operate distributed data systems for handling massive volumes of 3D models, textures, and associated metadata, ensuring data consistency and robust failure recovery.
  • Own data quality and production readiness end-to-end: defining data schemas, implementing quality checks, and driving capacity planning, observability, and continuous improvement for the 3D pipeline.
  • Improve developer and researcher velocity by building shared abstractions, tooling, and guardrails that reduce the operational and cognitive load of working with 3D assets.
  • Collaborate with cross-functional teams to integrate the 3D data pipeline with other core product platforms and services.
  • Set technical direction, mentor engineers, and raise the data engineering bar across the product org with a focus on 3D data.


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 11-50 employees

© 2024 Teal Labs, Inc