Engineering Intern, Cloud Security, Data Ingestion, and Analytics

Illumio · Sunnyvale, CA
Posted 57 days ago · $37 - $47 · Onsite

About The Position

Onwards Together! Illumio is the leader in ransomware and breach containment, redefining how organizations contain cyberattacks and enable operational resilience. Powered by the Illumio AI Security Graph, our breach containment platform identifies and contains threats across hybrid multi-cloud environments – stopping the spread of attacks before they become disasters. Recognized as a Leader in the Forrester Wave™ for Microsegmentation, Illumio enables Zero Trust, strengthening cyber resilience for the infrastructure, systems, and organizations that keep the world running.

This is a 12-week internship program beginning on May 26, 2026 or June 22, 2026.

Location: Onsite 5 days a week at our headquarters in Sunnyvale, CA.

Our Team's Vision: Illumio’s Cloud Secure team builds and operates the data backbone that powers our AI-driven Security Graph — a system that maps the world’s most complex cloud environments in real time. We ingest and process petabytes of cloud metadata and network flow logs from AWS, Azure, GCP, and OCI cloud environments using high-throughput streaming applications. This data fuels our graph analytics and Zero Trust segmentation engine, giving customers deep visibility and precise control across billions of workloads. Our SaaS platform spans multi-cloud environments and leverages a modern tech stack of Golang, Kafka, Neo4j, Postgres, and advanced data lake technologies, all containerized and orchestrated at scale. As an intern, you’ll work at the intersection of large-scale data engineering, distributed systems, and cybersecurity analytics, contributing to systems that help secure the world’s most critical workloads.

Requirements

  • Currently enrolled in a full-time Bachelor’s or Master’s degree program in Computer Science, Software Engineering, or a related field, with an expected graduation date in Winter 2026 or Spring 2027
  • Strong coding skills in Golang, Python, or Java, and a solid understanding of data structures and distributed system design
  • Curiosity about AI applications in cybersecurity, graph analytics, and streaming data pipelines
  • Familiarity with Kafka, Kubernetes, and cloud services (AWS, Azure, GCP)
  • A growth mindset, strong analytical thinking, and enthusiasm for tackling scale challenges

Responsibilities

  • Design, implement, and test features within large-scale data ingestion and analytics pipelines that handle streaming telemetry across multi-cloud environments
  • Contribute to the AI-powered Security Graph, experimenting with data models and graph queries that enrich cloud visibility
  • Collaborate with engineers to optimize performance, reliability, and cost efficiency across Kafka, Neo4j, and Postgres-backed services
  • Develop dashboards and automation for data health, ingestion latency, and observability metrics
  • Explore new ways to extract insights from flow data, using data lakes and analytics frameworks to uncover behavioral patterns
  • Document and share technical findings and impact metrics with engineering and product teams to support data-driven decision-making