Data Analytics Engineer

Activision Blizzard, Santa Monica, CA
Onsite

About The Position

Are you an excellent communicator who thrives in whitespace, turning ambiguous business problems into well-designed technical solutions? Do you want to build high-scale, cloud-native backend systems that power marketing, profiling, and reporting across an iconic gaming portfolio? Do you enjoy building real applications (not just pipelines) to personalize the player experience? Do you care deeply about doing things the right way: reliable, secure, optimized, and maintainable? Do you love leveraging state-of-the-art technologies? If you answered yes, then our Data Engineer role within Marketing Technology may be the right role for you!

We are seeking an experienced data engineer to join a diverse team within Marketing Technology. You’ll help build and evolve data products and services that understand and activate the player across Activision titles: deepening our understanding of our players and enabling timely communication, both in-game and out-of-game, for business units across the company.

We are problem solvers, constantly reviewing how and why we do things, learning from each other, and raising the bar. We experiment with new technology, iterate quickly, and bring a practical innovation mindset.

This role is based in our Santa Monica, CA office and follows an onsite work schedule of four days per week. Work arrangements may evolve to meet business needs.

Requirements

  • BA/BS degree in Computer Science or related technical field (or equivalent practical experience).
  • 3–5+ years of professional experience building and maintaining production-grade data pipelines or backend data applications.
  • Strong proficiency in Python and SQL, including clean architecture, testing practices, and solid engineering fundamentals.
  • Hands-on experience with Databricks (Spark, Delta Lake, job orchestration, performance tuning, and best practices).
  • Understanding of distributed processing systems and designing for scale, reliability, and cost efficiency.
  • Experience working effectively in ambiguous problem spaces with cross-functional stakeholders and evolving requirements.
  • Able to clearly explain technical designs and tradeoffs to both technical and non-technical audiences.

Nice To Haves

  • Experience with cloud platforms (AWS, GCP, or Azure), including storage, compute, networking, and IAM/security concepts.
  • Experience in modeling and working with marketing and player behavioral data.
  • Experience with Spark Structured Streaming and streaming design patterns (watermarking, late data handling, idempotency, replay strategies).
  • Experience with distributed messaging systems such as Kafka and/or Pub/Sub (including schema evolution, ordering guarantees, retries, and dead-letter queues).
  • Experience building identity resolution, attribution, or cross-platform player profile systems.
  • Experience with Docker and containerized workloads (Kubernetes a plus).
  • Familiarity with Looker/LookML and semantic modeling layers.
  • Experience with data governance concepts, including data contracts, cataloging, lineage, and access controls.
  • Experience scaling internal data products and services to stakeholder teams.
  • A track record of applying AI-accelerated engineering practices.

Responsibilities

  • Design and implement scalable data pipelines and backend data services using Databricks (Spark), Python, and SQL with a strong emphasis on reliability, observability, security, and cost/performance optimization.
  • Build event-driven data applications and services that consume, transform, and publish player events and signals for downstream activation and analytics (e.g., player profiling, marketing triggers, identity resolution, reporting signals).
  • Design both batch and streaming systems with strong guarantees around data correctness, idempotency, replayability, and schema evolution.
  • Translate ambiguous business needs into technical designs, engineering plans, and measurable outcomes; clearly articulate tradeoffs and architectural decisions.
  • Collaborate with stakeholders across Marketing, Analytics, Engineering, Product, and Legal to ensure solutions are aligned, secure, privacy-compliant, and fit-for-purpose.
  • Apply strong engineering judgment: select appropriate architectural patterns, enforce data contracts, document decisions, and build for long-term maintainability and scalability.
  • Improve and uphold team standards across code quality, testing practices, CI/CD pipelines, observability, monitoring, data governance, and shared libraries.
  • Innovate thoughtfully with AI-enabled engineering tools to accelerate development while maintaining high quality, correctness, and security.
  • Work under general guidance from senior engineers or technical leads on complex cross-system and architectural decisions.

Benefits

  • Medical, dental, vision, health savings account or health reimbursement account, healthcare spending accounts, dependent care spending accounts, life and AD&D insurance, disability insurance;
  • 401(k) with Company match, tuition reimbursement, charitable donation matching;
  • Paid holidays and vacation, paid sick time, floating holidays, compassion and bereavement leaves, parental leave;
  • Mental health & wellbeing programs, fitness programs, free and discounted games, and a variety of other voluntary benefit programs like supplemental life & disability, legal service, ID protection, rental insurance, and others;
  • If the Company requires that you move geographic locations for the job, then you may also be eligible for relocation assistance.