Data Engineer

Attention Arc
Durham, NC
$90,000 - $120,000

About The Position

At Attention Arc, we exist to make media matter. As part of our Technology & Analytics team, The Builders and Translators, you will build the connective tissue that powers everything we do. Your work ensures data flows cleanly, reliably, and intelligently across systems, turning complexity into clarity and infrastructure into impact.

This Data Engineer role centers on architecting and scaling robust data pipelines for complex, well-documented datasets. You will design integrations that unify multiple data sources, create analysis-ready environments, and ensure our data warehouse is optimized for performance and growth. This is a hands-on role for someone who thrives on building elegant solutions to messy problems and who understands that clean data is the foundation of credible insight.

Requirements

  • 3–6+ years of experience in data engineering, data integration, or analytics engineering
  • Advanced proficiency in SQL and database design
  • Proven experience building and maintaining ETL/ELT pipelines across multiple data sources
  • Hands-on experience with Snowflake data warehouse architecture and optimization
  • Experience developing and maintaining dbt models for transformation workflows
  • Strong understanding of data modeling principles (relational and dimensional)
  • Experience implementing data quality validation and monitoring practices
  • Strong documentation habits and structured version control practices
  • Ability to translate business requirements into scalable technical solutions
  • Clear communicator who can explain complex systems in actionable terms
  • A disciplined approach to testing, validation, and long-term maintainability
  • Familiarity with orchestration tools (e.g., Airflow)

Nice To Haves

  • Background in media, advertising, or marketing analytics
  • Experience working with Nielsen or other advanced measurement datasets
  • Experience building and maintaining APIs for data access and distribution
  • Experience working in cross-functional, agency-style environments

Responsibilities

Build & Scale Data Integrations

  • Develop and maintain scalable ETL/ELT pipelines integrating complex datasets from multiple internal and external sources
  • Build and maintain API endpoints to support secure, efficient data access and delivery
  • Design efficient SQL queries and optimized database structures that support analytics workflows
  • Ensure data transformations are documented, modular, and built for reuse

Architect Modern Data Infrastructure

  • Implement and optimize Snowflake data warehouse solutions for performance, scalability, and reliability
  • Design and maintain dbt models to manage transformation logic and business rules
  • Create structured, well-documented data models that support clarity and cross-team adoption
  • Continuously improve pipeline performance, orchestration, and monitoring

Deliver Trusted, Analysis-Ready Data

  • Partner closely with Analytics, Planning, and Activation teams to deliver clean, validated datasets
  • Implement data quality checks, validation frameworks, and monitoring systems
  • Ensure consistency and reliability across all data integrations
  • Troubleshoot integration issues with urgency and precision

Improve, Automate & Evolve

  • Identify inefficiencies in workflows and proactively recommend scalable improvements
  • Standardize integration patterns to reduce complexity and onboarding friction
  • Evaluate new tools, frameworks, and approaches to enhance infrastructure capabilities
  • Contribute to documentation and best practices that elevate agency-wide data maturity