Data & Integration Engineer - Phoenix

AECOM | Phoenix, AZ
$74,500 - $137,824 | Onsite

About The Position

AECOM Hunt is growing its Digital Engineering team and is seeking a Data & Integration Engineer to design and build production-grade integrations and automation systems that move data from operational sources into a central, structured repository, and then to enable the internal applications and AI/LLM-powered workflows that support construction operations and project delivery. You will help ensure data is usable beyond reporting, supporting agent-driven retrieval, summarization, and decision support. You will work closely with construction professionals, analysts, and technologists to deliver reliable data systems, plus the tools and APIs that make that data usable by the business.

This role sits within a construction delivery organization, at the intersection of data engineering, systems integration, automation, and business-facing application development, with an increasing emphasis on AI-enabled workflows. Success requires comfort working with real-world, project-based operational data that is often messy, fragmented, and shaped by schedules, budgets, contracts, field workflows, and legacy systems. Candidates with experience in construction, engineering, manufacturing, or other asset- and project-heavy industries will be particularly well suited to this position.

This is not a dashboard-only or “pipeline babysitting” role. You will be expected to ship end-to-end integrations into production, including authentication, incremental loads, failure handling, monitoring/alerting, and documentation, and to be comfortable working across messy source systems, defining data contracts, and delivering production-ready automations. (Construction Systems + AI Automation)

Requirements

  • BA/BS in Computer Science or a relevant technical field and 4 years of experience in data engineering, systems integration, backend engineering, or automation roles with end-to-end ownership of production systems, or a demonstrated equivalent combination of education and experience.
  • Strong SQL and experience with relational databases; ability to design schemas that support both operational usage and analytical access.
  • Proficiency in Python or a similar language for APIs, transformation logic, and automation tooling.

Nice To Haves

  • Experience with APIs and ETL/ELT workflows, including pagination, rate limits, retries, incremental loads, and secure credential handling (a minimal sketch of this pattern appears after this list).
  • Experience building systems with monitoring/alerting and clear failure recovery paths.
  • Experience integrating LLM/AI into automation workflows, including structured outputs, basic evaluation, and failure handling (hallucinations, low-confidence extraction, retries/fallbacks).
  • Experience with basic software engineering practices: version control, code review, testing, and secure secret management.
  • Ability to operate independently in a dynamic environment.
  • Experience designing data platforms that support multiple consumption modes: reporting, APIs, applications, and AI agents.
  • Familiarity with orchestration/integration frameworks and patterns (scheduling, triggers, queues/events, dependency management).
  • Experience deploying internal services/tools (admin panels, lightweight apps, APIs) used by non-technical teams.
  • Experience operationalizing AI/LLM workflows into repeatable processes (human-in-the-loop, confidence scoring, structured outputs, drift monitoring).
  • Understanding of data modeling, governance concepts, privacy/permissions, and system reliability.
  • Experience working with construction, engineering, manufacturing, or other project-based industries where data is tied to cost, schedule, contracts, and field operations.
  • Comfort working with non-technical stakeholders.
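
For illustration, the following is a minimal sketch of the incremental, paginated ingestion pattern referenced above (pagination, rate limits, retries, incremental loads, credentials read from the environment). The endpoint, field names, and cursor parameter are hypothetical placeholders, not any specific source system's API.

    import os
    import time
    import requests

    BASE_URL = "https://example-source-system.com/api/v1/records"   # hypothetical endpoint
    API_TOKEN = os.environ["SOURCE_API_TOKEN"]                      # secret kept out of source control

    def fetch_since(updated_after, page_size=200, max_retries=5):
        """Yield records updated after a timestamp, one page at a time."""
        headers = {"Authorization": f"Bearer {API_TOKEN}"}
        params = {"updated_after": updated_after, "limit": page_size, "cursor": None}
        while True:
            for attempt in range(max_retries):
                resp = requests.get(
                    BASE_URL,
                    params={k: v for k, v in params.items() if v is not None},
                    headers=headers,
                    timeout=30,
                )
                if resp.status_code == 429:        # rate-limited: back off and retry
                    time.sleep(2 ** attempt)
                    continue
                resp.raise_for_status()            # other HTTP errors surface immediately
                break
            else:
                raise RuntimeError("exhausted retries against source API")
            payload = resp.json()
            yield from payload["records"]
            params["cursor"] = payload.get("next_cursor")   # cursor-based pagination
            if params["cursor"] is None:                    # no more pages
                return

    # Incremental load: only pull rows changed since the last successful run,
    # e.g. for record in fetch_since(last_run_timestamp): upsert(record)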

Responsibilities

  • Design and maintain production-scale integrations (API, database, file-based, event-driven) that reliably deliver data into a central repository.
  • Build and support a centralized data platform that powers analytics, internal applications, and AI/LLM-enabled workflows.
  • Own integrations end-to-end: source ingestion → normalization → storage → access patterns (BI, APIs, apps, agents).
  • Implement data normalization and metadata standards so datasets can be reliably reused across teams and products.
  • Translate business needs into scalable technical solutions, including automation and AI-assisted workflows where appropriate.
  • Improve reliability through monitoring, alerting, observability, data validation, and failure recovery (retries, idempotency, backfills); see the idempotent upsert sketch after this list.
  • Contribute to architecture and engineering standards for integration patterns, data modeling, and AI-ready data access (clean interfaces, traceability, auditability).
  • Build and support internal tools and services (web apps, APIs, utilities) that allow business teams to discover, search, and operationalize data.
  • Enable AI/LLM use cases (semantic search, summarization, classification, structured extraction, agent workflows) by implementing repeatable pipelines, structured outputs, evaluation/QA, and human-in-the-loop controls; a structured-extraction sketch also follows this list.
  • Design and maintain data access patterns (views, APIs, query endpoints) that support low-latency application usage and governed analytics.
  • Evaluate and recommend tools/platforms pragmatically; the priority is outcomes and maintainability, not specific products.
  • Debug issues across integrations, data storage, and applications; write clear documentation including data lineage, contracts, and runbooks.
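
As a concrete illustration of the failure-recovery responsibility above (retries, idempotency, backfills), here is a minimal sketch of an idempotent upsert keyed on a source system's natural key, so that re-running a failed load or backfilling an old date range cannot create duplicate rows. The table and column names are hypothetical; the same pattern applies in any relational store (shown here with SQLite, which supports ON CONFLICT upserts in version 3.24+).

    import sqlite3

    conn = sqlite3.connect("central_repo.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS cost_records (
            source_id   TEXT PRIMARY KEY,   -- natural key from the source system
            project_id  TEXT,
            amount      REAL,
            updated_at  TEXT
        )
    """)

    def upsert(records):
        """Idempotent load: re-running the same batch (or a backfill) leaves one row per source_id."""
        conn.executemany(
            """
            INSERT INTO cost_records (source_id, project_id, amount, updated_at)
            VALUES (:source_id, :project_id, :amount, :updated_at)
            ON CONFLICT(source_id) DO UPDATE SET
                project_id = excluded.project_id,
                amount     = excluded.amount,
                updated_at = excluded.updated_at
            """,
            records,
        )
        conn.commit()

    # A failed run can simply be retried, and a backfill can re-read an old window of
    # source data, without either producing duplicate rows.
    upsert([{"source_id": "PCO-1042", "project_id": "PHX-17",
             "amount": 12500.0, "updated_at": "2024-05-01"}])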
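
Similarly, a minimal sketch of the structured-output and human-in-the-loop controls mentioned in the AI/LLM responsibility: call_llm stands in for whichever model client is eventually chosen, and the field list, confidence threshold, and review routing are illustrative assumptions, not a prescribed design.

    import json

    REQUIRED_FIELDS = {"vendor", "contract_value", "retainage_pct"}   # hypothetical extraction schema
    CONFIDENCE_THRESHOLD = 0.85                                       # illustrative cutoff

    def extract_contract_terms(document_text, call_llm):
        """Request a structured JSON extraction and route low-confidence results to a human reviewer."""
        prompt = (
            "Extract vendor, contract_value, and retainage_pct from the contract below. "
            "Respond with JSON only, including a 'confidence' score between 0 and 1.\n\n"
            + document_text
        )
        raw = call_llm(prompt)                     # placeholder for the chosen model client
        try:
            result = json.loads(raw)
        except json.JSONDecodeError:               # unparseable output is a failure, not a guess
            return {"status": "needs_review", "reason": "unparseable output", "raw": raw}

        missing = REQUIRED_FIELDS - result.keys()
        if missing:                                # schema check catches partial extractions
            return {"status": "needs_review", "reason": f"missing fields: {sorted(missing)}", "raw": raw}
        if result.get("confidence", 0.0) < CONFIDENCE_THRESHOLD:
            return {"status": "needs_review", "reason": "low confidence", "raw": raw}

        return {"status": "accepted", "data": result}   # high-confidence results flow on automatically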

Benefits

  • AECOM benefits may include medical, dental, vision, life, AD&D, disability benefits, paid time off, leaves of absence, voluntary benefits, perks, flexible work options, well-being resources, employee assistance program, business travel insurance, service recognition awards, retirement savings plan, and employee stock purchase plan.