AECOM

Trusted global infrastructure consulting firm delivering engineering, design, and construction management services.

Employees: 51,000
Industries: Industrial; Agriculture, Food & Beverage; Automotive, Heavy Equipment & Machinery; Data Centers & Digital Infrastructure; High Performance Logistics; Energy; Renewable Energy; Grid Modernization; Transportation; Transit and Rail Projects; Highways; Bridges; Aviation Facilities; Facilities Management; Educational Institutions; Healthcare Facilities; Commercial Buildings; Corrections Facilities; Urban and Regional Planning; Building Construction; Oil and Gas Industry; Maritime Facilities; Water Management Systems

Data & Integration Engineer - Phoenix

Build production-grade data integrations and AI workflows for construction operations.

Phoenix, Arizona, United States
74k - 138k USD
Full Time
Junior (1-3 years)

Job Highlights

Environment
Office Full-Time
Visa Sponsorship
This position does not include sponsorship for United States work authorization now or in the future.
Security Clearance
Qualified applicants who are offered a position must pass a pre-employment substance abuse test.

About the Role

• Design and maintain production-scale integrations (API, database, file-based, event-driven) that reliably deliver data into a central repository.
• Build and support a centralized data platform that powers analytics, internal applications, and AI/LLM-enabled workflows.
• Implement data normalization and metadata standards so datasets can be reliably reused across teams and products.
• Translate business needs into scalable technical solutions, including automation and AI-assisted workflows where appropriate.
• Improve reliability through monitoring, alerting, observability, data validation, and failure recovery (retries, idempotency, backfills).
• Contribute to architecture and engineering standards for integration patterns, data modeling, and AI-ready data access (clean interfaces, traceability, auditability).
• Build and support internal tools and services (web apps, APIs, utilities) that allow business teams to discover, search, and operationalize data.
• Enable AI/LLM use cases (semantic search, summarization, classification, structured extraction, agent workflows) by implementing repeatable pipelines, structured outputs, evaluation/QA, and human-in-the-loop controls.
• Design and maintain data access patterns (views, APIs, query endpoints) that support low-latency application usage and governed analytics.
• Evaluate and recommend tools/platforms pragmatically; the priority is outcomes and maintainability, not specific products.
• Debug issues across integrations, data storage, and applications; write clear documentation including data lineage, contracts, and runbooks.
• Build 3+ production integrations into the central repository.
• Deliver at least one "data-to-application" outcome (internal tool, API, or agent workflow) adopted by the business.
• Implement monitoring/alerting and documentation (runbooks, lineage, contracts) so the system is supportable.
• Establish repeatable patterns for structured extraction and AI automation, including QA/evaluation.
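The reliability expectations above (retries, incremental loads, idempotency, restartable backfills) can be sketched in miniature. This is an illustrative sketch only: the paginated source, in-memory store, and cursor state are stand-ins for real systems, not anything specific to this role.

```python
import time

def fetch_page(cursor, source):
    """Simulated paginated source: returns (records, next_cursor).
    A real integration would make an authenticated HTTP call here."""
    page = source[cursor:cursor + 2]
    next_cursor = cursor + 2 if cursor + 2 < len(source) else None
    return page, next_cursor

def upsert(store, record):
    """Idempotent write keyed on record id, so replays are safe."""
    store[record["id"]] = record

def run_incremental_load(source, store, state, max_retries=3):
    """Resume from the last committed cursor; retry transient failures."""
    cursor = state.get("cursor", 0)
    while cursor is not None:
        for attempt in range(max_retries):
            try:
                records, next_cursor = fetch_page(cursor, source)
                break
            except ConnectionError:
                time.sleep(2 ** attempt)  # exponential backoff
        else:
            raise RuntimeError(f"page {cursor} failed after {max_retries} tries")
        for rec in records:
            upsert(store, rec)
        cursor = next_cursor
        state["cursor"] = cursor  # commit progress so a restart resumes here
    return store
```

Because writes are keyed upserts and the cursor is committed after each page, rerunning the load after a crash (or replaying it entirely) produces the same result rather than duplicates.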

Key Responsibilities

  • Integration development
  • Data platform
  • Data normalization
  • Monitoring
  • AI pipelines
  • Tooling

What You Bring

The role sits within a construction delivery organization, requiring comfort with messy, fragmented, project-based data shaped by schedules, budgets, contracts, and legacy systems. Candidates with experience in construction, engineering, manufacturing, or other asset-heavy industries are especially well suited. The position is on-site in Phoenix, AZ, with no remote option or relocation assistance, and involves end-to-end ownership of integrations, from authentication to monitoring and documentation. Integrations must be production-grade, supporting secure authentication, incremental loads, observability, and recoverability. Typical source systems include ERP, project management, document management, and scheduling tools, while the central repository must support structured tables, document metadata, and vector/semantic indexing for AI retrieval workflows. A tool-agnostic approach is emphasized, prioritizing strong engineering fundamentals over specific vendors. The position does not offer sponsorship for U.S. work authorization, and qualified applicants who are offered a position must pass a pre-employment substance abuse test.

• Own integrations end-to-end: source ingestion → normalization → storage → access patterns (BI, APIs, apps, agents).
• BA/BS in Computer Science or a related field and 4 years of experience in data engineering, systems integration, backend engineering, or automation, with end-to-end ownership of production systems.
• Proficiency in SQL and relational databases; ability to design schemas for operational usage and analytical access.
• Experience with Python or a similar language for APIs, transformation logic, and automation tooling.
• Experience with APIs and ETL/ELT workflows, including pagination, rate limits, retries, incremental loads, and secure credential handling.
• Experience building systems with monitoring/alerting and clear failure recovery paths.
• Experience integrating LLM/AI into automation workflows, including structured outputs, basic evaluation, and handling hallucinations or low-confidence extraction.
• Familiarity with software engineering best practices: version control, code review, testing, and secure secret management.
• Ability to operate independently in a dynamic environment.
• Experience designing data platforms that support reporting, APIs, applications, and AI agents.
• Knowledge of orchestration/integration frameworks and patterns (scheduling, triggers, queues/events, dependency management).
• Experience deploying internal services/tools (admin panels, lightweight apps, APIs) for non-technical teams.
• Experience operationalizing AI/LLM workflows into repeatable processes (human-in-the-loop, confidence scoring, structured outputs, drift monitoring).
• Understanding of data modeling, governance concepts, privacy/permissions, and system reliability.
• Experience working with construction, engineering, manufacturing, or other project-based industries where data is tied to cost, schedule, contracts, and field operations.
• Comfort working with non-technical stakeholders.
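The structured-output and human-in-the-loop expectations above can be illustrated with a minimal sketch. Here `llm_extract` is a hypothetical stub standing in for a real model call, and the field names and confidence threshold are illustrative assumptions, not part of this role's stack:

```python
import json

def llm_extract(document: str) -> str:
    """Hypothetical stand-in for an LLM call that was asked to return JSON.
    A real pipeline would invoke a model here."""
    return json.dumps({"vendor": "Acme Concrete", "amount": 12500.0,
                       "confidence": 0.62})

REQUIRED_FIELDS = {"vendor", "amount"}
CONFIDENCE_FLOOR = 0.8  # below this, route the draft to a human reviewer

def extract_with_review(document: str) -> dict:
    """Parse the structured output, validate the schema, and gate on confidence."""
    try:
        payload = json.loads(llm_extract(document))
    except json.JSONDecodeError:
        return {"status": "needs_review", "reason": "unparseable output"}
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        return {"status": "needs_review", "reason": f"missing {sorted(missing)}"}
    if payload.get("confidence", 0.0) < CONFIDENCE_FLOOR:
        return {"status": "needs_review", "reason": "low confidence",
                "draft": payload}
    return {"status": "accepted", "record": payload}
```

The pattern is the point, not the stub: every model output passes through schema validation and a confidence gate, so hallucinated or incomplete extractions land in a review queue instead of the data platform.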

Requirements

  • SQL
  • Python
  • ETL
  • LLM
  • BS in Computer Science
  • Construction

Benefits

AECOM provides a comprehensive benefits package that may include medical, dental, vision, life, disability, paid time off, flexible work options, well‑being resources, retirement savings, employee stock purchase, and various voluntary perks, depending on employment status. As a Fortune 500 global infrastructure leader with $16.1 billion in revenue in FY 2024, AECOM offers cutting‑edge technology, award‑winning training, and a collaborative environment that supports professional growth and the ability to make a real impact on local and global projects.

Work Environment

Office Full-Time

Apply Now