Senior Data Engineer

Swinerton
A leading construction company offering services in building, infrastructure, and energy sectors.
Design and optimize Azure/Databricks data pipelines and lakehouse solutions.
16 days ago
$160,000 - $180,000
Intermediate (4-7 years), Experienced (8-12 years), Expert & Leadership (13+ years)
Full Time
Arvada, CO
Office Full-Time
Company Size
1,300 Employees
Service Specialisms
Construction services
Project Management
Consulting
Engineering
Architecture
Property Development
Design
Technical Services
Sector Specialisms
Healthcare
Life Sciences
Affordable Housing
Aviation
Parking Structures
Industrial
Energy
Mass Timber Construction
Role
What you would be doing
ETL Pipelines
Data Quality
Data Governance
Streaming Pipelines
Data Architecture
Workflow Orchestration
  • Ensure data quality, consistency, and reliability through robust validation, monitoring, and error-handling processes.
  • Design, build, and maintain scalable ETL/ELT pipelines for ingesting, transforming, and delivering data from diverse sources (structured, semi-structured, unstructured) using Azure Data Factory, Databricks, and related tools.
  • Mentor and support junior engineers, promoting a culture of learning and excellence.
  • Implement privacy and security best practices in data pipelines (e.g., data masking, encryption, role-based access control).
  • Architect, optimize, and evolve data storage solutions and data models, including medallion architecture (bronze, silver, gold layers), for scalable and cost-efficient lakehouse platforms on Azure and Databricks (see the first sketch after this list).
  • Optimize data engineering capabilities by leveraging existing and emerging tools and technologies, with a focus on performance, cost efficiency, scalability, data workflow management, and reliable deployment.
  • Integrate data from diverse internal and external sources, ensuring interoperability and consistency across platforms.
  • Foster alignment and adoption of data engineering solutions by building strong relationships and ensuring business relevance.
  • Build and manage real-time/streaming data pipelines using Azure Event Hubs, Kafka, or Spark Structured Streaming (see the second sketch after this list).
  • Engage with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that drive business value.
  • Collaborate with data scientists to enable seamless integration of data pipelines with analytics and machine learning workflows, including ML model deployment and monitoring.
  • Implement workflow orchestration tools (e.g., Airflow, dbt) for managing complex data workflows.
  • Apply data governance, security, privacy, and compliance standards within engineering solutions, following organizational and regulatory guidelines.
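As a rough illustration of the medallion pattern referenced above, the following PySpark sketch promotes raw bronze records into a validated silver Delta table on Databricks. The catalog, table, and column names (lakehouse.bronze.raw_events, event_id, event_ts) are placeholder assumptions for illustration only, not details from the posting.

```python
# Minimal sketch: bronze -> silver promotion in a Delta Lake medallion layout.
# All table and column names below are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw ingested records, kept as delivered.
bronze = spark.read.table("lakehouse.bronze.raw_events")

# Silver: typed, de-duplicated, validated records.
silver = (
    bronze
    .withColumn("event_ts", F.to_timestamp("event_ts"))   # cast raw strings to timestamps
    .dropDuplicates(["event_id"])                          # remove replayed records
    .filter(F.col("event_id").isNotNull())                 # basic data-quality gate
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("lakehouse.silver.events"))
```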
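And a minimal sketch of the streaming responsibility, assuming ingestion from Azure Event Hubs through its Kafka-compatible endpoint into a bronze Delta table with Spark Structured Streaming. The namespace, event hub name, connection string, checkpoint path, and target table are placeholder assumptions.

```python
# Minimal sketch: streaming ingest from Azure Event Hubs (Kafka-compatible
# endpoint) into a bronze Delta table. All names and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "<event-hub-name>")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="$ConnectionString" password="<event-hubs-connection-string>";',
    )
    .option("startingOffsets", "latest")
    .load()
)

# Land the raw payload plus broker timestamp; downstream jobs refine it.
(stream
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/bronze_events")
    .toTable("lakehouse.bronze.events_stream"))
```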
What you bring
Azure Databricks
SQL
Python
Airflow
CI/CD
5+ years
  • Skilled in handling structured, semi-structured (e.g., JSON, XML), and unstructured data.
  • Advanced experience with Databricks, including Spark, Delta Lake, and medallion architecture.
  • Excellent problem-solving, critical thinking, and communication skills; ability to work independently and collaboratively with cross-functional teams.
  • 5+ years in data engineering or related roles, with a track record of delivering production-ready data solutions.
  • Construction industry experience is preferred but not required.
  • Proven ability to work independently in a remote or hybrid environment with minimal supervision, while adapting to shifting priorities and evolving business needs.
  • Bachelor’s degree in Computer Science, Engineering, or a related quantitative field required; Master’s degree preferred. Equivalent work experience will also be considered.
  • Deep expertise in Azure data services (e.g., Azure Databricks, Azure Data Lake).
  • Experience with real-time/streaming data solutions (e.g., Azure Event Hubs, Kafka, Spark Structured Streaming).
  • Experience with GenAI/LLM integration (e.g., vector databases, RAG pipelines) is a plus.
  • Strong experience with data modeling, data warehousing, lakehouse design patterns, and data governance best practices.
  • Experience with workflow orchestration tools (e.g., Airflow, dbt).
  • Proficiency in SQL and Python (including pandas, PySpark, or similar frameworks).
  • Ability to connect business needs with technical capabilities, ensuring solutions are scalable and value-driven.
  • Experience with containerization (Docker, Kubernetes) and/or serverless compute (Azure Functions) is a plus.
  • Communicate complex technical concepts through clear, actionable insights and documentation tailored to both technical and non-technical audiences.
  • Experience with CI/CD for data pipelines and infrastructure as code (e.g., Azure DevOps, GitHub Actions, Terraform).
Benefits
No information provided.
Training + Development
No information provided.
Company
Overview
130+ Years
Established as a Construction Leader
Pioneered the industry and built a legacy of trust and expertise over a century of operation.
$10 Billion
Annual Project Value Delivered
Demonstrates consistent performance and scale in managing large and complex construction projects.
  • Tackles projects ranging from high-rise buildings to energy facilities with a diverse portfolio.
  • Known for adaptability in delivering both traditional and cutting-edge projects.
  • Operates across multiple sectors, including residential, commercial, energy, and industrial developments.
  • Combines technical knowledge with a focus on quality and value in every project.
  • Recognized for expertise in large-scale infrastructure, such as bridges, airports, and transport hubs.
  • Achievements include iconic skyscrapers and sustainable energy projects.
Culture + Values
  • Safety is our top priority and a core value.
  • We are committed to teamwork and collaboration.
  • We believe in delivering exceptional quality.
  • We prioritize innovation and continuous improvement.
  • We are dedicated to building lasting relationships with clients, partners, and communities.
  • Integrity and honesty guide our decisions and actions.
  • We value diversity and respect different perspectives.
  • We invest in our people through training and development.
Environment + Sustainability
Net-zero by 2040
Carbon emissions commitment
Targeting net-zero carbon emissions by 2040.
200+ LEED-Certified
Sustainable building achievements
Has delivered more than 200 LEED-certified building projects, reflecting its commitment to green construction.
  • Focusing on reducing energy consumption, waste, and water usage in its operations.
  • Actively implementing sustainable building practices, including LEED certification for projects.
  • Working to minimize environmental impact across all stages of construction, from design to delivery.
Inclusion & Diversity
25% Women
Leadership Representation
Women hold 25% of leadership roles at Swinerton.
  • Committed to creating an inclusive workplace and promoting diversity at all levels.
  • Aims to increase diversity in its talent pipeline through outreach programs and strategic partnerships.
  • Monitors and reports on diversity metrics with a focus on growth in underrepresented groups.