Description
Key areas: data architecture, pipeline development, ETL development, data modeling, performance tuning, data quality
The Senior Data & Analytics Engineer is responsible for ingesting, transforming and delivering project, financial and operational data across construction projects. Working primarily within the Microsoft Fabric ecosystem, this role builds the pipelines, data models and dashboards that provide real‑time insights to project teams and executives.
- Deliver well‑defined, transformed, tested, documented and code‑reviewed datasets for analysis
- Design the structure and layout of databases, data warehouses and data lakes
- Build and optimize pipelines using Azure Data Factory, Fabric Pipelines and Notebooks
- Create robust data models and architectures to support analytics initiatives
- Collaborate with business stakeholders to understand analytics needs and deliver comprehensive models
- Define and manage standards, guidelines and processes to ensure data quality
- Identify and implement optimizations to enhance query performance and reduce processing time
- Develop and maintain efficient, reliable ETL pipelines
- Standardize KPIs for cost, schedule and risk with project teams
- Implement data validation, monitoring and error‑handling to ensure accuracy
Requirements
Key qualifications: 5+ years' experience; SQL, Azure, Python, Spark, Power BI
Candidates must have at least five years of experience in data analytics, engineering or software development, with strong skills in data modeling, ETL processes and SQL. A bachelor’s degree in a related field is required, and experience with Azure‑based platforms, Python or Scala, and construction‑industry data sources is preferred. Knowledge of data governance, security best practices and Power BI visualization is also essential.
- Minimum 5 years' experience in data analytics, data engineering, software engineering or a similar role
- Expertise in data modeling, ETL development and data analysis
- Experience with construction industry data and project‑based analytics (desired)
- Experience with Apache Spark or Databricks
- Bachelor’s degree in computer science, data science, engineering or related field
- Strong SQL skills and proficiency with Microsoft Fabric components (Data Factory, Lakehouse, OneLake)
- Familiarity with Azure‑based data platforms for storage and processing
- Programming proficiency in Python, Scala and SQL for data manipulation and scripting
- Understanding of data governance, quality and security best practices
- Knowledge of construction ERP (CMiC), HCM (Workday) and CRM (Dynamics) integrations
- Strong problem‑solving, critical and analytical thinking abilities
- Familiarity with Power BI for data visualization
- Excellent communication skills for cross‑functional collaboration and stakeholder presentations
Benefits
Employees enjoy a comprehensive benefits package that includes health, dental and vision insurance, an Employee Stock Ownership Plan, and a 401(k) plan with up to a 4% company match. Paid time off is generous, covering vacation, sick leave, twelve holidays, summer Fridays and an annual volunteer day. Additional perks include a company‑provided cell phone and laptop, tuition reimbursement, pet insurance, financial planning services and more.
Shawmut is an equal‑opportunity employer that prohibits discrimination based on any protected characteristic. The Boston base salary range for this position is $125,000 – $175,000, with placement depending on experience, project scope and internal equity.