Description
Key focus areas: data architecture, pipeline automation, data governance, cloud integration, performance optimization, and stakeholder liaison.
As a Data Engineer II, you will play a key leadership role in the Enterprise Data Engineering organization by architecting, building, and optimizing scalable data solutions that power advanced analytics and business intelligence. This position requires deep expertise in data engineering, cloud platforms, and modern data architectures. You will lead complex projects, mentor junior engineers, and ensure best practices in data governance, security, and performance across enterprise systems.
You will also build strong relationships with business stakeholders, guiding them through requirements discovery and helping them articulate needs that translate into actionable technical solutions. Additionally, you will contribute to the development, documentation, and enforcement of data engineering standards in collaboration with the Data Engineering Lead, ensuring alignment with enterprise architecture and compliance frameworks.
- Evaluate emerging technologies and recommend improvements to data engineering practices.
- Develop trusted partnerships with business leaders, analysts, and technical teams to understand objectives and challenges.
- Lead the design and architecture of advanced data solutions.
- Create, maintain, and enforce standards for data modeling, ETL/ELT processes, transformation logic, and pipeline orchestration under the guidance of the Data Engineering Lead.
- Ensure data quality and governance across the enterprise.
- Act as a liaison between technical teams and business units, translating needs into scalable, well-documented solutions.
- Champion automation and performance optimization initiatives across pipelines and workflows.
- Facilitate requirements-gathering sessions, asking the right questions to uncover critical details and ensure clarity.
- Provide feedback on engineering practices and drive adoption of best practices within project teams.
- Architect and implement data pipelines and frameworks using dbt, Snowflake, and cloud services (AWS, Azure) across medallion architecture layers (bronze, silver, gold).
- Design and manage integrations with enterprise systems (SAP, Salesforce, Manufacturing Efficiency Systems) and external data sources at scale.
- Drive adherence to governance, security, and audit requirements across all data assets.
- Ensure consistency and quality across all data assets by implementing coding standards, naming conventions, and documentation protocols.
- Optimize data ingestion for performance, reliability, and cost efficiency.
- Implement automated data quality frameworks and monitoring solutions to ensure accuracy, consistency, and compliance.
Requirements
Key skills: Python, SQL, dbt, AWS, Snowflake; 4+ years of experience.
- Strong understanding of ETL processes, data modeling (including dimensional modeling), and medallion architecture.
- Experience with database technologies (SQL, NoSQL, Oracle) and API development is highly preferred.
- Skilled at defining and championing standards that improve team efficiency and data reliability.
- Proven ability to own and optimize complex integrations.
- Advanced proficiency in Python, SQL, and transformation tools such as dbt.
- Bachelor’s or Master’s degree in Computer Science, Software or Computer Engineering, or a related field, or equivalent work experience.
- Hands-on experience designing and implementing complex ETL/ELT workflows using tools like Azure Data Factory (ADF), AWS Glue, Informatica, or equivalent.
- Ability to lead technical discussions, mentor team members, and communicate effectively with stakeholders.
- Strong relationship-building skills with the ability to influence and guide stakeholders through requirements discovery.
- Strong analytical and problem-solving skills.
- 4+ years of data engineering experience with proven leadership in complex projects.
- Expertise with cloud-based services platforms (e.g., AWS, Azure) and modern data warehouses/lakehouses (e.g., Snowflake, Databricks).
- Must currently live in the Nashville area.
- Experience integrating data with reporting platforms such as Tableau and Power BI.
Benefits
LP offers competitive salaries and comprehensive benefits and programs, including health and welfare benefits, a 401(k) program, career mobility, tuition reimbursement, volunteer opportunities, profit sharing, and more.
- This position works from home and in our Nashville office on a hybrid schedule.