
Autodesk
Design and make software for architecture, engineering, construction, and entertainment industries.
Data Engineer - Toronto
Senior Data Engineer designing scalable ETL pipelines using Snowflake, dbt, Spark, and Airflow.
Job Highlights
About the Role
In this role you will apply a product‑focused mindset to understand business requirements and architect scalable, extensible systems. You will participate in or lead design reviews to evaluate and select suitable technologies, and develop ETL processes and data‑processing workflows using tools such as Snowflake, dbt, and Apache Spark. Building data‑quality tracking mechanisms, detecting anomalies, and addressing changes in data ingestion are also key responsibilities. You will review code for best practices, ensuring accuracy, testability, and efficiency, and help establish engineering standards through design and code reviews as well as process improvements. The position requires a Bachelor's degree in Computer Science, Engineering, or equivalent experience, plus at least five years of experience with ETL/ELT tools and data curation.
- Lead or contribute to design reviews to select optimal technologies.
- Develop ETL/ELT pipelines using Snowflake, dbt, Apache Spark, PySpark, and Airflow.
- Implement data‑quality tracking, detect anomalies, and address ingestion changes.
- Review code for accuracy, testability, and efficiency; enforce best practices.
- Establish engineering standards through design/code reviews and process improvements.
Key Responsibilities
- Design reviews
- ETL pipelines
- Data quality
- Code review
- Engineering standards
- Scalable architecture
What You Bring
We are seeking a Senior Data Engineer to join our Data and Analytics team and support the data needs of the Product Development & Manufacturing Solutions (PDMS) organization. The role operates within a robust ecosystem that includes dbt, PySpark, Python, Airflow, Snowflake, Hive, and AWS. Our ideal candidate is an experienced data engineer skilled in data warehousing, eager to learn new technologies, detail‑oriented, quality‑focused, and enthusiastic about making a significant impact with data at Autodesk. Strong proficiency in Python and SQL and a solid grasp of data‑warehousing concepts, dimensional modeling, and relational databases are essential, along with familiarity with Snowflake or similar data platforms. Candidates should have a commitment to continuous learning, a desire to uncover the "why" behind business needs, and strong communication, problem‑solving, and interpersonal skills. Preferred qualifications include experience with dbt, PySpark, and Airflow; knowledge of Spark and the Hadoop 2.0 ecosystem; and familiarity with automation tools such as Git and Jenkins, as well as experience on Agile Scrum teams.
- Architect scalable, extensible data systems aligned with product requirements.
- Hold a Bachelor's degree in Computer Science, Engineering, or equivalent experience.
- Possess 5+ years of experience with ETL/ELT tools and data curation.
- Proficient in Python, SQL, dimensional modeling, and relational databases.
- Familiar with Snowflake or similar data ingestion platforms.
- Demonstrate strong analytical, communication, and problem‑solving skills.
- Experience with dbt, PySpark, Airflow, Spark, and the Hadoop ecosystem (preferred).
- Knowledge of automation tools such as Git and Jenkins; Agile Scrum familiarity (preferred).
Requirements
- Python
- SQL
- Snowflake
- dbt
- Bachelor's degree
- 5+ years' experience
Benefits
Autodesk creates innovative software that enables greener buildings, cleaner cars, smarter factories, and blockbuster movies, and we take pride in a culture that guides how we work, treat each other, and connect with customers and partners. Employees enjoy a competitive compensation package; for Canada‑BC based roles, the starting base salary ranges from $86,600 to $127,050, plus bonuses, stock grants, and comprehensive benefits.
Work Environment
Office, full-time