

Republic Services is a provider of non‑hazardous solid waste collection, recycling, disposal and energy services.
The Data Modeler III is a hands‑on technical role focused on full‑stack software development within Republic Services’ Enterprise Data organization. The position will play a crucial part in shaping future big‑data and analytics initiatives for the company.
In this role the incumbent will design and implement a Medallion Architecture, dimensional star and snowflake schemas, and metadata‑driven modeling approaches for enterprise data warehouses. They will develop canonical and semantic data models with effective handling of slowly changing dimensions, write and optimize advanced SQL queries in Snowflake, and align models with ELT/ETL pipelines and analytics frameworks to create scalable structures.
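A minimal sketch of the kind of Type 2 slowly changing dimension load this work involves, assuming a hypothetical dim_customer dimension and stg_customer staging table in Snowflake, might look like the following; all table, column, and connection names are placeholders, not details from the posting.

```python
# Illustrative sketch only: a simplified Type 2 slowly changing dimension load
# against Snowflake using snowflake-connector-python. All table, column, and
# connection names below are hypothetical placeholders.
import snowflake.connector

# Step 1: close out current dimension rows whose tracked attributes changed.
EXPIRE_CHANGED_ROWS = """
MERGE INTO dim_customer AS tgt
USING stg_customer AS src
  ON tgt.customer_id = src.customer_id
 AND tgt.is_current = TRUE
WHEN MATCHED AND (tgt.address <> src.address OR tgt.segment <> src.segment) THEN
  UPDATE SET tgt.is_current = FALSE,
             tgt.valid_to   = CURRENT_TIMESTAMP()
"""

# Step 2: insert a new current version for keys with no open row
# (new customers plus the rows just expired above).
INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, address, segment, valid_from, valid_to, is_current)
SELECT src.customer_id, src.address, src.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer AS src
LEFT JOIN dim_customer AS tgt
  ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHERE tgt.customer_id IS NULL
"""

def load_dimension() -> None:
    # Connection details would come from secure configuration in practice.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="my_wh", database="my_db", schema="my_schema",
    )
    try:
        cur = conn.cursor()
        cur.execute(EXPIRE_CHANGED_ROWS)   # expire rows whose attributes changed
        cur.execute(INSERT_NEW_VERSIONS)   # add the new current version of each key
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    load_dimension()
```

The two statements reflect the usual Type 2 pattern: a MERGE expires the previously current row by setting its end date, and a follow-up INSERT creates the new current version, preserving full history for point-in-time analysis.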
Responsibilities also include building data pipelines that ingest from relational databases such as Oracle, SQL Server, DB2 and Aurora, as well as from file shares, web services, and streaming sources like Kinesis and Kafka. The Data Modeler will construct a performant Data Lake on AWS S3, leverage AWS Glue, Informatica, EMR, Spark, Athena and Python for data engineering, and develop JavaScript modules and REST APIs with MarkLogic to support complex searches.
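A minimal sketch of one such pipeline step, assuming a hypothetical orders table pulled over JDBC and landed as partitioned Parquet in an example S3 bucket, could look like this in PySpark; hosts, credentials, table names, and paths are all placeholders.

```python
# Illustrative sketch only: a small PySpark job that reads a relational table
# over JDBC and writes it to the raw zone of an S3 data lake as Parquet.
# Connection details, table names, and bucket paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("ingest-orders-to-data-lake")
    .getOrCreate()
)

# Extract: read the source table over JDBC (Oracle, SQL Server, DB2, or Aurora
# work the same way given the appropriate driver and URL).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//source-host:1521/ORCL")
    .option("dbtable", "sales.orders")
    .option("user", "etl_user")
    .option("password", "etl_password")
    .load()
)

# Light transformation: stamp each row with a load timestamp and derive a
# partition column from it.
orders = (
    orders
    .withColumn("load_ts", F.current_timestamp())
    .withColumn("load_date", F.to_date(F.col("load_ts")))
)

# Load: write partitioned Parquet to S3, where it can be catalogued by an
# AWS Glue crawler and queried with Athena.
(
    orders.write
    .mode("append")
    .partitionBy("load_date")
    .parquet("s3://example-data-lake/raw/sales/orders/")
)
```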
The candidate will participate in requirements definition and in system and data‑architecture design, and will follow Agile development methodologies throughout the software lifecycle. Collaboration with cross‑functional teams and strong communication skills are essential.
Required qualifications include five or more years of experience with data‑modeling tools (ERWIN, Lucid, sqlDBM), seven or more years in enterprise information solution architecture and integration (SOA, micro‑services, ETL), and five years of hands‑on experience with MarkLogic and AWS services such as S3, Kinesis, Lambda, Athena, Glue and EMR. Additional experience with analytics tools (SAS, R, Python), web development (Angular, JavaScript, Node.js), JSON/XML modeling, Git, BI platforms, and machine learning is expected.
Minimum qualifications are a bachelor’s degree in Computer Science, Information Systems, Engineering, Statistics or a comparable field, proven experience with AWS data and analytics services, and at least five years of experience in data ingestion, extraction and integration.
Compensation ranges from $109,500 to $164,300 annually, dependent on experience, and includes a comprehensive benefits package. Eligible employees may elect coverage for medical, dental and vision care, health‑care and dependent‑care spending accounts, short‑ and long‑term disability, life and accidental death insurance, an employee assistance program, discount programs, a retirement plan with company match, an employee stock purchase plan and paid time off.