Description
Keywords: kafka integration, kafka monitoring, glue pipelines, aws integration, event-driven, stakeholder collaboration
We are seeking a highly skilled and experienced Senior Integration Engineer to join our dynamic team. This role will be pivotal in designing, developing, and maintaining robust integration solutions, with a strong focus on leveraging Apache Kafka for real-time data streaming and AWS integration services, particularly AWS Glue for ETL and data cataloging. The ideal candidate will have a deep understanding of enterprise integration patterns, event-driven architectures, and cloud-native solutions, and will collaborate closely with engineering, product, and business teams to ensure seamless data flow across our system landscape.

The Senior Integration Engineer will design, develop, and maintain scalable integration solutions, optimize Kafka clusters, build ETL pipelines with AWS Glue, and integrate cloud-based applications using AWS services. They will champion best practices for event-driven and microservices integration, collaborate with architects and stakeholders, and ensure security, monitoring, and documentation standards are met.
- Design, develop, and maintain scalable integration solutions using Apache Kafka (Kafka Connect, Streams, Schema Registry).
- Monitor, troubleshoot, and optimize Kafka cluster performance for throughput, latency, and resource utilization.
- Build ETL/ELT pipelines with AWS Glue and manage the Glue Data Catalog for data quality and governance.
- Integrate cloud applications, databases, and third‑party services using AWS services such as Lambda, S3, API Gateway, SNS, SQS, and EventBridge.
- Implement best practices for event‑driven architecture, microservices integration, and API development.
- Collaborate with architects, data engineers, developers, and business stakeholders to translate requirements into technical designs.
- Optimize existing integration solutions to improve performance, reliability, and scalability.
- Create and maintain comprehensive documentation, monitoring, alerting, and logging for integration solutions.
- Mentor junior engineers and provide technical leadership within the integration domain.
- Stay current with emerging technologies and industry trends in data integration, Kafka, and AWS.
Requirements
Keywords: apache kafka, aws, python, sql, git, confluent certified
- Bachelor's degree in Computer Science, Engineering, IT, or a related field, or equivalent experience.
- 5+ years of experience designing and developing enterprise‑level integration solutions.
- Deep expertise with Apache Kafka, including producers, consumers, topics, connectors, and performance tuning.
- Strong experience with AWS services, especially AWS Glue, S3, Lambda, API Gateway, Kinesis, Redshift, RDS, and DynamoDB.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with SQL and NoSQL databases and data formats like JSON, Avro, Parquet, and XML.
- Solid understanding of data modeling, data warehousing, ETL/ELT processes, and integration patterns.
- Familiarity with version control (Git) and CI/CD pipelines.
- Excellent analytical, problem‑solving, communication, and interpersonal skills.
- Preferred: Confluent Certified Developer, AWS certifications, experience with Airflow, Talend, Informatica, Docker/Kubernetes, Terraform/CloudFormation, and data governance/security best practices.
Benefits
GFL is committed to equal opportunity and provides accommodations for applicants who need them. We value our people, offering growth, learning, mentorship, and a purpose‑driven environment focused on safety, sustainability, and environmental solutions.
GFL offers career paths in Field Operations and Professional Services, supporting growth and skill expansion across North America.