
Quest Global
Solving the world’s hardest engineering challenges through end‑to‑end solutions across industries.
Data Engineering Architect
Lead design and development of enterprise-scale GCP data platform
About the Role
In this role you will lead the architecture, design, and development of a new enterprise data platform, working closely with other cloud and security architects to align with the overall technology strategy. Collaboration with data scientists, analysts, and cross-functional teams will be essential to create robust data models and storage solutions.
- Architect and design core components using a microservices approach.
- Create and maintain data platform SDKs and libraries following best practices.
- Develop connector frameworks for on-prem and cloud data sources (see the sketch after this list).
- Optimize storage, processing, and query performance for large datasets while controlling costs.
- Define and implement security patterns and practices.
- Build data quality frameworks to ensure accuracy and reliability.
- Collaborate with data scientists, analysts, and cross-functional teams to design data models and schemas.
- Develop advanced analytics and machine learning capabilities on the platform.
- Establish observability and data governance frameworks.
- Stay current with data engineering trends and emerging technologies.
- Drive deployment and release cycles to deliver a robust, scalable platform.
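For illustration, here is a minimal sketch of what a pluggable connector interface could look like in Python. This is an assumption about shape, not the platform's actual code; every class, method, and driver named here is hypothetical or one possible choice among many.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Iterator, List


class SourceConnector(ABC):
    """Contract every data-source connector implements."""

    @abstractmethod
    def connect(self) -> None:
        """Open the connection (credentials, network, session setup)."""

    @abstractmethod
    def read_batches(self, batch_size: int = 1000) -> Iterator[List[Dict[str, Any]]]:
        """Yield records in batches so large sources stream without exhausting memory."""

    @abstractmethod
    def close(self) -> None:
        """Release the connection cleanly."""


class PostgresConnector(SourceConnector):
    """Hypothetical on-prem example; any DB-API driver would work the same way."""

    def __init__(self, dsn: str, table: str) -> None:
        self.dsn, self.table = dsn, table
        self._conn = None

    def connect(self) -> None:
        import psycopg2  # assumed driver, installed separately
        self._conn = psycopg2.connect(self.dsn)

    def read_batches(self, batch_size: int = 1000) -> Iterator[List[Dict[str, Any]]]:
        with self._conn.cursor() as cur:
            cur.execute(f"SELECT * FROM {self.table}")  # table name is trusted config here
            columns = [d[0] for d in cur.description]
            while rows := cur.fetchmany(batch_size):
                yield [dict(zip(columns, r)) for r in rows]

    def close(self) -> None:
        if self._conn is not None:
            self._conn.close()
```

A single abstract contract like this is what lets on-prem and cloud sources plug into the same ingestion pipeline without case-by-case glue code.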
Key Responsibilities
- Microservices
- SDK development
- Connector framework
- Data optimization
- Security patterns
- Data quality
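For illustration only, here is a minimal sketch of the kind of rule-based check a data quality framework (one of the responsibility areas above) might run before publishing a dataset; all function and column names are invented.

```python
from typing import Any, Callable, Dict, List


def check_not_null(rows: List[Dict[str, Any]], column: str) -> List[str]:
    """Return one error message per row where `column` is missing or None."""
    return [
        f"row {i}: {column} is null"
        for i, row in enumerate(rows)
        if row.get(column) is None
    ]


def run_checks(
    rows: List[Dict[str, Any]],
    checks: List[Callable[[List[Dict[str, Any]]], List[str]]],
) -> bool:
    """Run every check; report failures and return overall pass/fail."""
    failures = [msg for check in checks for msg in check(rows)]
    for msg in failures:
        print("DQ FAIL:", msg)
    return not failures


if __name__ == "__main__":
    sample = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": None}]
    ok = run_checks(sample, [lambda r: check_not_null(r, "amount")])
    print("dataset accepted" if ok else "dataset rejected")
```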
What You Bring
We are seeking a dynamic and highly skilled Cloud Architect with extensive experience building enterprise-scale data platforms. The ideal candidate combines deep knowledge of the data engineering landscape with a hands-on coding approach and will help shape the future of our data ecosystem. Candidates must have 15+ years of modern cloud data engineering experience, including proven success in architecting green-field, enterprise-scale platforms, along with strong software engineering skills and a solid background in GCP data services.
Required technical expertise:
- Proficiency with BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, and Airflow.
- Experience with streaming technologies (Kafka, Pub/Sub), front-end development (JavaScript/TypeScript), and microservices (Kubernetes, Docker).
- Experience with metadata management, data catalogues, data lineage, data quality, DataOps, and observability tools such as Grafana and Datadog.
Preferred qualifications:
- Experience with Data Mesh architectures, building semantic layers, and developing scalable IoT solutions.
Requirements
- Cloud architect
- 15+ years
- GCP
- BigQuery
- Python
- Kubernetes
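To ground the stack above (GCP, BigQuery, Python, Airflow), here is a hedged sketch of a minimal Airflow DAG that loads daily landed files into BigQuery, the kind of pipeline this role would architect. The project, bucket, and table names are placeholders; the operator comes from the official Google provider package.

```python
# Hedged sketch of a daily load pipeline; all resource names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="example_events_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ style; older versions use schedule_interval
    catchup=False,
) as dag:
    # Load the day's newline-delimited JSON files from a landing bucket
    # into a BigQuery table, appending to the existing data.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",            # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],  # templated daily prefix
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.analytics.events",
        write_disposition="WRITE_APPEND",
    )
```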
Work Environment
Hybrid