Collaborate with stakeholders to gather requirements and design end-to-end data solutions.
Design and implement modern data architectures, including medallion architecture, star and snowflake schemas, and handling of slowly changing dimensions (SCD Types 1 and 2).
Act as a Subject Matter Expert (SME) in Google BigQuery and related GCP services.
Mentor and guide junior team members to enhance their technical skills and knowledge.
Monitor and optimize data pipelines for performance, cost efficiency, and scalability.
Develop and implement data integration pipelines for batch and streaming data using tools such as Apache Beam (Google Cloud Dataflow), Google Cloud Functions, and Google Cloud Workflows.
Build and optimize semantic models and layers in tools such as LookML (Looker) or Power BI to enable data visualization with KPIs and metrics.
Ensure data quality, security, and governance by implementing best practices in IAM, encryption, tokenization, and native BigQuery security features such as row-level and column-level security.
Stay updated with emerging technologies and evaluate their potential to improve data engineering practices.
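To make the slowly-changing-dimension work mentioned above concrete: Type 1 changes overwrite in place, while Type 2 changes preserve history by closing the current row and appending a new version. The sketch below shows Type 2 logic in plain Python; the schema and field names are illustrative only, and in BigQuery this would typically be expressed as a MERGE statement rather than application code.

```python
from datetime import date

def apply_scd2(dimension, update, change_date):
    """Apply a Type 2 change: close the current row, append a new version.

    `dimension` is a list of dicts with 'key', 'city', 'valid_from',
    'valid_to', and 'is_current' fields (an illustrative schema).
    """
    for row in dimension:
        if row["key"] == update["key"] and row["is_current"]:
            if row["city"] == update["city"]:
                return dimension        # attribute unchanged; nothing to version
            row["valid_to"] = change_date   # close out the old version
            row["is_current"] = False
    dimension.append({
        "key": update["key"],
        "city": update["city"],
        "valid_from": change_date,
        "valid_to": None,               # open-ended current row
        "is_current": True,
    })
    return dimension

dim = [{"key": 1, "city": "Boston", "valid_from": date(2020, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, {"key": 1, "city": "Denver"}, date(2024, 6, 1))
# The dimension now holds two rows: the closed-out Boston version and
# a current Denver version, so historical queries still see Boston.
```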
Required Skills and Experience:
BigQuery, Google Cloud, SQL development, data pipelines, data security, data warehousing
Proven experience in mentoring and leading teams.
Additional Information:
Proven ability to troubleshoot and optimize application performance.
Excellent problem-solving skills and ability to provide innovative solutions.
Familiarity with tools such as Cloud Composer, Dataplex (including its AutoDQ data-quality capability), Pub/Sub, Cloud Storage, and Looker Studio.
Strong expertise in SQL development, data modeling, and optimization techniques.
Experience in building and managing data pipelines for batch and streaming data.
Strong understanding of data security practices, including IAM, encryption, and tokenization.
Minimum 3-5 years of hands-on experience with Google BigQuery and GCP services.
In-depth understanding of data warehousing concepts, medallion architecture, and best practices.
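As one illustration of the tokenization practice listed above, a common approach is keyed pseudonymization: replacing raw identifiers with deterministic, non-reversible tokens so that joins and aggregations still work on the tokenized column. This sketch uses Python's standard hmac and hashlib modules; the hard-coded key is for illustration only, as production keys belong in a key management service, not in code.

```python
import hashlib
import hmac

def tokenize(value: str, key: bytes) -> str:
    """Replace a sensitive value with a deterministic, non-reversible token.

    The same input and key always produce the same token, so tokenized
    columns remain joinable without exposing the raw identifier.
    """
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()

key = b"demo-key"  # illustrative only; use a KMS-managed key in practice
t1 = tokenize("alice@example.com", key)
t2 = tokenize("alice@example.com", key)
t3 = tokenize("bob@example.com", key)
# t1 == t2 (deterministic), while t3 differs, and none reveal the email.
```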
We are looking for candidates who are passionate about data engineering and have a proven track record of delivering high-quality, scalable data solutions.