Description
As a Data Engineer, you will play a key role in designing, developing, and maintaining our data infrastructure and pipelines. You will collaborate closely with the rest of our Data and Analytics Engineering team and with engineering and operations teams to ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be essential in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support our company's growth.
Eliminating carbon footprints, eliminating carbon copies.
- Monitor data pipelines and proactively address any issues or bottlenecks to ensure uninterrupted data flow.
- Design, build, and maintain tooling that enables users and services to interact with our data platform, including CI/CD pipelines for our data lakehouse, unit/integration/validation testing frameworks for our data pipelines (see the sketch after this list), and command-line tools for ad-hoc data evaluation.
- Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt.
- Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy.
- Collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions.
- Optimize and tune data pipelines for improved performance, scalability, and reliability.
- Develop and maintain documentation for data pipelines, ensuring knowledge sharing and smooth onboarding of new team members.
- Keep up to date with emerging technologies and trends in data engineering and recommend their adoption as appropriate.
- Implement data governance and security measures to ensure compliance with industry standards and regulations.
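To give a flavor of the testing frameworks mentioned above, here is a minimal sketch of a pipeline validation test using pytest and pandas. The `clean_meter_readings` transform and its schema are hypothetical illustrations, not our actual code.

```python
# Sketch of a validation test for a pipeline transform.
# The transform and column names are hypothetical examples.
import pandas as pd

def clean_meter_readings(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate readings and rows with non-positive kWh values."""
    df = df.drop_duplicates(subset=["meter_id", "read_at"])
    return df[df["kwh"] > 0].reset_index(drop=True)

def test_clean_meter_readings_drops_bad_rows():
    raw = pd.DataFrame({
        "meter_id": ["a", "a", "b"],
        "read_at": ["2024-01-01", "2024-01-01", "2024-01-01"],
        "kwh": [1.5, 1.5, -2.0],  # one duplicate row, one negative reading
    })
    cleaned = clean_meter_readings(raw)
    assert len(cleaned) == 1
    assert (cleaned["kwh"] > 0).all()
```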
Requirements
Here at Perch, we cultivate diversity, celebrate individuality, and believe unique perspectives are key to our collective success in creating a clean energy future. Perch is committed to equal employment opportunities regardless of race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, protected veteran status, or any status protected by applicable federal, state, or local law. While we are currently unable to consider candidates who will require visa sponsorship, we welcome applications from all qualified candidates eligible to work in India.
Our core data stack makes heavy use of Snowflake and dbt Core, orchestrated in Prefect and Argo in our broader AWS-based ecosystem. Most of our wide range of data sources are loaded with Fivetran or Segment, but we use custom Python when it’s the right tool for the job.
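As an illustration of how these pieces fit together, here is a minimal sketch of a Prefect flow shelling out to dbt Core, assuming Prefect 2.x and dbt installed on the PATH; the flow name and the "marts" selector are hypothetical, not part of our actual project.

```python
# Minimal Prefect 2.x flow that runs and tests dbt models.
# The flow name and "marts" selector are illustrative only.
import subprocess

from prefect import flow, task

@task(retries=2, retry_delay_seconds=60)
def run_dbt(args: str) -> None:
    # Fail the task (and trigger retries) if dbt exits non-zero.
    subprocess.run(["dbt", *args.split()], check=True)

@flow(name="nightly-transform")
def nightly_transform() -> None:
    run_dbt("run --select marts")   # build the models
    run_dbt("test --select marts")  # then validate them

if __name__ == "__main__":
    nightly_transform()
```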
- 3+ years of experience as a Data Engineer, data-adjacent Software Engineer, or do-it-all member of a small data team, with a focus on building and maintaining data pipelines.
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform.
- Experience with our stack: AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions, along with some ancillary tools.
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts.
- Experience in the energy sector.
- Previous experience managing enterprise-level data pipelines and working with large datasets.
- Excellent problem-solving and analytical skills with a strong attention to detail.
- Undergraduate and/or graduate degree in math, statistics, engineering, computer science, or a related technical field.
- Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business.
- Proficiency in SQL.
- Strong Python skills, especially in the context of data orchestration.
- Strong communication skills.
- Experience with DevOps practices, especially CI/CD.
- Experience with Argo, Prefect, Airflow, or similar data orchestration tools.
Benefits
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Apart from the fixed base salary, candidates are eligible for the following benefits:
- Competitive compensation based on market standards.
- Flexible leave policy.
- L&D programs to foster professional growth.
- Hybrid working model with a remote-first policy.
- Annual performance cycle.
- Comprehensive coverage, including accident policy and life insurance.
- Quarterly team engagement activities and rewards & recognition.
- Medical insurance (1+5 family members).