Description
This role offers end-to-end ownership of the data pipeline. As a key member of a small, high-growth team, you will do more than just manage data; you will be instrumental in building our data function. This includes establishing best practices, creating automated pipelines, and shaping the foundational framework for how we leverage data to succeed.
- You'll create SQL data models with dbt to power dashboards and applications
- You'll perform in-depth analysis and data transformations with SQL, Python, and Jupyter Notebooks
- You'll prototype internal data applications and tools (Streamlit, Jupyter)
- You'll ensure data quality and reliability throughout the lifecycle
- You'll integrate third-party APIs and databases into our data flows
- You'll use Airflow for job scheduling and tracking
- You'll collaborate with product, technology, and strategy teams to deliver high-impact insights and tools
- You'll develop and maintain data pipelines and automated processes in Airflow and Python
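To give a flavour of the data-quality work described above, here is a minimal sketch of a validation step such a pipeline might include. The field names (`charge_point_id`, `kwh`) and the rules are illustrative assumptions, not part of the role description:

```python
# Illustrative data-quality check for incoming telemetry records.
# Field names ("charge_point_id", "kwh") and rules are hypothetical examples.

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    if not record.get("charge_point_id"):
        issues.append("missing charge_point_id")
    kwh = record.get("kwh")
    if not isinstance(kwh, (int, float)) or kwh < 0:
        issues.append("kwh must be a non-negative number")
    return issues

def validate_batch(records: list[dict]) -> dict:
    """Summarise issues across a batch, as one step in a pipeline might."""
    report = {"checked": len(records), "failed": 0, "issues": []}
    for i, rec in enumerate(records):
        problems = validate_record(rec)
        if problems:
            report["failed"] += 1
            report["issues"].append((i, problems))
    return report

if __name__ == "__main__":
    batch = [
        {"charge_point_id": "cp-001", "kwh": 12.5},
        {"charge_point_id": "", "kwh": -3},
    ]
    print(validate_batch(batch))
```

In practice a check like this would run as a task before dashboards or models consume the data, so bad records are caught at ingestion rather than in a report.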
Requirements
We are seeking a passionate data professional to drive the electrification of transport by transforming complex information into strategic assets. You will work with a rich variety of datasets, including charge point telemetry, user behaviour, marketing campaigns, sales figures, and operational metrics. Your mission will be to uncover actionable insights that enhance efficiency and guide decision-making across the entire business.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
What you'll bring:
- Data Analysis Skills: Proficiency in SQL, Python, pandas/NumPy, and Databricks/Jupyter Notebooks, with experience handling large datasets in production environments
- Data Modelling Experience: Proficiency in data modelling, ideally with dbt
- Data Lifecycle Expertise: Versatility across the entire data lifecycle, from ingestion to visualisation and presentation
- Dashboard and Data Product Development: Experience creating data dashboards and developing data products
- Autonomous Problem-Solving: Ability to work independently, scope problems, and deliver pragmatic solutions
- Collaborative Projects: Experience working with business teams on collaborative projects, and familiarity with agile or similar methodologies
- Time Management: Ability to manage multiple projects simultaneously in a fast-paced environment
- Growth Mindset: Learns fast and is enthusiastic about new technologies
- Passion for Net Zero: You don't need to be deeply familiar with the EV market and products (we can teach that), but a passion for the transition to net zero is an excellent start

Our tech stack:
- Python as our main programming language
- dbt for data modelling
- Jupyter and JupyterHub for notebook analytics and collaboration
- Streamlit for data applications
- Kubernetes for data services and task orchestration
- Parquet and Delta file formats on S3 for data lake storage
- Google Analytics, Amplitude, and Firebase for client application event processing
- CircleCI for continuous deployment
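As one small illustration of the API-integration and transformation work mentioned above, here is a hedged sketch that flattens a nested JSON payload from a hypothetical third-party API into flat rows ready for loading. The payload shape and every key name are assumptions for the example:

```python
import json

# Illustrative sketch: flatten a nested API payload into flat, table-like rows.
# The payload structure ("sessions", "meta") is a hypothetical example.

def flatten_sessions(payload: str) -> list[dict]:
    """Turn a nested JSON API response into flat rows ready for loading."""
    data = json.loads(payload)
    rows = []
    for session in data.get("sessions", []):
        meta = session.get("meta", {})
        rows.append({
            "session_id": session.get("id"),
            "kwh": session.get("kwh", 0.0),
            "city": meta.get("city"),
        })
    return rows

if __name__ == "__main__":
    sample = json.dumps({
        "sessions": [
            {"id": "s1", "kwh": 7.2, "meta": {"city": "Leeds"}},
            {"id": "s2", "meta": {}},
        ]
    })
    for row in flatten_sessions(sample):
        print(row)
```

A step like this typically sits between the API call and the warehouse load, producing rows that a dbt model can then pick up.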
Benefits
We’re making electric vehicle charging as smart and as simple as possible, by building the giant, virtual charging platform of the future.
Are you ready for a career with us? We want to ensure you have all the tools and environment you need to unleash your potential. If you require specific accommodations or have a unique preference, let us know and we'll do what we can to customise your interview process for comfort and maximum magic!
- Visit our perks hub: Octopus Employee Benefits