What you would be doing
This role offers end-to-end ownership of the data pipeline. As a key member of a small, high-growth team, you will do more than just manage data; you will be instrumental in building our data function. This includes establishing best practices, creating automated pipelines, and shaping the foundational framework for how we leverage data to succeed.
- You'll ensure data quality and reliability throughout the lifecycle
- You'll create SQL data models with dbt to power dashboards and applications
- You'll collaborate with product, technology, and strategy teams to deliver high-impact insights and tools
- You'll perform in-depth analysis and data transformations with SQL, Python, and Jupyter Notebooks
- You'll integrate third-party APIs and databases into our data flows
- You'll prototype internal data applications and tools (Streamlit, Jupyter)
- You'll develop and maintain data pipelines and automated processes in Airflow and Python (see the sketch after this list)
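For a flavour of the day-to-day, here is a minimal sketch of the kind of Airflow-plus-dbt pipeline this role involves. It is illustrative only: the DAG, the API endpoint, and the dbt selector are hypothetical placeholders, not our actual codebase.

```python
# Illustrative Airflow DAG: ingest from a third-party API, then rebuild dbt models.
# All names (DAG id, endpoint, dbt selector) are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def telemetry_pipeline():
    @task
    def ingest_telemetry() -> int:
        """Pull the latest charge point telemetry from a (hypothetical) partner API."""
        import requests

        resp = requests.get("https://api.example.com/telemetry", timeout=30)
        resp.raise_for_status()
        records = resp.json()
        # In a real pipeline the records would be written to the lake
        # (e.g. Parquet on S3) before anything downstream runs.
        return len(records)

    # Rebuild the downstream SQL models once fresh data has landed.
    run_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --select telemetry",
    )

    ingest_telemetry() >> run_models


telemetry_pipeline()
```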
What you bring
We are seeking a passionate data professional to drive the electrification of transport by transforming complex information into strategic assets. You will work with a rich variety of datasets, including charge point telemetry, user behaviour, marketing campaigns, sales figures, and operational metrics. Your mission will be to uncover actionable insights that enhance efficiency and guide decision-making across the entire business.
We would prefer someone who can work from our London office once or twice a week. You will also need full working rights for the UK, as we are unable to sponsor visas.
- Data Analysis Skills: Proficiency in SQL, Python, pandas/NumPy, and Databricks/Jupyter Notebooks, with experience handling large datasets in production environments
- Data Modelling Experience: Proficiency in data modelling, ideally with dbt
- Data Lifecycle Expertise: Versatility across the entire data lifecycle, from ingestion to visualisation and presentation
- Dashboard and Data Product Development: Experience creating data dashboards and developing data products
- Collaborative Projects: Experience working on collaborative projects with business teams, and familiarity with agile or similar methodologies
- Autonomous Problem-Solving: Ability to work independently, scope problems, and deliver pragmatic solutions
- Time Management: Ability to manage multiple projects simultaneously in a fast-paced environment
- Growth Mindset: Learns fast and is enthusiastic about learning new technologies
- Passion for Net Zero: You don’t need to be deeply familiar with the EV market and products (we can teach that), but a passion for the transition to net zero is an excellent start
Our data stack
- Python as our main programming language
- dbt for data modelling
- Airflow for job scheduling and tracking
- Jupyter and JupyterHub for notebook analytics and collaboration
- Streamlit for data applications (see the sketch after this list)
- Parquet and Delta file formats on S3 for data lake storage
- Kubernetes for data services and task orchestration
- CircleCI for continuous deployment
- Google Analytics, Amplitude and Firebase for client application event processing
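To show what "Streamlit for data applications" can look like in practice, here is a self-contained sketch of an internal dashboard; the metric, column names, and numbers are invented for illustration, and a real app would read from a warehouse table rather than a hard-coded frame.

```python
# Hypothetical Streamlit prototype: a tiny internal dashboard over one metric.
import pandas as pd
import streamlit as st

st.title("Charge point utilisation (demo)")

# Stand-in data so the sketch runs on its own; in practice this would
# come from a dbt-built table in the warehouse.
df = pd.DataFrame(
    {
        "day": pd.date_range("2024-01-01", periods=7, freq="D"),
        "sessions": [120, 135, 128, 150, 162, 90, 75],
    }
)

min_sessions = st.slider("Minimum sessions", 0, 200, 100)
st.line_chart(df.set_index("day")["sessions"])
st.dataframe(df[df["sessions"] >= min_sessions])
```

Run it with `streamlit run app.py` to get the interactive page.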
Benefits
Are you ready for a career with us? We want to ensure you have all the tools and environment you need to unleash your potential. Need any specific accommodations, or have a unique preference? Let us know, and we'll do what we can to customise your interview process for comfort and maximum magic!
We’re making electric vehicle charging as smart and as simple as possible by building the giant, virtual charging platform of the future.
- Visit our perks hub - Octopus Employee Benefits