Data Engineer - Trading
Shell
Global energy giant exploring, producing, refining and marketing oil, gas, petrochemicals and low‑carbon solutions.
Design & build data foundations, pipelines & analytics to drive data‑driven trading insights.
Communicate effectively with both technical developers and business managers, earning the respect and trust of leaders and staff
Drive implementation efficiency and effectiveness across the pilots and future projects to minimize cost, increase speed of implementation and maximize value delivery
Ensure traceability of requirements from data through testing and scope changes to training and transition
Continuously grow the transferable skills you need to get ahead
Coordinate the change management, incident management and problem management processes
Actively deliver the roll-out and embedding of Data Foundation initiatives in support of key business programmes, advising on technology and using leading market-standard tools
What you bring
Python, Azure, SQL, Kafka, Agile, Degree
Advanced Quantitative & Commodity Modeling Skills: Proficient in statistical analysis and modeling across energy commodities and meteorology
Very good understanding of Data Foundation initiatives such as data modelling, data quality management, data governance, data maturity assessments and data strategy, in support of key business stakeholders
Proven technology champion in working with relational and data warehouse databases and query authoring (SQL), with working familiarity across a variety of databases. Experience with NoSQL databases and the ability to create end-to-end (E2E) pipelines
Significant experience in the IT industry
Experienced in building and optimizing complex queries. Good with manipulating, processing and extracting value from large, disconnected datasets
Experience working with Agile and Kanban methodologies; able to run a sprint
Proven champion with in-depth knowledge of at least one of the following scripting languages: Python, SQL, Spark SQL/PySpark
Technical Proficiency: Solid Python skills (NumPy, Pandas, Django) and Microsoft Azure certification
Communication skills to engage technical developers, architects and business stakeholders
Proven experience working with at least one data engineering technology such as ADLS, ADF, Azure Databricks, Azure SQL, Synapse, SAP HANA or AWS. Experience handling big data sets and big data technologies will be an asset
University degree in any IT discipline
Big Data & Workflow Tools: Skilled in tools such as Kafka, Hadoop, Spark, and workflow managers like Airflow, Azkaban, and Luigi
Energy Trading & Risk Expertise: Strong knowledge in short-term gas/power trading (including CCGT, wind, solar, battery) and risk modeling/management
DevOps & CI/CD Experience: Hands-on experience in building and enhancing CI/CD pipelines using scripting languages like YAML, PowerShell, and Terraform
Benefits
Benefit from flexible working hours and the possibility of remote/mobile working
Take advantage of paid parental leave, including for non-birthing parents
Gain access to a wide range of training and development programmes
Grow as you progress through diverse career opportunities in national and international teams
Perform at your best with a competitive starting salary and an annual performance-related salary increase; our pay and benefits packages are considered among the best in the world