London / Reading – Hybrid
£60,000 – £70,000
Azure Data / SQL / NHS

I'm currently working with a growing data and technology consultancy who are looking to add a Data Engineer to their expanding delivery team. The company work with organisations across several industries, including telecommunications, financial services, insurance and healthcare, helping them design and build modern data platforms and analytics capabilities.

As part of a client-facing consulting team, you will work on a variety of data engineering projects, many of them healthcare based, where you will help build, transform and structure large datasets so they can be used for reporting, analytics and operational systems. The role will involve working closely with internal teams and client stakeholders to design scalable data pipelines and ensure reliable data delivery across modern cloud platforms. The company has two sites, in London and Reading, and would be looking for two days a week in the office.

Key Responsibilities
• Develop and maintain data pipelines using Azure Data Factory (ADF)
• Build and manage ETL processes using SQL and Python
• Work with Azure Databricks and Azure SQL / SQL Server to process and store large datasets
• Design and implement data models using Kimball, 3NF or dimensional modelling techniques
• Build metadata-driven pipelines to automate data processing
• Collaborate with cross-functional teams to understand client data requirements and deliver appropriate solutions
• Ensure data quality, integrity and security throughout the pipeline lifecycle

Experience Required
• Strong experience working within Azure data environments
• Hands-on experience with Azure Data Factory, Azure Databricks and Azure SQL / SQL Server
• Proficiency with SQL and Python
• Previous experience working in healthcare or NHS environments
• Understanding of modern data architectures such as the medallion architecture
• Experience working with data formats such as JSON, CSV and Parquet
• Understanding of cloud security, IAM and networking concepts
• Familiarity with Agile delivery environments
• Experience with CI/CD pipelines (Azure DevOps or similar)

Nice to Have
• Exposure to Apache Airflow or dbt
• Experience with BigQuery (GCP)
• Azure or cloud platform certifications
• Experience with data governance frameworks

Salary: £60,000 – £70,000
Benefits: 25 Days Holiday, Bonus, Share Scheme, Health Insurance, Pension, Life Assurance

If this role sounds of interest, please apply and I can give you a call.

Tim Stock
(phone number removed) | (phone number removed)
(url removed)
(url removed)
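For candidates less familiar with the "metadata-driven pipelines" mentioned in the responsibilities above, the idea can be sketched in a few lines of Python: the pipeline logic is generic, and a metadata record (in practice a control table in ADF or a database, not code) tells it what to process. The dataset names, columns and rules below are invented purely for illustration.

```python
import csv
import io

# Hypothetical metadata describing each dataset the pipeline should handle.
# In a real Azure Data Factory setup this would live in a control table.
PIPELINE_METADATA = [
    {"name": "patients", "source_format": "csv", "key_column": "patient_id"},
]

def run_dataset(meta, raw_text):
    """Apply the generic load step described by one metadata entry."""
    if meta["source_format"] == "csv":
        rows = list(csv.DictReader(io.StringIO(raw_text)))
    else:
        raise ValueError(f"unsupported format: {meta['source_format']}")
    # Basic data-quality gate: drop rows missing the business key.
    return [r for r in rows if r.get(meta["key_column"])]

# Minimal demo: two of the three sample rows keep their business key.
sample = "patient_id,age\n123,40\n,31\n456,55\n"
clean_rows = run_dataset(PIPELINE_METADATA[0], sample)
```

Adding a new dataset then means adding a metadata entry rather than writing a new pipeline, which is the main selling point of the pattern.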
Founding Engineer (EnergyTech / AI)
London Hybrid (3 days onsite) | £70k - £120k + Equity
A venture-backed EnergyTech start-up is building a new type of power company designed for the electrified future.
As renewable energy adoption accelerates, the challenge is no longer just generating power, but storing, managing, and intelligently dispatching it. This team is developing software that sits at the centre of that transition, enabling a new generation of energy suppliers built around flexibility, storage, and intelligent automation.
Backed by leading investors and founded by operators with experience from some of the most respected names in the European energy ecosystem, the business has already launched its first product and is growing rapidly month-on-month. The next phase is building the core technology platform that will power a fully integrated energy supplier launching later this year.
This is an opportunity to join a small, highly capable engineering team building core infrastructure for a next-generation energy platform.
The Role
You will work across the full product stack, helping design and build the systems that power both the customer experience and the operational backbone of the platform.
Engineers here take ownership of problems end-to-end, from shaping early ideas through to delivering production features used by customers.
Responsibilities include:
Tech Stack
What They’re Looking For
Engineers who thrive here tend to enjoy working close to the problem, communicating clearly across technical and non-technical teams, and taking initiative rather than waiting for direction.
Package
Our client, a global technology leader, is currently looking for a Senior/Lead Data Engineer to work with the development team and guide software development for an exciting new AI product.
Key Responsibilities:
Core Experience:
Frameworks/Infrastructure:
Ventilation Engineer (MVHR & Ventilation Systems Specialist)
Location: South UK
Salary: £38,000 - £42,000 per annum (DOE)
Company: RGE Services
RGE Services is a growing and reputable engineering services company specialising in ventilation and air management solutions across residential and commercial properties. Due to continued expansion, we are looking for an experienced and motivated Ventilation Engineer to join our team.
The Role
As a Ventilation Engineer, you will be responsible for the installation, maintenance, relocation, and replacement of a wide range of ventilation systems. This is a field-based role requiring high standards, strong fault-finding skills, and the ability to work independently.
Key Responsibilities
Requirements
What We Offer
We are Data Services, our mission is to unlock the value of data by delivering high-quality, reliable, and secure data services that are accessible, understandable, and actionable. We continuously evolve our offerings, leveraging modern cloud-based technologies, and fostering strong partnerships to help our colleagues in the Bank navigate the complexities of a data-driven world and achieve their strategic objectives.
Active SC Clearance
Job Description:
The world of data in central banking is evolving rapidly. With the rise of detailed data collection in financial regulation and the swift advancements in cloud-native data technologies, the demand for visionary data engineers is growing. We're seeking a senior Data Engineer to join our Data Engineering team and play a pivotal role in shaping the Bank's strategic cloud-first data platform.
As a senior member of the team, you will play a key role in designing and delivering robust, scalable data solutions that support the Bank's core responsibilities around monetary policy, financial stability, and regulatory supervision. You'll contribute to technical design decisions, mentor engineers, and collaborate across teams to ensure our data infrastructure continues to evolve and meet future demands.
Role Responsibilities
Machine Learning Engineer (Python / MLE )
6 Month Contract
£650 - £750
Remote
Umbrella
Urgent Start
We are looking for a number of Machine Learning Engineers for a critical 6-month contract with a household name. This role is essential for maintaining the stability and performance of their core data and Machine Learning systems.
If your background involves diving into complex production codebases and you possess excellent problem-solving skills, this is the perfect opportunity. We need candidates ready to start by mid-to-late November.
This is a deeply technical engineering role focused less on new feature development and more on reliability and fixes.
Key Skills:
Investigating and debugging complex data flow and Machine Learning issues within a live, high impact production environment.
Extensive Python, NumPy and Pandas experience is required for this role.
You must demonstrate a deep commercial background in the following areas:
Extensive Python: Very strong, production-level Python coding and debugging skills.
Production Environment: Proven experience working directly with and troubleshooting issues in live production codebases (not just isolated development).
Cloud Experience: Solid experience with any major public cloud provider (GCP, AWS, or Azure).
Experience with BigQuery would be good.
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting.
Direct experience with Google Cloud Platform, BigQuery, and associated tooling.
Experience with workflow tools like Airflow or Kubeflow.
Familiarity with dbt (Data Build Tool).
Please send your CV for more information on these roles.
Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.
If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.
Machine Learning/Data Engineer
£700-750/day overall assignment rate to umbrella
Fully remote
3-6 month initial
Apply today to join a forward-thinking, tech-driven FTSE 100 organisation using data science and AI to enhance customer experience, optimise supply chains and drive sustainable growth. With 40% of sales from sustainable products, this is a company that combines scale, innovation and purpose.
As a Machine Learning Engineer, you’ll help maintain the stability and performance of core data and ML systems across Europe. This technical engineering role focuses on reliability, optimisation and critical fixes, ideal if you excel at investigating and debugging complex data flows and ML issues in live production environments.
We’re looking for individuals with:
Experience: Proven background as a Machine Learning Engineer.
Technical Skills: Strong in SQL and Python (Pandas, Scikit-learn, Jupyter, Matplotlib).
Data Transformation & Manipulation: experience with Airflow, dbt and Kubeflow.
Cloud: Experience with GCP and Vertex AI (developing ML services).
Expertise: Solid understanding of computer science fundamentals and time-series forecasting.
Machine Learning: Strong grasp of ML and deep learning algorithms (e.g. Logistic Regression, Random Forest, XGBoost, BERT, LSTM, NLP, Transfer Learning).
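Alongside the deep-learning models listed above, roles like this usually expect candidates to reason about simple forecasting baselines before reaching for XGBoost or an LSTM. As an illustrative sketch only (not part of the client's stack), two standard time-series baselines look like this in plain Python:

```python
def moving_average_forecast(series, window=3):
    """Forecast the next point as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("series is shorter than the window")
    return sum(series[-window:]) / window

def seasonal_naive_forecast(series, season_length):
    """Forecast the next point as the value observed one full season earlier."""
    if len(series) < season_length:
        raise ValueError("series is shorter than one season")
    return series[-season_length]
```

A model that cannot beat baselines like these on a held-out period is usually not worth deploying, which is why interviewers for forecasting roles tend to probe this fundamental.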
Your new company
This is a financial institution with an office based in the City of London.
Your new role
You will be migrating data from on-prem systems onto Snowflake for a greenfield project.
What you’ll need to succeed
Strong Python and SQL are crucial to this role
On-prem to Snowflake migration experience
Airflow and DBT experience
GitHub experience as you will be building CI/CD pipelines as part of this role
Any experience within post-trade, OTC, swaps will be extremely beneficial
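To give a flavour of the migration work described above: loading staged on-prem extracts into Snowflake typically comes down to generating one `COPY INTO` statement per table. The sketch below builds those statements as strings; the table names, stage path and file-format name are hypothetical, and a real migration would also set options such as `ON_ERROR`.

```python
def copy_into_sql(table, stage_path, file_format="migration_csv"):
    """Render a Snowflake COPY INTO statement for one staged dataset.

    All object names here are placeholder examples, not a client schema.
    """
    return (
        f"COPY INTO {table}\n"
        f"FROM @{stage_path}\n"
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')"
    )

def migration_plan(tables, stage_root="onprem_export"):
    """One COPY INTO statement per table in an on-prem-to-Snowflake migration."""
    return [copy_into_sql(t, f"{stage_root}/{t}") for t in tables]

plan = migration_plan(["trades", "positions"])
```

In practice these statements would be templated and scheduled through Airflow and versioned in GitHub, which is where the CI/CD requirement in this role comes in.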
What you’ll get in return
An exciting opportunity to join an international organisation in financial services. Furthermore, a competitive day rate for this role will be offered in addition to your own dedicated Hays Consultant to guide you through every step of the application process.
What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)
28-35 hrs per week (open to discussing flexible working of these hours)
Remote, with some attendance at our London office in Shoreditch

The ASA is the UK's regulator of advertising across all media, including online. Our work includes taking proactive action against misleading, harmful, offensive or otherwise irresponsible ads and acting on complaints. In short, we make sure ads are legal, decent, honest and truthful.

In this role you will join our Data Science team and work on our world-leading Active Ad Monitoring system, which uses AI to proactively monitor online advertising. In 2025 the system captured and processed 60 million ads across social media, search and programmatic display. The ASA uses this intelligence to help regulate ads across high-priority topics like injectable weight-loss medications, green claims companies make to consumers, disclosure of influencer marketing and many more.

You will help develop and maintain the tools we use to capture, process, and apply AI models to large datasets of ads within the Active Ad Monitoring system. We're looking for someone who wants to use their skills and expertise to help shape a safer advertising landscape. Our team mission is to protect UK consumers from adverts that are misleading, cause harm and target those within our society who are the most vulnerable.

Working as part of our small agile team, you will have the opportunity to own your work end-to-end, seeing directly how the code you write helps protect UK consumers. You will work in a cloud-based environment, primarily in Python, and with a range of industry-standard tools such as Snowflake, Docker and Airflow. You will work primarily with unstructured data, namely ads in a variety of formats including images, videos and text from a range of online channels.
About you
• You may not have been a Data Engineer before, but you will have the ability to work with data in Python to a professional standard and deliver high-quality code that works reliably in a production setting.
• You'll be working with people from both technical and non-technical backgrounds, so you'll need to be adept at translating complex technical language for non-technical people.
• You'll be impact focused, understanding the problems the ASA faces and prioritising technical solutions that will deliver real impact.
• You will need to be curious and ambitious, creatively solving problems that arise whilst always keeping an eye on system and process improvements.
• You'll enjoy working with others from different technical disciplines, each using your unique expertise to further the work, whilst also developing your own technical knowledge and skills.

We are committed to building a workforce that reflects the full diversity of the UK population. We believe that varied perspectives and experiences strengthen our organisation and help us deliver our work more effectively. We welcome applications from people of all backgrounds and identities, and we actively encourage candidates from minority or underrepresented groups to apply. Women are currently under-represented within data engineering roles, and within our Data Science team. In line with our commitment to equality, diversity and inclusion, we particularly encourage applications from women and others who are under-represented in this area. Our recruitment process ensures applications are absent of names or any identifiable information, which supports our aim of finding the best person for the role based on their skills and experience only.
How to apply: If you're interested in applying for this role, please review the job description below and complete our online application process, which includes answering some online questions regarding your motivation for applying and your skills and experience.

Closing date: 16th March 2026. Please note we will be reviewing applications as they come in, and we reserve the right to close the advert early if we receive a significantly high number of applicants.

Please feel free to use AI to enhance your application, but not to write it for you. We're interested to know your thoughts, experiences and ideas. You'll need to stand behind what you've told us in your application if you attend an interview, so please make sure the person we've met on paper is the person we meet in the room.
London M25 Patch
£50,000 - £60,000 + Progression + Training + Company Benefits + No Overtime + Local Patch
Are you from a Heat Pumps, HVAC or MVHR background and want to join a company that has seen huge success in Europe, rivalling the likes of Daikin and Toshiba?
Do you want to work for an industry-leading business with a state-of-the-art new suite of heat pumps, one that provides full training from industry experts?
This company has gone from strength to strength over the last decade, with products that are best in class and rival the usual go-to names. They have taken a huge amount of market share across Europe and are now finding the same success in the UK.
In this role you will be working the M25 patch with other inspectors. Your jobs will be organised by location, meaning you won't be spending all day in traffic and will be able to return home at a reasonable hour.
This business is the UK supplier of ventilation, heat recovery and heat pump systems, providing warranty support, technical diagnostics, commissioning assistance and product expertise to installers, developers and end-users.
This position supports the company's role as the primary UK technical contact for these systems, offering advanced site-based troubleshooting and controller configuration when third-party engineers are unable to resolve issues.
THE ROLE:
THE PERSON:
Reference: BBBH23065
Keywords: AC, HVAC, Air Con, Heat Pumps, Heat Exchange, Airflow, Testing, Field, M25, London,
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV.
We are an equal opportunities employer and welcome applications from all suitable candidates. The salary advertised is a guideline for this position. The offered remuneration will depend on the extent of your experience, qualifications, and skillset.
Ernest Gordon Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job, you accept the T&C’s, Privacy Policy and Disclaimers which can be found at our website
Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices.
Salary up to £65k plus company benefits.
About Us
Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years’ experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers.
At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you’re valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you’ll be trusted, challenged, and empowered to grow.
We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you’re passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.
Glassdoor score of 4.7
96% of our staff would recommend Triad to a friend
100% CEO approval
See for yourself some of the work that makes us all so proud:
Helping law enforcement with secure intelligence systems that keep the UK safe
Supporting the UK’s national meteorological service in leveraging supercomputers for next-level weather forecasting
Assisting a UK government department responsible for consumer product safety with systems to track unsafe products
Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport
Role Summary
Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.
Key Responsibilities
Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.
Create and manage data models that support efficient storage, retrieval, and analysis of data.
Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda to architect and maintain cloud data solutions.
Maintain modular, Terraform-based IaC for reliable provisioning of AWS infrastructure.
Develop, optimise and maintain robust data pipelines using Apache Airflow.
Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.
Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.
Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.
Identify data quality issues and implement data validation and cleansing processes.
Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.
Work within a continuous integration environment with automated builds, deployments, and testing.
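The data validation and cleansing responsibility above often reduces to a small, declarative rules check applied before rows enter a pipeline. As a minimal sketch (the column names and rules are invented for illustration, not drawn from any Triad client system):

```python
def validate_rows(rows, required=(), numeric=()):
    """Split rows into (valid, rejected) based on simple declarative rules.

    `required` columns must be present and non-empty; `numeric` columns
    must parse as floats. Rejected rows are kept for later inspection
    rather than silently dropped.
    """
    valid, rejected = [], []
    for row in rows:
        ok = all(row.get(col) not in (None, "") for col in required)
        for col in numeric:
            try:
                float(row[col])
            except (KeyError, TypeError, ValueError):
                ok = False
        (valid if ok else rejected).append(row)
    return valid, rejected

# Example: one clean row, one row with a missing key and a bad amount.
good, bad = validate_rows(
    [{"id": "1", "amount": "3.50"}, {"id": "", "amount": "oops"}],
    required=("id",),
    numeric=("amount",),
)
```

Keeping rejects in a quarantine set like this (rather than discarding them) is what makes the "identify data quality issues" part of the role auditable.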
Skills and Experience
Qualifications & Certifications
Triad’s Commitment to You
As a growing and ambitious company, Triad prioritises your development and well-being:
What Our Colleagues Have to Say
Please see for yourself on Glassdoor and our “Day in the Life” videos at the bottom of our Careers Page.
Our Selection Process
After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:
We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.
Other Information
If this role is of interest to you or you would like further information, please contact Ryan Jordan and submit your application now.
Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.
Day rate: £500 - £550 Inside IR35
Location: London

Key Responsibilities
• Design, build, and maintain scalable data pipelines, ETL processes, and data integrations.
• Develop and optimize data models, storage solutions, and analytics environments.
• Partner with UX/UI designers to create user-friendly dashboards, data tools, and internal products.
• Implement visualizations that make complex datasets understandable for technical and non-technical users.
• Work with cross-functional teams to translate product requirements into technical designs.
• Ensure data quality, governance, and best practices across systems.
• Contribute to the evolution of our design systems and front-end components for data tools.

Required Skills & Experience
• Proven experience as a Data Engineer, BI Engineer, or similar role.
• Strong proficiency in SQL, Python, and modern data engineering frameworks (e.g., Airflow, dbt, Spark).
• Experience with cloud platforms such as AWS, Azure, or GCP.
• Solid understanding of data warehousing and ETL/ELT architecture.
• Demonstrable UX/UI skills: wireframing, prototyping, and designing clean, intuitive interfaces.
• Experience with front-end technologies (e.g., React, Vue, or similar) is a plus.
• Familiarity with visualization tools (e.g., Tableau, Power BI, or custom solutions).
Job Title: Snowflake Data Engineer
Location: London (2 days on-site per week)
Salary/Rate: £550 - £600 per day inside IR35
Start Date: March
Job Type: Initial 3-6 month contract
Company Introduction
We have an exciting opportunity now available with one of our sector-leading consultancy clients! They are currently looking for a skilled Snowflake Data Engineer to help on their cloud migration project.
Job Responsibilities/Objectives
You will be responsible for designing and building scalable data pipelines, Data Vault and dimensional models, and Snowflake/dbt workloads for cloud migration projects.
• Implement Data Vault 2.0 (Hubs, Links, Satellites) and dimensional models on Snowflake.
• Build ELT pipelines using Snowflake, dbt, and Python/PySpark.
• Develop ingestion from APIs, databases, and streams.
• Optimize Snowflake warehouses, cost, and performance.
• Collaborate with architects, analysts, and DevOps.
• Maintain documentation, lineage, and governance standards.
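For readers new to Data Vault 2.0: the Hubs mentioned above are tables of distinct business keys, each identified by a deterministic hash key so that loads are repeatable and parallelisable. A minimal Python sketch of the idea (the MD5-over-normalised-key convention shown here is common but varies by team, and the column names are illustrative):

```python
import hashlib

def hub_hash_key(*business_keys):
    """Deterministic hash key for a Data Vault 2.0 hub row.

    Normalising (trim + upper-case) before hashing means the same business
    key always yields the same hash key, regardless of source formatting.
    """
    normalised = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

def build_hub(rows, key_column):
    """Reduce source rows to distinct business keys with their hash keys."""
    seen = {}
    for row in rows:
        seen.setdefault(hub_hash_key(row[key_column]), row[key_column])
    return [{"hash_key": hk, key_column: bk} for hk, bk in seen.items()]

hub = build_hub(
    [{"customer_id": "c1"}, {"customer_id": "c1"}, {"customer_id": "c2"}],
    "customer_id",
)
```

In a real Snowflake/dbt implementation this logic would live in a dbt model using a macro for the hash, but the invariant is the same: hubs hold one row per business key.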
Required Skills/Experience
The ideal candidate will have the following:
• Strong SQL; Snowflake ELT; dbt experience.
• Python/PySpark, ETL/ELT design.
• Data Vault 2.0 or dimensional modelling.
• AWS services (S3, Glue, Lambda, Redshift) or GCP equivalents.
• Experience with CI/CD for data pipelines.
Good to have skills
Although not essential, the following skills are desired by the client:
• Kafka/Kinesis, Airflow, CodePipeline.
• BI tools (Power BI/Tableau).
• Docker/OpenShift; metadata-driven pipelines.
• 3-8+ years of data engineering experience.
• Hands-on cloud data engineering and Snowflake/dbt exposure.
If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.
Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.