Apache Airflow Jobs in London
Overview
Looking for top Apache Airflow jobs in London? Discover the latest opportunities for data engineers, ETL developers, and workflow automation specialists on Haystack. Find your next role working with Apache Airflow in the heart of London’s thriving tech scene today!
Data Engineer
Premier IT
London
Hybrid
Mid - Senior
£60,000 - £70,000
RECENTLY POSTED

London / Reading – Hybrid | £60,000 – £70,000 | Azure Data / SQL / NHS

I'm currently working with a growing data and technology consultancy who are looking to add a Data Engineer to their expanding delivery team. The company works with organisations across several industries, including telecommunications, financial services, insurance and healthcare, helping them design and build modern data platforms and analytics capabilities.

As part of a client-facing consulting team, you will work on a variety of data engineering projects, many of them healthcare-based, helping to build, transform and structure large datasets so they can be used for reporting, analytics and operational systems. The role involves working closely with internal teams and client stakeholders to design scalable data pipelines and ensure reliable data delivery across modern cloud platforms. The company has two sites, in London and Reading, and would be looking for two days a week in the office.

Key Responsibilities

  • Develop and maintain data pipelines using Azure Data Factory (ADF)
  • Build and manage ETL processes using SQL and Python
  • Work with Azure Databricks and Azure SQL / SQL Server to process and store large datasets
  • Design and implement data models using Kimball, 3NF or dimensional modelling techniques
  • Build metadata-driven pipelines to automate data processing
  • Collaborate with cross-functional teams to understand client data requirements and deliver appropriate solutions
  • Ensure data quality, integrity and security throughout the pipeline lifecycle

Experience Required

  • Strong experience working within Azure data environments
  • Hands-on experience with Azure Data Factory, Azure Databricks and Azure SQL / SQL Server
  • Proficiency with SQL and Python
  • Previous experience in healthcare or NHS environments
  • Understanding of modern data architectures such as the medallion architecture
  • Experience working with data formats such as JSON, CSV and Parquet
  • Understanding of cloud security, IAM and networking concepts
  • Familiarity with Agile delivery environments
  • Experience with CI/CD pipelines (Azure DevOps or similar)

Nice to Have

  • Exposure to Apache Airflow or dbt
  • Experience with BigQuery (GCP)
  • Azure or cloud platform certifications
  • Experience with data governance frameworks

Salary: £60,000 - £70,000
Benefits: 25 Days Holiday, Bonus, Share Scheme, Health Insurance, Pension, Life Assurance

If this role sounds of interest, please apply and I can give you a call.

Tim Stock (phone number removed) | (phone number removed) (url removed) (url removed)
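The "metadata-driven pipelines" called out in the responsibilities above usually mean one generic loader driven by per-table config records, rather than a hard-coded job per table. A minimal Python sketch of the idea; the table names, sources and key columns are hypothetical:

```python
# Minimal sketch of a metadata-driven pipeline: one generic loader is
# driven by per-table config records instead of a hard-coded job per
# table. Table names, sources, and keys here are hypothetical.
import csv
import io

PIPELINE_CONFIG = [
    {"table": "patients", "source": "patients.csv", "key": "patient_id"},
    {"table": "visits", "source": "visits.csv", "key": "visit_id"},
]

def load_table(config, raw_text):
    """Parse CSV text and index its rows by the configured key column."""
    rows = csv.DictReader(io.StringIO(raw_text))
    return {row[config["key"]]: row for row in rows}

# Usage with inline sample data:
sample = "patient_id,name\n1,Alice\n2,Bob\n"
loaded = load_table(PIPELINE_CONFIG[0], sample)
```

Adding a new source then becomes a config change rather than new pipeline code; in ADF the same pattern is typically expressed with parameterised pipelines over a control table.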

Senior Founding Engineer
Corecom Consulting
London
Hybrid
Senior
£80,000 - £120,000
RECENTLY POSTED

Founding Engineer (EnergyTech / AI)

London | Hybrid (3 days onsite) | £70k - £120k + Equity

A venture-backed EnergyTech start-up is building a new type of power company designed for the electrified future.

As renewable energy adoption accelerates, the challenge is no longer just generating power, but storing, managing, and intelligently dispatching it. This team is developing software that sits at the centre of that transition, enabling a new generation of energy suppliers built around flexibility, storage, and intelligent automation.

Backed by leading investors and founded by operators with experience from some of the most respected names in the European energy ecosystem, the business has already launched its first product and is growing rapidly month-on-month. The next phase is building the core technology platform that will power a fully integrated energy supplier launching later this year.

This is an opportunity to join a small, highly capable engineering team building core infrastructure for a next-generation energy platform.

The Role

You will work across the full product stack, helping design and build the systems that power both the customer experience and the operational backbone of the platform.

Engineers here take ownership of problems end-to-end, from shaping early ideas through to delivering production features used by customers.

Responsibilities include:

  • Building full-stack product features across web, backend services and data systems
  • Designing APIs and integrating with hardware and energy data platforms
  • Developing AI-enabled capabilities, including LLM-powered workflows and operational tooling
  • Contributing to core supplier infrastructure such as billing systems, operational tooling, and internal platforms
  • Working closely with founders and customers to shape the product direction

Tech Stack

  • Frontend: React, React Native, TypeScript
  • Backend: Python, FastAPI, PostgreSQL
  • Infrastructure: GCP, Terraform, Airflow
  • AI: LLM integrations and AI-driven automation
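The "LLM-powered workflows" in the responsibilities often boil down to routing structured requests through a model call. A stubbed sketch in the stack's Python, where `fake_llm` stands in for a real model client and the labels and queue names are invented:

```python
# Stubbed LLM workflow: classify an inbound customer message and route it
# to a queue. fake_llm stands in for a real model API client; the labels
# and queue names are hypothetical.
def fake_llm(prompt: str) -> str:
    # A real implementation would call an LLM API here.
    return "billing" if "invoice" in prompt.lower() else "general"

def route_message(message: str) -> str:
    label = fake_llm(f"Classify this customer message: {message}")
    handlers = {"billing": "billing_queue", "general": "support_queue"}
    return handlers[label]

queue = route_message("Question about my invoice from March")  # → "billing_queue"
```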

What They’re Looking For

  • Experience as a Full-Stack Engineer building production systems
  • Strong engineering fundamentals and the ability to work across backend and frontend systems
  • Comfort operating in a fast-moving start-up environment where priorities evolve quickly
  • A proactive mindset with a bias towards ownership and solutions
  • Interest in the energy transition or climate technology is a strong plus

Engineers who thrive here tend to enjoy working close to the problem, communicating clearly across technical and non-technical teams, and taking initiative rather than waiting for direction.

Package

  • £70k - £120k salary depending on experience
  • Meaningful equity
  • Private health insurance
  • Meal allowance and team dinners
  • Hybrid working (London office 3 days per week)
Lead Python Data Engineer - Leading Technology AI Brand
MLR Associates
London
In office
Senior
£70,000 - £100,000
RECENTLY POSTED
  • Senior Engineer/Architect
  • Leading Technology AI Brand
  • SaaS - Platform based Technology Services
  • London/City
  • £70-100k salary + equity package

Our client, a global technology leader, is currently looking for a Senior/Lead Data Engineer to work with the dev team and guide the provision of software development for an exciting new AI product.

Key Responsibilities:-

  • Architect and build scalable data pipelines and infrastructure
  • Design and maintain data ingestion, transformation, and storage architectures for operational and AI workloads.
  • Develop and manage batch and real-time data pipelines.
  • Build and optimize systems for vector search, retrieval, and ML data pipelines.
  • Ensure data reliability, security, and governance across the platform.
  • Collaborate with AI and back-end engineering teams to support training, inference, and product features.
  • Implement monitoring, observability, and data quality frameworks.

Core Experience:-

  • 7+ years of experience in data engineering or back-end engineering roles.
  • Strong experience designing and building data pipelines and distributed data systems.
  • Experience working with relational databases (PostgreSQL preferred, but MySQL or similar is acceptable).
  • Experience with NoSQL databases.
  • Experience with vector databases used in modern AI systems.
  • Strong programming experience in Python.

Frameworks/Infrastructure:-

  • Apache Spark
  • Apache Airflow
  • Kafka
  • Elasticsearch/OpenSearch
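The "vector search" called out in the core experience reduces, at its simplest, to nearest-neighbour ranking over embeddings. An illustrative pure-Python sketch; real systems use a vector database with learned, high-dimensional embeddings, and the 3-d vectors here are made up:

```python
# Toy vector search: rank stored vectors by cosine similarity to a query.
# The 3-d embeddings are made up; production systems use a vector database
# with learned, high-dimensional embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

STORE = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}

def search(query, k=2):
    """Return the ids of the k stored vectors most similar to the query."""
    ranked = sorted(STORE, key=lambda d: cosine(query, STORE[d]), reverse=True)
    return ranked[:k]

top = search([1.0, 0.05, 0.0])
```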
Ventilation Engineer
RGE Services Ltd
London
In office
Junior - Mid
£40,000
RECENTLY POSTED

Ventilation Engineer (MVHR & Ventilation Systems Specialist)

Location: South UK
Salary: £38,000 - £42,000 per annum (DOE)
Company: RGE Services

RGE Services is a growing and reputable engineering services company specialising in ventilation and air management solutions across residential and commercial properties. Due to continued expansion, we are looking for an experienced and motivated Ventilation Engineer to join our team.

The Role

As a Ventilation Engineer, you will be responsible for the installation, maintenance, relocation, and replacement of a wide range of ventilation systems. This is a field-based role requiring high standards, strong fault-finding skills, and the ability to work independently.

Key Responsibilities

  • Installation, servicing, and commissioning of MVHR (Mechanical Ventilation with Heat Recovery) systems
  • Maintenance, fault diagnosis, and repair of ventilation fans
  • Installation and servicing of extraction units (bathroom, kitchen, whole-house systems)
  • Working on MEV (Mechanical Extract Ventilation) systems
  • Maintenance and replacement of inline fans, centrifugal fans, and commercial ventilation units
  • Ductwork installation, modification, and system upgrades
  • System balancing and airflow testing
  • Relocation and replacement of existing ventilation systems
  • Ensuring compliance with current Building Regulations and Health & Safety standards
  • Completing job sheets and reports accurately

Requirements

  • Proven experience working with MVHR and ventilation systems
  • Experience with extraction units and MEV systems
  • Strong understanding of airflow principles and system balancing
  • NVQ Level 2/3 in HVAC, Ventilation, or similar qualification (preferred)
  • PASMA / IPAF (desirable)
  • 18th Edition (desirable but not essential)
  • Full UK driving licence
  • Ability to work independently and as part of a team

What We Offer

  • Competitive salary (£38,000 - £40,000 DOE)
  • Company vehicle & fuel card
  • Tools and uniform provided
  • Ongoing training and development
  • Pension scheme
  • Paid holidays
  • Opportunity to grow within an expanding company
Data Engineer
Peregrine
London
In office
Senior
Private salary
RECENTLY POSTED

We are Data Services. Our mission is to unlock the value of data by delivering high-quality, reliable, and secure data services that are accessible, understandable, and actionable. We continuously evolve our offerings, leveraging modern cloud-based technologies, and fostering strong partnerships to help our colleagues in the Bank navigate the complexities of a data-driven world and achieve their strategic objectives.

Active SC Clearance

Job Description:

The world of data in central banking is evolving rapidly. With the rise of detailed data collection in financial regulation and the swift advancements in cloud-native data technologies, the demand for visionary data engineers is growing. We're seeking a senior Data Engineer to join our Data Engineering team and play a pivotal role in shaping the Bank's strategic cloud-first data platform.

As a senior member of the team, you will play a key role in designing and delivering robust, scalable data solutions that support the Bank's core responsibilities around monetary policy, financial stability, and regulatory supervision. You'll contribute to technical design decisions, mentor engineers, and collaborate across teams to ensure our data infrastructure continues to evolve and meet future demands.

Role Responsibilities

  • Lead the design, development, and deployment of scalable, secure, and cost-effective distributed data solutions using Azure services (e.g., Azure Databricks, Azure Data Lake Storage, Azure Data Factory).
  • Architect and implement advanced data pipelines using Databricks, Delta Lake, Python and Spark, ensuring performance, reliability, and maintainability across cloud and on-prem environments.
  • Champion data quality, governance, and observability, ensuring data is accurate, timely, and fit-for-purpose for analytics, BI, and operational use cases.
  • Drive the modernization of legacy systems, leading the migration of data infrastructure to Azure with minimal disruption and long-term scalability.
  • Act as a technical authority on Azure-native data engineering, guiding best practices and setting standards across the team.
  • Mentor and coach junior and mid-level engineers, fostering a culture of continuous learning, innovation, and technical excellence.
  • Collaborate with architects, analysts, and stakeholders to align data engineering efforts with strategic business goals and enterprise data strategy.
  • Evaluate and introduce emerging technologies, tools, and methodologies to enhance the Bank's data capabilities.
  • Own the end-to-end delivery of complex data solutions, from requirements gathering to production deployment and support.
  • Contribute to the development of reusable frameworks, templates, and patterns to accelerate delivery and ensure consistency across projects.
Minimum Criteria
  • Extensive experience with Azure services including Azure Databricks, Azure Data Lake Storage, and Azure Data Factory.
  • Advanced proficiency in SQL, Python, and Spark (PySpark), with a strong focus on performance optimization and distributed processing.
  • Proven experience in CI/CD practices using industry-standard tools (e.g., GitHub Actions, Azure DevOps).
  • Strong understanding of data architecture principles and cloud-native design patterns.
Essential Criteria
  • Demonstrated ability to lead technical delivery, mentor engineering teams and collaborate with stakeholders to ensure alignment between data solutions and business strategy.
  • Proficiency in Linux/Unix environments and shell scripting.
  • Deep understanding of source control, testing strategies, and agile development practices.
  • Self-motivated with a strategic mindset and a passion for driving innovation in data engineering.
Desirable Criteria
  • Experience delivering data pipelines on Hortonworks/Cloudera on-prem and leading cloud migration initiatives.
  • Familiarity with Apache Airflow
  • Data modelling and metadata management
  • Experience influencing enterprise data strategy and contributing to architectural governance.

Machine Learning Engineer (Python / MLE)
Sanderson Recruitment
London
Fully remote
Mid - Senior
£650/day - £750/day

Machine Learning Engineer (Python / MLE )

6 Month Contract

£650 - £750

Remote

Umbrella

Urgent Start

We are looking for a number of MLE / Machine Learning Engineers for a critical 6 month contract with a household name. This role is essential for maintaining the stability and performance of their core data and Machine Learning systems.

If your background is in diving into complex production codebases and you possess excellent problem-solving skills, this is the perfect opportunity. We need candidates ready to start by mid-to-late November.

This is a deeply technical engineering role focused less on new feature development and more on reliability and fixes.

Key Skills:

Investigating and debugging complex data flow and Machine Learning issues within a live, high impact production environment.

Extensive experience with Python, NumPy and Pandas is required for this role.

You must demonstrate a deep commercial background in the following areas:

Extensive Python: Very strong, production-level Python coding and debugging skills.

Production Environment: Proven experience working directly with and troubleshooting issues in live production codebases (not just isolated development).

Cloud Experience: Solid experience with any major public cloud provider (GCP, AWS, or Azure).

Experience with BigQuery would be good.

Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting.

Direct experience with Google Cloud Platform, BigQuery, and associated tooling.

Experience with workflow tools like Airflow or Kubeflow.

Familiarity with dbt (Data Build Tool).

Please send your CV for more information on these roles.

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

Machine Learning Engineer
Sanderson Recruitment
London
Fully remote
Mid - Senior
£700/day - £750/day

Machine Learning/Data Engineer

£700-750/day overall assignment rate to umbrella

Fully remote

3-6 month initial

Apply today to join a forward-thinking, tech-driven FTSE 100 organisation using data science and AI to enhance customer experience, optimise supply chains and drive sustainable growth. With 40% of sales from sustainable products, this is a company that combines scale, innovation and purpose.

As a Machine Learning Engineer, you’ll help maintain the stability and performance of core data and ML systems across Europe. This technical engineering role focuses on reliability, optimisation and critical fixes, ideal if you excel at investigating and debugging complex data flows and ML issues in live production environments.

We’re looking for individuals with:

Experience: Proven background as a Machine Learning Engineer.

Technical Skills: Strong in SQL and Python (Pandas, Scikit-learn, Jupyter, Matplotlib).

Data transformation & manipulation: experience with Airflow, dbt and Kubeflow.

Cloud: Experience with GCP and Vertex AI (developing ML services).

Expertise: Solid understanding of computer science fundamentals and time-series forecasting.

Machine Learning: Strong grasp of ML and deep learning algorithms (e.g. Logistic Regression, Random Forest, XGBoost, BERT, LSTM, NLP, Transfer Learning).
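As a concrete baseline for the time-series forecasting mentioned in the expertise above, a moving-average predictor is the usual starting point before any of the heavier models listed. A toy sketch with made-up data, illustrative only:

```python
# Toy moving-average forecast: the standard baseline for time-series work.
# Real pipelines would use the ML libraries listed above (scikit-learn,
# XGBoost, etc.); the series here is made-up sample data.
def moving_average_forecast(series, window=3):
    """Predict the next value as the mean of the last `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

pred = moving_average_forecast([10, 12, 11, 13, 12], window=3)  # → 12.0
```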

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

Senior Data Engineer
Hays Technology
London
In office
Senior
Private salary

Your new company
This is a financial institution with an office based in the City of London.

Your new role
You will be migrating data from on-prem onto Snowflake for a greenfield project.

What you’ll need to succeed

Strong Python and SQL are crucial to this role
On-prem to Snowflake migration experience
Airflow and dbt experience
GitHub experience as you will be building CI/CD pipelines as part of this role
Any experience within post-trade, OTC, swaps will be extremely beneficial
What you’ll get in return
An exciting opportunity to join an international organisation in financial services. Furthermore, a competitive day rate for this role will be offered in addition to your own dedicated Hays Consultant to guide you through every step of the application process.

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)

Data Engineer (Online Monitoring)
Advertising Standard Authority
London
Hybrid
Junior - Mid
£42,000 - £52,000

28-35 hours per week - open to discussing flexible working of these hours. Remote, with some attendance at our London office in Shoreditch.

The ASA is the UK's regulator of advertising across all media, including online. Our work includes taking proactive action against misleading, harmful, offensive or otherwise irresponsible ads and acting on complaints. In short, we make sure ads are legal, decent, honest and truthful.

In this role you will join our Data Science team and work on our world-leading Active Ad Monitoring system, which uses AI to proactively monitor online advertising. In 2025 the system captured and processed 60 million ads across social media, search and programmatic display. The ASA uses this intelligence to help regulate ads across high-priority topics like injectable weight-loss medications, green claims companies make to consumers, disclosure of influencer marketing and many more. You will help develop and maintain the tools we use to capture, process, and apply AI models to large datasets of ads within the Active Ad Monitoring system.

We're looking for someone who wants to use their skills and expertise to help shape a safer advertising landscape. Our team mission is to protect UK consumers from adverts that are misleading, cause harm and target those within our society who are the most vulnerable. Working as part of our small agile team you will have the opportunity to own your work end-to-end, seeing directly how the code you write helps protect UK consumers. You will work in a cloud-based environment, primarily in Python, and with a range of industry-standard tools such as Snowflake, Docker and Airflow. You will work primarily with unstructured data - namely ads in a variety of formats including images, videos and text from a range of online channels.

About you

  • You may not have been a Data Engineer before, but you will have the ability to work with data in Python to a professional standard and deliver high-quality code that works reliably in a production setting.
  • You'll be working with people from both technical and non-technical backgrounds, so you'll need to be adept at translating complex technical language for non-technical people.
  • You'll be impact-focused, understanding the problems the ASA faces and prioritising technical solutions that will deliver real impact.
  • You will need to be curious and ambitious, creatively solving problems that may arise whilst always having an eye on system/process improvements.
  • You'll enjoy working with others from different technical disciplines, each using your unique expertise to further the work, whilst also developing your own technical knowledge and skills.

We are committed to building a workforce that reflects the full diversity of the UK population. We believe that varied perspectives and experiences strengthen our organisation and help us deliver our work more effectively. We welcome applications from people of all backgrounds and identities, and we actively encourage candidates from minority or underrepresented groups to apply. Women are currently under-represented within data engineering roles, and within our Data Science team. In line with our commitment to equality, diversity and inclusion, we particularly encourage applications from women and others who are under-represented in this area. Our recruitment process ensures applications are absent of names or any identifiable information, which supports our aim of finding the best person for the role based on their skills and experience only.

How to apply: If you're interested in applying for this role, please review the job description below and complete our online application process, which includes answering some online questions regarding your motivation for applying for this role and your skills and experience. Closing date: 16th March 2026. Please note we will be reviewing applications as they come in and we reserve the right to close the advert early if we receive a significantly high number of applicants. Please feel free to use AI to enhance your application, but not to write it for you. We're interested to know your thoughts, experiences and ideas. You'll need to stand up what you've told us in your application if you attend an interview, so please make sure we feel the person we've met on paper is the person we meet in the room.

Heat Pump Engineer (Commissioning)
Ernest Gordon Recruitment
Multiple locations
In office
Junior - Mid
£60,000

London M25 Patch

£50,000 - £60,000 + Progression + Training + Company Benefits + No Overtime + Local Patch

Are you from a heat pumps, HVAC or MVHR background and want to join a company that has seen huge success in Europe, rivalling the likes of Daikin and Toshiba?

Do you want to work for an industry-leading business with a state-of-the-art new suite of heat pumps, providing full training from industry experts?

This company has gone from strength to strength in the last decade, their products being best in class and rivalling the usual go-to names. They have taken a huge amount of market share across Europe and are now finding the same success in the UK.

In this role you will be working the M25 patch with other inspectors. Your jobs will be organised by location, meaning you won't be spending all day in traffic and will be able to return home at a reasonable hour.

This business is the UK supplier of ventilation, heat recovery, and heat pump systems, providing warranty support, technical diagnostics, commissioning assistance, and product expertise to installers, developers, and end-users.

This position supports the company's role as the primary UK technical contact for their systems, offering advanced site-based troubleshooting and controller configuration when third-party engineers are unable to resolve issues.

THE ROLE:

  • London M25 patch shared with other inspectors
  • Carrying out diagnostics and commissioning
  • Performing airflow balancing and assisting with system commissioning
  • Diagnosing issues with controllers, heat pump components, airflow performance and ventilation

THE PERSON:

  • Experience in either, HVAC, MVHR, or Heat Pumps
  • London based
  • Full UK driving licence

Reference: BBBH23065

Keywords: AC, HVAC, Air Con, Heat Pumps, Heat Exchange, Airflow, Testing, Field, M25, London

If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV.

We are an equal opportunities employer and welcome applications from all suitable candidates. The salary advertised is a guideline for this position. The offered remuneration will be dependent on the extent of your experience, qualifications, and skillset.

Ernest Gordon Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job, you accept the T&C’s, Privacy Policy and Disclaimers which can be found at our website

Senior Data Engineer (AWS, Airflow, Python)
Triad
London
Remote or hybrid
Senior
£60,000 - £65,000

Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices.
Salary up to £65k plus company benefits.

About Us

Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years’ experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers.

At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you’re valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you’ll be trusted, challenged, and empowered to grow.

We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you’re passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.

Glassdoor score of 4.7

96% of our staff would recommend Triad to a friend

100% CEO approval

See for yourself some of the work that makes us all so proud:

Helping law enforcement with secure intelligence systems that keep the UK safe

Supporting the UK’s national meteorological service in leveraging supercomputers for next-level weather forecasting

Assisting a UK government department responsible for consumer product safety with systems to track unsafe products

Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport

Role Summary

Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.

Key Responsibilities

Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.

Create and manage data models that support efficient storage, retrieval, and analysis of data.

Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda to architect and maintain cloud data solutions.

Maintain modular Terraform based IaC for reliable provisioning of AWS infrastructure.

Develop, optimise and maintain robust data pipelines using Apache Airflow.

Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.

Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.

Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.

Identify data quality issues and implement data validation and cleansing processes.

Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.

Work within a continuous integration environment with automated builds, deployments, and testing.

Skills and Experience

  • Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
  • Excellent proficiency in developing ETL processes and data transformation workflows.
  • Strong SQL skills (PostgreSQL) and advanced Python coding capability (essential).
  • Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda (essential).
  • Understanding of Terraform codebases to create and manage AWS infrastructure.
  • Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
  • Familiarity with distributed data processing systems such as Spark or Databricks.
  • Experience working with high-performing, low-latency, or large-volume data systems.
  • Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
  • Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.

Qualifications & Certifications

  • A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable).
  • Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK.

Triad’s Commitment to You

As a growing and ambitious company, Triad prioritises your development and well-being:

  • Continuous Training & Development: Access to top-rated Udemy Business courses.
  • Work Environment: Collaborative, creative, and free from discrimination.
  • Benefits:
    • 25 days of annual leave, plus bank holidays.
    • Matched pension contributions (5%).
    • Private healthcare with Bupa.
    • Gym membership support or Lakeshore Fitness access.
    • Perkbox membership.
    • Cycle-to-work scheme.

What Our Colleagues Have to Say

Please see for yourself on Glassdoor and our “Day in the Life” videos at the bottom of our Careers Page.

Our Selection Process

After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:

  1. A technical test including numerical, logical and verbal reasoning
  2. A technical interview with our consultants
  3. A management interview to assess cultural fit

We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.

Other Information

If this role is of interest to you or you would like further information, please contact Ryan Jordan and submit your application now.

Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.

SC Cleared Data Engineer
83zero Ltd
City of London
In office
Mid - Senior
£500/day - £550/day

Day rate: £500 - £550 Inside IR35

Location: London

Key Responsibilities

  • Design, build, and maintain scalable data pipelines, ETL processes, and data integrations.
  • Develop and optimize data models, storage solutions, and analytics environments.
  • Partner with UX/UI designers to create user-friendly dashboards, data tools, and internal products.
  • Implement visualizations that make complex datasets understandable for technical and non-technical users.
  • Work with cross-functional teams to translate product requirements into technical designs.
  • Ensure data quality, governance, and best practices across systems.
  • Contribute to the evolution of our design systems and front-end components for data tools.

Required Skills & Experience

  • Proven experience as a Data Engineer, BI Engineer, or similar role.
  • Strong proficiency in SQL, Python, and modern data engineering frameworks (e.g., Airflow, dbt, Spark).
  • Experience with cloud platforms such as AWS, Azure, or GCP.
  • Solid understanding of data warehousing and ETL/ELT architecture.
  • Demonstrable UX/UI skills: wireframing, prototyping, and designing clean, intuitive interfaces.
  • Experience with front-end technologies (e.g., React, Vue, or similar) is a plus.
  • Familiarity with visualization tools (e.g., Tableau, Power BI, or custom solutions).

Snowflake Data Engineer
Square One Resources
London
Hybrid
Mid - Senior
£550/day - £600/day

Job Title: Snowflake Data Engineer
Location: London (2 days on-site per week)
Salary/Rate: £550 - £600 per day inside IR35
Start Date: March
Job Type: Initial 3-6 month contract

Company Introduction
We have an exciting opportunity now available with one of our sector-leading consultancy clients! They are currently looking for a skilled Snowflake Data Engineer to support their cloud migration project.

Job Responsibilities/Objectives
You will be responsible for designing and building scalable data pipelines, Data Vault and dimensional models, and Snowflake/dbt workloads for cloud migration projects.

  • Implement Data Vault 2.0 (Hubs, Links, Satellites) or dimensional models on Snowflake.
  • Build ELT pipelines using Snowflake, dbt, and Python/PySpark.
  • Develop ingestion from APIs, databases, and streams.
  • Optimize Snowflake warehouses, cost, and performance.
  • Collaborate with architects, analysts, and DevOps.
  • Maintain documentation, lineage, and governance standards.
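The Data Vault 2.0 work described above revolves around hash keys that identify hub, link, and satellite rows. As a rough illustration of the common convention in Python (the delimiter, normalisation, and MD5 choice are assumptions about one widespread style; in dbt projects this logic typically lives in macros supplied by community packages rather than hand-rolled code):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0 style hash key: trim and upper-case each business key,
    join with a delimiter, then MD5-hash the result (one common convention).
    Normalisation makes the key stable across source-system whitespace/casing."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# A hub row is keyed on a single business key (illustrative values)...
customer_hk = hash_key("C-1001")

# ...while a link row hashes the combination of the related hubs' business keys.
order_link_hk = hash_key("C-1001", "ORD-42")
```

The normalisation step is what makes the scheme deterministic: the same business key loaded from two systems with different casing or padding still resolves to the same hub row.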

Required Skills/Experience
The ideal candidate will have the following:

  • Strong SQL; Snowflake ELT and dbt experience.
  • Python/PySpark and ETL/ELT design.
  • Data Vault 2.0 or dimensional modelling.
  • AWS services (S3, Glue, Lambda, Redshift) or GCP equivalents.
  • Experience with CI/CD for data pipelines.

Good to have skills
Although not essential, the following skills are desired by the client:

  • Kafka/Kinesis, Airflow, CodePipeline.
  • BI tools (Power BI/Tableau).
  • Docker/OpenShift; metadata-driven pipelines.

  • 3-8+ years of Data Engineering experience.
  • Cloud data engineering and hands-on Snowflake/dbt exposure.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidance given as to the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.

Frequently asked questions
What types of Apache Airflow roles are available in London?
In London, you can find a variety of Apache Airflow roles including Data Engineer, DevOps Engineer, Cloud Engineer, Workflow Automation Specialist, and Big Data Engineer positions. These roles typically involve designing, building, and maintaining data pipelines using Apache Airflow.

What skills do employers look for in Apache Airflow candidates?
Employers in London typically seek candidates with strong experience in Apache Airflow, Python programming, SQL, cloud platforms (such as AWS, GCP, or Azure), and knowledge of ETL processes. Familiarity with containerization tools like Docker and orchestration tools like Kubernetes is also highly valued.

Are remote or hybrid Apache Airflow roles available in London?
Yes, many companies in London offer remote or hybrid work arrangements for Apache Airflow roles, especially post-pandemic. When browsing jobs, look for listings that specify remote-friendly or hybrid options.

What is the average salary for Apache Airflow roles in London?
The average salary for Apache Airflow roles in London varies depending on experience and job level but generally ranges from £50,000 to £90,000 per year. Senior or specialist roles can offer salaries above this range.

How can I improve my chances of landing an Apache Airflow role in London?
To increase your chances, build a solid portfolio showcasing your experience with Apache Airflow, contribute to open-source projects, acquire relevant certifications, and stay updated with the latest cloud and workflow automation technologies. Networking with local tech communities and attending industry events in London can also be very helpful.