Make yourself visible and let companies apply to you.
Roles
Data Engineer Jobs in London
Overview
Looking for top Data Engineer jobs in London? Explore the latest opportunities on Haystack, your go-to IT job board connecting skilled data engineers with leading companies in the heart of the UK’s tech hub. Whether you're an experienced data engineer or just starting your career, find roles that match your skills and ambitions in London’s vibrant tech scene. Start your job search today and take the next step in your data engineering career!
Senior Data Engineer
Lynx Recruitment Limited
London
Hybrid
Senior
£85,000
RECENTLY POSTED
+1

Are you a Data Engineer passionate about building scalable, cloud-first data platforms that power business insights and innovation? Our client is seeking an experienced engineer to design robust, secure, and future-ready data solutions that support analytics, AI, and strategic decision-making.

What You'll Do:

  • Build, test, and maintain ETL/ELT pipelines for structured and unstructured data.
  • Design and optimise data warehouses, lakes, and cloud-native platforms.
  • Implement monitoring, security, and compliance frameworks in line with regulations (e.g., GDPR, financial standards).
  • Deliver high-quality datasets to analysts, data scientists, and business stakeholders.
  • Work with modern technologies, including cloud platforms, orchestration tools, and streaming frameworks.

What We're Looking For:

  • Strong SQL & Python skills.
  • Hands-on experience with ETL/ELT tools (e.g., Matillion, Talend, FiveTran, Azure Data Factory).
  • Experience designing and managing cloud data platforms (Snowflake essential; AWS/Azure/GCP desirable).
  • Solid data modelling knowledge and performance optimisation skills.
  • Awareness of data governance, security, and regulatory compliance.
  • Ability to work closely with technical and business teams to deliver impactful solutions.

Nice-to-Haves:

  • Orchestration tools (Airflow, dbt, Prefect) or streaming frameworks (Kafka, Kinesis).
  • CI/CD experience and DevOps practices for data workflows.
  • Exposure to data visualisation platforms (PowerBI, Tableau, MicroStrategy).
  • Experience in financial services or other highly regulated industries.
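
For candidates gauging the level these bullets imply, here is a minimal, self-contained sketch of an ETL step — extract, clean, load, query — using an in-memory SQLite store. The records and field names are invented for illustration; a production pipeline would target a warehouse such as Snowflake rather than SQLite.

```python
import sqlite3

# Hypothetical raw records -- stand-ins for the structured data an
# ETL/ELT pipeline of this kind would ingest.
raw_rows = [
    {"id": 1, "amount": "100.50", "region": " london "},
    {"id": 2, "amount": "80.00", "region": "LONDON"},
    {"id": 3, "amount": None, "region": "Leeds"},  # record failing a quality gate
]

def transform(rows):
    """Clean and normalise: drop rows with missing amounts, tidy strings."""
    for r in rows:
        if r["amount"] is None:
            continue  # basic data-quality gate
        yield r["id"], float(r["amount"]), r["region"].strip().lower()

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INT, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(raw_rows), conn)
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'london'"
).fetchone()[0]
print(total)  # 180.5
```

The same extract/transform/load shape carries over directly to the cloud tooling named above; only the storage and orchestration layers change.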
DV Cleared Data Pipeline Developer
Sanderson Government and Defence
London
In office
Senior
£600/day - £650/day
RECENTLY POSTED

DV Cleared Data Pipeline Developer - Central Government (Contract)

Duration: ASAP start (4 weeks onboarding) until 31 October 2026
Rate: £600-£650 per day (Outside IR35)
Location: Vauxhall until May, then Stratford - 5 days per week onsite
Clearance: UKSV DV (active, minimum 12 months remaining)
Nationality: UK Nationals only

We are seeking an experienced DV Cleared Data Pipeline Developer to support a large UK Central Government department on a long-term, high-assurance programme. This engagement is Outside IR35, operating on a project-based delivery model with a clear statement of work and deliverables.

The Role

You will be responsible for the design, build, and delivery of secure data pipelines and orchestration platforms within a highly classified environment. The role focuses on outcomes and technical delivery rather than headcount substitution, aligning with Outside IR35 status.

Key responsibilities include:

  • Designing, building, and delivering data pipelines and orchestration workflows using Prefect or Airflow
  • Developing Python-based services and APIs for data ingestion and enrichment
  • Building and deploying containerised workloads using Docker and Kubernetes (or Swarm)
  • Managing dependencies and build pipelines in air-gapped environments with no internet access
  • Implementing and maintaining data governance and secure data handling practices
  • Automating and configuring business workflow tooling, including Jira-based processes
  • Working with Elastic, MongoDB, Cassandra, Redis, and S3-compatible storage
  • Producing technical documentation and delivering agreed outcomes against the statement of work
  • Collaborating with platform, DevOps, and security teams while retaining autonomy over technical approach
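
As a rough illustration of the orchestration pattern the first responsibility refers to — Prefect and Airflow both formalise dependency-ordered task execution — here is a toy stand-in in plain Python using the standard library's topological sorter. Task names and payloads are invented:

```python
from graphlib import TopologicalSorter

# Toy orchestration sketch: a workflow is just tasks plus dependencies,
# executed in dependency order. Prefect/Airflow add scheduling, retries,
# and observability on top of exactly this idea.
tasks = {
    "ingest":  lambda ctx: ctx.setdefault("raw", [3, 1, 2]),
    "enrich":  lambda ctx: ctx.setdefault("enriched", sorted(ctx["raw"])),
    "publish": lambda ctx: ctx.setdefault("out", ctx["enriched"][-1]),
}
deps = {"enrich": {"ingest"}, "publish": {"enrich"}}

ctx = {}
for name in TopologicalSorter(deps).static_order():
    tasks[name](ctx)

print(ctx["out"])  # 3
```

In an air-gapped environment the orchestrator itself (and every dependency) must come from an internal mirror, which is why the role calls out build management without external connectivity.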

About You

Essential experience:

  • Active UKSV DV clearance (minimum 12 months remaining)
  • Strong Python development experience in secure or restricted environments
  • Proven delivery of data pipelines using Prefect or Airflow
  • Hands-on experience with Docker, Kubernetes, and container orchestration
  • Experience operating in air-gapped systems, managing builds without external connectivity
  • Strong understanding of data governance, security, and compliance
  • Experience building APIs for data services
  • Confident working 5 days per week onsite in secure locations (Vauxhall / Stratford)
  • Comfortable delivering against clearly defined outcomes, consistent with Outside IR35 engagements

Nice to Have

  • Experience with AI / ML, including PyTorch, NLP, or LLMs
  • Advanced data handling and large-scale processing experience
  • Familiarity with the Atlassian Suite (Jira, Confluence)
  • Experience using GitHub in mirrored or restricted environments
  • Strong Linux CLI skills

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

Informatica MDM Engineer
Stackstudio Digital Ltd.
London
Hybrid
Senior
£375,000 - £400,000
RECENTLY POSTED

Role Details

Role / Job Title: Informatica MDM Engineer
Work Location: 250 Bishopsgate, London, UK
Mode of Working: Hybrid (2 days per week in office)

The Role

We are seeking an Informatica MDM Engineer with strong on-prem MDM and IDMC experience to lead our migration to the IDMC cloud. The role requires solid expertise in MDM architecture, data integration, and cloud migration.

Your Responsibilities

Design, develop, and implement Informatica MDM solutions on-premises and on IDMC Cloud platforms.
Lead the migration process of MDM systems from on-premises infrastructure to Informatica IDMC Cloud.
Collaborate with data architects, business analysts, and IT teams to define migration scope, strategy, and timelines.
Perform data profiling, data cleansing, and data quality checks to ensure integrity during migration.
Develop and maintain MDM workflows, match/merge rules, survivorship rules, and business rules.
Configure and customize Informatica MDM and IDMC Cloud components to meet business requirements.
Troubleshoot issues related to MDM data models, hub servers, IDMC services, and integration points.
Assist in performance tuning and optimization of MDM and IDMC processes.
Document migration processes, architecture changes, and provide training/support to internal teams.
Stay updated with the latest Informatica IDMC Cloud features and MDM best practices.

Your Profile

Essential Skills / Knowledge / Experience

Bachelor’s degree in Computer Science, Information Technology, or a related field.
7+ yrs of proven experience with Informatica MDM on-premises implementations.
Hands-on experience with Informatica Intelligent Data Management Cloud (IDMC), including Cloud MDM modules.
Strong understanding of MDM concepts: data modelling, survivorship, match and merge strategies.
Experience in migrating data and applications from on-premises to cloud environments.
Proficient in ETL processes, data integration, and data governance frameworks.
Solid knowledge of SQL and relational databases.
Familiarity with cloud platforms and services (AWS, Azure, or GCP) is an advantage.
Excellent analytical, problem-solving, and communication skills.
Ability to work collaboratively in a cross-functional team environment.

Desirable Skills / Knowledge / Experience

Informatica MDM and IDMC certifications
Experience with Agile/Scrum methodologies
Knowledge of data privacy and compliance standards (GDPR, HIPAA, etc.)
Exposure to other Informatica products like PowerCenter, Data Quality, or EDC
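
To ground the match/merge and survivorship terminology used above, here is a toy sketch: candidate records sharing a match key are merged, and for each field the most recent non-null value survives into the "golden" record. The records and the recency rule are invented for illustration; in Informatica MDM, match rules and survivorship are configured declaratively rather than hand-coded.

```python
# Toy match/merge + survivorship sketch (illustrative only).
records = [
    {"key": "cust-1", "updated": "2024-01-01", "email": "old@x.com", "phone": "111"},
    {"key": "cust-1", "updated": "2024-06-01", "email": "new@x.com", "phone": None},
    {"key": "cust-2", "updated": "2024-03-01", "email": "b@y.com", "phone": "222"},
]

golden = {}
# Process oldest-first so later non-null values overwrite earlier ones.
for rec in sorted(records, key=lambda r: r["updated"]):
    g = golden.setdefault(rec["key"], {})
    for field, value in rec.items():
        if value is not None:
            g[field] = value  # recency-based survivorship

print(golden["cust-1"]["email"], golden["cust-1"]["phone"])  # new@x.com 111
```

Note how the null phone on the newer cust-1 record does not clobber the older value — that gap-filling behaviour is the essence of a survivorship rule.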

Senior Data Engineer
Lynx Recruitment Limited
London
Remote or hybrid
Senior
£85,000
RECENTLY POSTED
+3

Are you a Data Engineer passionate about building scalable, cloud-first data platforms that power business insights and innovation? Our client is seeking an experienced engineer to design robust, secure, and future-ready data solutions that support analytics, AI, and strategic decision-making.

What You'll Do:

Build, test, and maintain ETL/ELT pipelines for structured and unstructured data.

Design and optimise data warehouses, lakes, and cloud-native platforms.

Implement monitoring, security, and compliance frameworks in line with regulations (e.g., GDPR, financial standards).

Deliver high-quality datasets to analysts, data scientists, and business stakeholders.

Work with modern technologies, including cloud platforms, orchestration tools, and streaming frameworks.

What We're Looking For:

Strong SQL & Python skills.

Hands-on experience with ETL/ELT tools (e.g., Matillion, Talend, FiveTran, Azure Data Factory).

Experience designing and managing cloud data platforms (Snowflake essential; AWS/Azure/GCP desirable).

Solid data modelling knowledge and performance optimisation skills.

Awareness of data governance, security, and regulatory compliance.

Ability to work closely with technical and business teams to deliver impactful solutions.

Nice-to-Haves:

Orchestration tools (Airflow, dbt, Prefect) or streaming frameworks (Kafka, Kinesis).

CI/CD experience and DevOps practices for data workflows.

Exposure to data visualisation platforms (PowerBI, Tableau, MicroStrategy).

Experience in financial services or other highly regulated industries.


OpenShift Telemetry Engineer
Stackstudio Digital Ltd.
London
Hybrid
Mid - Senior
£450,000 - £500,000
RECENTLY POSTED

OpenShift Telemetry Job Description

Role Overview

Role / Job Title: OpenShift Telemetry

Work Location: London / Sheffield
Mode of Working: Hybrid
Office Requirement (if hybrid): 10 days a month

The Role

We are seeking a skilled OpenShift Telemetry Engineer to join our team.

Your Responsibilities

In this role, you will be:

Primarily responsible for implementing, managing, and optimizing the observability stack within a Red Hat OpenShift Container Platform environment to ensure system health, performance, and security.

Bridging the gap between application monitoring and infrastructure, leveraging a range of observability tools.

Your Profile

Essential Skills / Knowledge / Experience

Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
Ensure security, compliance, and best practices for data pipelines and observability platforms.
Document data flows, schemas, dashboards, and operational runbooks.

Desirable Skills / Knowledge / Experience / Personal Attributes

Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/KSQL/Kafka Streams).
Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
Experience integrating telemetry into Splunk (HEC, UF, source types, CIM), building dashboards and alerting.
Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
Understanding of hybrid cloud and multi-cluster telemetry patterns.
Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
Good problem-solving skills and ability to work in a collaborative team environment.
Strong communication and documentation skills.
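
To make the schema-management and enrichment responsibilities above concrete, here is a hand-rolled sketch of a validate-and-enrich step of the kind a Kafka consumer might apply before forwarding events to Splunk. The event fields, the minimal type-based schema, and the tenant-derivation rule are all assumptions for illustration, not the actual platform schema (which the listing says would use Avro/Protobuf with a registry):

```python
import json

# Minimal stand-in for the stream layer: a producer emits JSON telemetry
# events; a consumer validates against a schema and enriches them.
SCHEMA = {"cluster": str, "metric": str, "value": (int, float)}

def validate(event: dict) -> bool:
    """Check that every schema field is present with the expected type."""
    return all(isinstance(event.get(k), t) for k, t in SCHEMA.items())

def enrich(event: dict) -> dict:
    """Derive a tenant label from the cluster name (hypothetical rule)."""
    return {**event, "tenant": event["cluster"].split("-")[0]}

wire = json.dumps({"cluster": "prod-eu1", "metric": "cpu_usage", "value": 0.82})
event = json.loads(wire)
assert validate(event)
print(enrich(event)["tenant"])  # prod
```

A real deployment would replace the type dictionary with a registered Avro/Protobuf schema so producers and consumers can evolve independently — the compatibility concern the desirable-skills list calls out.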

Senior Data Engineer (AWS, Airflow, Python)
Triad
London
Remote or hybrid
Senior
£60,000 - £65,000
RECENTLY POSTED
+1

Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices.
Salary up to £65k plus company benefits.

About Us

Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years’ experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers.

At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you’re valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you’ll be trusted, challenged, and empowered to grow.

We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you’re passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.

Glassdoor score of 4.7

96% of our staff would recommend Triad to a friend

100% CEO approval

See for yourself some of the work that makes us all so proud:

Helping law enforcement with secure intelligence systems that keep the UK safe

Supporting the UK’s national meteorological service in leveraging supercomputers for next-level weather forecasting

Assisting a UK government department responsible for consumer product safety with systems to track unsafe products

Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport

Role Summary

Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.

Key Responsibilities

Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.

Create and manage data models that support efficient storage, retrieval, and analysis of data.

Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda to architect and maintain cloud data solutions.

Maintain modular Terraform based IaC for reliable provisioning of AWS infrastructure.

Develop, optimise and maintain robust data pipelines using Apache Airflow.

Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.

Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.

Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.

Identify data quality issues and implement data validation and cleansing processes.

Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.

Work within a continuous integration environment with automated builds, deployments, and testing.

Skills and Experience

  • Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
  • Excellent proficiency in developing ETL processes and data transformation workflows.
  • Strong SQL skills (PostgreSQL) and advanced Python coding capability (essential).
  • Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda (essential).
  • Understanding of Terraform codebases to create and manage AWS infrastructure.
  • Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
  • Familiarity with distributed data processing systems such as Spark or Databricks.
  • Experience working with high-performing, low-latency, or large-volume data systems.
  • Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
  • Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.
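
As one small illustration of the validation-and-cleansing responsibility listed above, a rule-based data-quality check might be sketched like this; the field names and rules are hypothetical:

```python
from datetime import date

def check_record(rec: dict) -> list[str]:
    """Return a list of data-quality issues for one incoming record."""
    issues = []
    if not rec.get("id"):
        issues.append("missing id")
    if rec.get("amount", 0) < 0:
        issues.append("negative amount")
    # ISO date strings compare correctly as plain strings.
    if rec.get("as_of") and rec["as_of"] > date.today().isoformat():
        issues.append("future-dated")
    return issues

good = {"id": "a1", "amount": 10, "as_of": "2020-01-01"}
bad  = {"id": "", "amount": -5}
print(check_record(good), check_record(bad))
```

In an Airflow pipeline, a check like this would typically run as its own task so that bad batches can be quarantined (and alerted on) before downstream loads run.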

Qualifications & Certifications

  • A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable).
  • Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK.

Triad’s Commitment to You

As a growing and ambitious company, Triad prioritises your development and well-being:

  • Continuous Training & Development: Access to top-rated Udemy Business courses.
  • Work Environment: Collaborative, creative, and free from discrimination.
  • Benefits:
    • 25 days of annual leave, plus bank holidays.
    • Matched pension contributions (5%).
    • Private healthcare with Bupa.
    • Gym membership support or Lakeshore Fitness access.
    • Perkbox membership.
    • Cycle-to-work scheme.

What Our Colleagues Have to Say

Please see for yourself on Glassdoor and our “Day in the Life” videos at the bottom of our Careers Page.

Our Selection Process

After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:

  1. A technical test including numerical, logical and verbal reasoning
  2. A technical interview with our consultants
  3. A management interview to assess cultural fit

We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.

Other Information

If this role is of interest to you or you would like further information, please contact Ryan Jordan and submit your application now.

Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.

Lead Big Data Ops Engineer
Hunter Bond
London
Hybrid
Senior
£90,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

My leading tech client is looking for a talented and motivated individual to ensure the resilience, performance, and cost-effectiveness of their Azure-based data platform. This role is essential to their data ecosystem, combining platform reliability, incident response, SLA management, cost optimization (FinOps), and deployment oversight.

You will be the single point of contact for operational issues, driving rapid resolution during outages, leading communications with stakeholders, and shaping the processes that keep their platform running smoothly and efficiently.

This is a newly created role in a growing business. A brilliant opportunity!

The following skills/experience is required:

  • Proven operational leadership for large-scale data platforms.
  • Expertise in incident management, SLA enforcement, and stakeholder communication.
  • Hands-on experience with Azure Synapse, Databricks, ADF, Power BI.
  • Familiarity with CI/CD and automation.
  • Strong FinOps mindset and cost management experience.
  • Knowledge of monitoring and observability frameworks.

Salary: Up to £90,000 + bonus + package

Level: Lead Engineer

Location: London (good work from home options available)

If you are interested in this Lead Big Data Ops Engineer position and meet the above requirements please apply immediately.

Data Scientist
Randstad Technologies Recruitment
London
Remote or hybrid
Mid - Senior
£479/day - £565/day
RECENTLY POSTED

Data Scientist: Country Risk & Advanced Analytics

Join an integrated team of economists, political scientists, and computer scientists to shape the strategic decisions of the world’s leading organizations.

How You’ll Make an Impact:

  • Innovate: Prototype new approaches for extracting insights from structured and unstructured data.
  • Build: Design and optimize risk models for analytics and generative AI applications using proprietary NLP data.
  • Collaborate: Partner with cross-domain experts to turn non-technical ideas into scalable, interpretable research designs.
  • Deploy: Develop and maintain robust ML pipelines for both experimentation and production.

Who You Are:

  • Technical Expert: You have substantial experience with Python or R, and are skilled in querying and analyzing big data.
  • NLP Specialist: You have a proven track record of developing and refining NLP models.
  • Clear Communicator: You can explain complex ML/NLP methodologies to non-technical stakeholders with ease.
  • Methodical: You are familiar with experiment tracking (DVC, Weights & Biases) and model evaluation metrics.

Stand Out From the Crowd:

Candidates with an advanced degree in ML/NLP, exposure to cloud platforms (AWS, Databricks, Snowflake), or experience in agile, fast-paced environments are highly encouraged to apply, or share your updated CV to saisaranya.gummadi @

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Lead Data Engineer
83zero Ltd
London
In office
Senior
£90,000 - £100,000
RECENTLY POSTED
+1

Lead Data Engineer (with Data Analytics Background)

Location: City Of London

Employment Type: Full-time

Salary: £90,000 - £100,000

Sector: Fintech / Payments

Overview

We are looking for a highly skilled Lead Data Engineer with a strong foundation in data analytics to join a growing team. The ideal candidate will have previously worked as a Data Analyst and since transitioned into a more engineering-focused role. You’ll help us scale our data infrastructure, design and build robust data models, and contribute directly to our data platform’s evolution.

This is a hands-on role where you’ll be expected to hit the ground running, contribute to ongoing projects with minimal hand-holding, and help us maintain (and improve) the current team’s velocity.

Key Responsibilities

Design, develop, and maintain data models to support analytical and operational use cases.
Write efficient, production-grade SQL to build data pipelines and transformations.
Develop and maintain data workflows and automation scripts in Python.
Collaborate with analysts, engineers, and stakeholders to deliver high-quality data solutions.
(Optional but highly valued) Contribute to our infrastructure as code efforts using tools like Terraform.
Work with modern data warehousing technologies such as Snowflake to ensure scalable and high-performing solutions.

Skills & Experience

5+ years of experience in data roles, ideally transitioning from Data Analyst to Data Engineer.
Proven expertise in SQL and building complex data models.
Strong proficiency in Python for data processing, ETL, and workflow automation.
Experience with cloud data platforms (Snowflake experience highly desirable).
Exposure to or experience with Terraform or similar infrastructure-as-code tools is a strong plus.
Comfortable working in fast-paced environments and able to contribute quickly without extensive onboarding.

Nice to Have

Experience with modern data stack tools (e.g., dbt, Airflow, etc.).
Understanding of CI/CD pipelines and data infrastructure automation.
Familiarity with data governance, security, and best practices in a cloud environment.

Principal Data Engineer (GCP)
VIQU IT Recruitment
London
Hybrid
Senior
£100,000
RECENTLY POSTED

Northwest – Hybrid

Up to £100,000

VIQU are seeking a Principal Data Engineer to join a leading social enterprise that reinvests profits to create thriving, sustainable communities. Following a full transition to a 100% cloud-based data platform, this role will play a key part in shaping and leading the organisation’s data engineering capability, with a strong focus on technical leadership, platform design and mentoring engineers within a Google Cloud environment.

Key Responsibilities of the Principal Data Engineer:

Provide technical leadership, coaching and mentoring to data engineers

Define how data engineering projects are approached, delivered and governed

Work closely with architects to design and evolve the cloud data platform

Ensure robust, scalable and compliant data pipelines across the platform

Lead best practices around data ingestion, transformation, quality, security and monitoring

Drive automation, CI/CD adoption and continuous improvement

Challenge technical decisions and raise engineering standards across the team

Key Requirements of the Principal Data Engineer:

Extensive experience in Data Engineering, with time spent in a Lead or Principal role

Strong background working in a cloud-native data platform

Hands-on experience with Google Cloud Platform is highly preferred, with consideration for AWS or Azure alongside some GCP

Experience with Terraform, Docker and dbt

Strong expertise in Data Lake / Data Warehouse solutions

Advanced SQL and Python skills

Proven experience optimising complex queries

Strong understanding of Data Governance, including lineage, data legislation and PII

Experience working within Agile / Scrum and the SDLC

Willingness to undergo a DBS check

Apply now to speak with VIQU IT in confidence. Or reach out to Katie Dark via the VIQU IT website.

Do you know someone great? We’ll thank you with up to £1,000 if your referral is successful (terms apply). For more exciting roles and opportunities like this, please follow us on LinkedIn @VIQU IT Recruitment


Data Engineer – TV Advertising Data (FAST)
Datatech
London
Hybrid
Mid - Senior
£75,000 - £85,000
RECENTLY POSTED

Data Engineer - TV Advertising Data (FAST)

Location: London - 3 days onsite
Salary: £75,000 - £85,000 (negotiable DOE)
Reference: J13057
Note: Full and current UK working rights required for this role

We’re currently seeking a Data Engineer to build the foundations behind the rapidly growing FAST (Free Ad-Supported Streaming TV) channels. This is a pioneering opportunity to be involved with direct-to-consumer advertising for a global player in the field. We’re looking for someone who is passionate about how data drives the industry and who can help optimise campaigns, measure performance, and monetise content.

Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines that transform raw data into reliable, analytics-ready datasets
  • Ingest, integrate, and manage new data sources across advertising, audience, platform, and content data within Fremantle’s Microsoft Fabric environment
  • Deliver robust data flows that underpin global FAST dashboards, monetisation insights, and audience viewing metrics
  • Work closely with the central Data & Analytics team to enable high-quality Power BI reporting and analysis
  • Ensure strong data governance, integrity, and security across the Azure/Fabric ecosystem
  • Optimise data pipelines for performance, scalability, and efficiency, following best-practice engineering standards including version control and code reviews
  • Monitor pipeline health, data freshness, and quality, implementing proactive alerting and issue resolution
  • Translate business and analytical needs into well-structured data models and technical solutions
  • Automate data workflows to minimise manual processes and improve operational reliability
  • Maintain clear documentation of pipelines, datasets, and data flows to support collaboration and smooth handovers
  • Stay current with data engineering best practices, particularly within the Microsoft technology stack

Skills & Experience

  • 5+ years’ experience working as a Data Engineer or in a similar role
  • Proven experience with cloud-based data platforms (Azure, AWS, SQL, Snowflake, Springserv); Microsoft Fabric experience is a strong plus
  • Strong proficiency in Spark SQL and PySpark, including complex transformations
  • Experience building ETL/ELT pipelines using tools such as Azure Data Factory or equivalent
  • Ability to write efficient, reusable scripts for transformation, validation, and automation
  • Hands-on experience integrating data from APIs (REST, JSON), including automated data collection
  • Solid understanding of data modelling best practices for analytics and dashboards
  • Confidence working with large, complex datasets across multiple formats (CSV, JSON, Parquet, databases, APIs)
  • Strong problem-solving skills and the ability to diagnose and resolve data issues
  • Excellent communication skills and experience working with cross-functional teams
  • Genuine curiosity about how data drives content performance, audience behaviour, and monetisation

If this sounds like the role for you then please apply today.
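
For a flavour of the API-ingestion requirement above, here is a minimal sketch that parses a JSON payload of ad-impression events into analytics-ready rows; the payload shape is invented for illustration:

```python
import json

# Hypothetical JSON payload of the kind a REST advertising API might return.
payload = json.dumps({"events": [
    {"channel": "news", "impressions": 1200},
    {"channel": "drama", "impressions": 800},
]})

# Flatten nested JSON into tabular rows ready for a warehouse load.
rows = [(e["channel"], e["impressions"]) for e in json.loads(payload)["events"]]
total = sum(i for _, i in rows)
print(total)  # 2000
```

At scale the same flattening would be expressed in PySpark or Spark SQL so it runs distributed over the full event volume, but the row-shaping logic is identical.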

Data Analyst
FERROVIAL CONSTRUCTION (UK) LIMITED
London
In office
Mid - Senior
Private salary
TECH-AGNOSTIC ROLE

The Data Analyst plays a key role on a large-scale infrastructure project, focusing on the development and ongoing maintenance of the project’s connected digital environment. The role involves analysing data to support decision-making and ensure project objectives are met.

You will work closely with information management and project controls teams, using data to improve project efficiency and support digital transformation initiatives.

You will join the FBRS (Ferrovial BAM Joint Venture) Information Management Team (IM), where your responsibilities will include ensuring systems integration, designing data modelling processes, and developing algorithms and predictive models to extract the data required by the project. You will also collaborate with teams across the project to support data analysis and share insights.

Candidates need to demonstrate outstanding attention to detail, self-motivation, and the ability to take initiative. They should also have strong Power BI expertise and experience using FME for data integration.

Key Responsibilities:

  • Collect, process, and analyse construction project data from multiple sources.
  • Support project teams with data quality checks.
  • Use FME to support information sharing and provide essential training on ETL tools (FME) to project teams.
  • Drive digital transformation by identifying and implementing process and workflow efficiency improvements.
  • Support the integration of project systems with internal and client platforms.
  • Work closely with digitalisation and project controls teams to ensure accurate data flow and project insights.
  • Analyse datasets to identify trends, patterns and actionable insights.
  • Create and maintain Power BI dashboards, visualisations, and reports for executive and project stakeholders.
  • Work closely with the client, RSA delivery team and Project Information Manager to ensure system stability and improvement.
  • Ensure the project complies with relevant legislation, project standards, and client requirements.

Key Skills and qualifications:

  • Strong organisational skills to manage multiple tasks, projects, and data streams effectively.
  • Ability to perform Quality Assurance checks according to project and industry standards.
  • Ability to coordinate and manage your own workload to support project delivery.
  • Familiarity with BIM, Python/R and UK construction data standards.
  • Familiarity with ETL tools like FME and GIS integrations.
  • Strong communication, stakeholder engagement, and problem-solving skills.
  • Experience in large infrastructure projects.

Location: London

Please note that this job description does not represent a comprehensive list of activities and employees may be requested to undertake other reasonable duties.

The Ferrovial BAM Joint Venture (FBJV) has a successful history of delivering critical infrastructure for the UK on time and to budget together in joint venture partnership. They first worked together in 2010 as BFK, delivering three Crossrail contracts, including the longest stretch of tunnelling works between Royal Oak and Farringdon and Farringdon Station, the first central station to be completed on the Elizabeth Line. The team is also delivering the Silvertown Tunnel project together in East London and has been delivering excellence at each stage of HS2, such as Fusion JV for the Enabling Works packages, EKFB for the central Main Works Contract and now delivering the track infrastructure across the entire HS2 route.

Seize the challenge. Move the world together! Innovative, creative, respectful, and diverse are some of the ways we describe ourselves. We are motivated by challenges, and we collaborate across our business units to move the world together. Your journey to a fulfilling career starts here!

Ferrovial is an equal opportunity employer. We treat all job applications equally, regardless of gender, color, race, ethnicity, religion, national origin, age, disability, pregnancy, sexual orientation, gender identity and expression, covered veteran status or protected genetic information (each, a "Protected Class"), or any other protected class in accordance with applicable laws.

Data Engineer - TV Advertising Data (FAST)
Datatech
London
Hybrid
Mid - Senior
£75,000 - £85,000

Location: London - 3 days onsite
Salary: £75,000 - £85,000 (negotiable DOE)
Reference : J13057

Note: Full and current UK working rights required for this role

We’re currently seeking a Data Engineer to build the foundations behind rapidly growing FAST (Free Ad-Supported Streaming TV) channels. This is a pioneering opportunity to be involved with direct-to-consumer advertising for a global player in the field. We’re looking for someone who is passionate about how data drives the industry and who can help optimise campaigns, measure performance, and monetise content.

Key Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines that transform raw data into reliable, analytics-ready datasets
Ingest, integrate, and manage new data sources across advertising, audience, platform, and content data within Fremantle’s Microsoft Fabric environment
Deliver robust data flows that underpin global FAST dashboards, monetisation insights, and audience viewing metrics
Work closely with the central Data & Analytics team to enable high-quality Power BI reporting and analysis
Ensure strong data governance, integrity, and security across the Azure/Fabric ecosystem
Optimise data pipelines for performance, scalability, and efficiency, following best-practice engineering standards including version control and code reviews
Monitor pipeline health, data freshness, and quality, implementing proactive alerting and issue resolution
Translate business and analytical needs into well-structured data models and technical solutions
Automate data workflows to minimise manual processes and improve operational reliability
Maintain clear documentation of pipelines, datasets, and data flows to support collaboration and smooth handovers
Stay current with data engineering best practices, particularly within the Microsoft technology stack

Skills & Experience
5+ years’ experience working as a Data Engineer or in a similar role
Proven experience with cloud-based data platforms (Azure, AWS, SQL, Snowflake, Springserv); Microsoft Fabric experience is a strong plus
Strong proficiency in Spark SQL and PySpark, including complex transformations
Experience building ETL/ELT pipelines using tools such as Azure Data Factory or equivalent
Ability to write efficient, reusable scripts for transformation, validation, and automation
Hands-on experience integrating data from APIs (REST, JSON), including automated data collection
Solid understanding of data modelling best practices for analytics and dashboards
Confidence working with large, complex datasets across multiple formats (CSV, JSON, Parquet, databases, APIs)
Strong problem-solving skills and the ability to diagnose and resolve data issues
Excellent communication skills and experience working with cross-functional teams
Genuine curiosity about how data drives content performance, audience behaviour, and monetisation
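The "efficient, reusable scripts for transformation, validation, and automation" called for above can be sketched in a few lines of stdlib-only Python. The payload shape and field names below are invented for illustration, not the actual advertising schema:

```python
import json

# Hypothetical raw API payload (field names are illustrative only)
raw = json.loads("""
[
  {"campaign_id": "C1", "impressions": "1200", "revenue": "34.50", "date": "2024-01-01"},
  {"campaign_id": "C2", "impressions": "n/a",  "revenue": "12.00", "date": "2024-01-01"},
  {"campaign_id": "C1", "impressions": "800",  "revenue": "21.10", "date": "2024-01-02"}
]
""")

def validate(record):
    """Return a cleaned record, or None if it fails basic quality checks."""
    try:
        return {
            "campaign_id": record["campaign_id"],
            "impressions": int(record["impressions"]),
            "revenue": float(record["revenue"]),
            "date": record["date"],
        }
    except (KeyError, ValueError):
        return None  # quarantine malformed rows rather than failing the run

def transform(records):
    """Aggregate validated rows into an analytics-ready summary per campaign."""
    clean = [r for r in (validate(r) for r in records) if r is not None]
    summary = {}
    for r in clean:
        s = summary.setdefault(r["campaign_id"], {"impressions": 0, "revenue": 0.0})
        s["impressions"] += r["impressions"]
        s["revenue"] += r["revenue"]
    return summary

print(transform(raw))  # C2's "n/a" impressions row is dropped by validation
```

In a real pipeline the same validate/transform split lets the quarantine step feed a data-quality alert rather than silently losing rows.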

If this sounds like the role for you then please apply today!

Principal Data Engineer (GCP)
VIQU Ltd
London
Hybrid
Senior
£85,000 - £100,000

Northwest - Hybrid
Up to £100,000

VIQU are seeking a Principal Data Engineer to join a leading social enterprise that reinvests profits to create thriving, sustainable communities. Following a full transition to a 100% cloud-based data platform, this role will play a key part in shaping and leading the organisation’s data engineering capability, with a strong focus on technical leadership, platform design and mentoring engineers within a Google Cloud environment.

Key Responsibilities of the Principal Data Engineer:

  • Provide technical leadership, coaching and mentoring to data engineers
  • Define how data engineering projects are approached, delivered and governed
  • Work closely with architects to design and evolve the cloud data platform
  • Ensure robust, scalable and compliant data pipelines across the platform
  • Lead best practices around data ingestion, transformation, quality, security and monitoring
  • Drive automation, CI/CD adoption and continuous improvement
  • Challenge technical decisions and raise engineering standards across the team

Key Requirements of the Principal Data Engineer:

  • Extensive experience in Data Engineering, with time spent in a Lead or Principal role
  • Strong background working in a cloud-native data platform
  • Hands-on experience with Google Cloud Platform is highly preferred; candidates with AWS or Azure experience alongside some GCP exposure will also be considered
  • Experience with Terraform, Docker and dbt
  • Strong expertise in Data Lake/Data Warehouse solutions
  • Advanced SQL and Python skills
  • Proven experience optimising complex queries
  • Strong understanding of Data Governance, including lineage, data legislation and PII
  • Experience working within Agile/Scrum and the SDLC
  • Willingness to undergo a DBS check
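The "optimising complex queries" requirement above can be illustrated with a small sketch; SQLite stands in for the production warehouse here, and the table and index names are made up for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, ts TEXT)")
cur.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "click", f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = 42 AND event_type = 'click'"

# Before: the planner has no index to use, so it scans the whole table
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)

# A composite index matching the predicate lets the planner seek instead of scan
cur.execute("CREATE INDEX idx_events_user_type ON events (user_id, event_type)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)
```

The same habit — reading the plan before and after a change — carries over to BigQuery or any warehouse, even though the plan format differs.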

Apply now to speak with VIQU IT in confidence. Or reach out to Katie Dark via the VIQU IT website.

Do you know someone great? We’ll thank you with up to £1,000 if your referral is successful (terms apply).

Data Modeller (Specialty Insurance)
Consol Partners
London
Hybrid
Mid - Senior
£35

Data Modeller (Specialty Insurance)

We are looking for an experienced Data Modeller to support a key partner-level engagement with our client.

Role Overview:
* Translate business requirements into high-quality conceptual, logical, and physical data models.
* Work closely with business stakeholders and technology teams to ensure models meet strategic and operational needs.
* Support the design and delivery of data solutions.

Sector Context: Specialty insurance refers to non-standard, complex or bespoke insurance products, covering areas such as marine, aviation, cyber, or political risk. Data in this domain is typically detailed and varied, with specific underwriting and regulatory requirements. Effective data modelling here means structuring complex policy terms, multi-party exposures, and claims information in a clear, usable way.
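As a rough illustration of what "structuring complex policy terms, multi-party exposures, and claims information" can look like at the physical layer, here is a toy schema built with SQLite; every table and column name is an assumption for the example, not the client's actual model:

```python
import sqlite3

# Illustrative physical model for a specialty policy (names are invented)
ddl = """
CREATE TABLE policy (
    policy_id      INTEGER PRIMARY KEY,
    product_line   TEXT NOT NULL,          -- e.g. 'marine', 'cyber'
    inception_date TEXT NOT NULL,
    expiry_date    TEXT NOT NULL
);
CREATE TABLE party (
    party_id INTEGER PRIMARY KEY,
    name     TEXT NOT NULL,
    role     TEXT NOT NULL                 -- 'insured', 'broker', 'reinsurer'
);
-- Many-to-many: specialty risks routinely involve multiple parties per policy
CREATE TABLE policy_party (
    policy_id INTEGER REFERENCES policy(policy_id),
    party_id  INTEGER REFERENCES party(party_id),
    share_pct REAL NOT NULL,               -- each party's exposure share
    PRIMARY KEY (policy_id, party_id)
);
CREATE TABLE claim (
    claim_id        INTEGER PRIMARY KEY,
    policy_id       INTEGER REFERENCES policy(policy_id),
    loss_date       TEXT NOT NULL,
    incurred_amount REAL NOT NULL
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

The `policy_party` bridge table is the key move: it keeps multi-party exposures queryable without flattening shares into the policy row.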

Key Requirements:
* Proven experience in data modelling across conceptual, logical, and physical layers.
* Familiarity with specialty insurance or other complex insurance domains.
* Strong Snowflake experience.
* Very clear and confident communication skills-must be able to work directly with senior stakeholders and explain models and requirements effectively.
* Must be ready to start as soon as possible

Location:
* London client site, 2 days per week onsite.

Data Scientist
Randstad Technologies Recruitment
London
Remote or hybrid
Mid - Senior
£479/day - £565/day

Data Scientist: Country Risk & Advanced Analytics

Join an integrated team of economists, political scientists, and computer scientists to shape the strategic decisions of the world’s leading organizations.

How You’ll Make an Impact:

  • Innovate: Prototype new approaches for extracting insights from structured and unstructured data.
  • Build: Design and optimize risk models for analytics and generative AI applications using proprietary NLP data.
  • Collaborate: Partner with cross-domain experts to turn non-technical ideas into scalable, interpretable research designs.
  • Deploy: Develop and maintain robust ML pipelines for both experimentation and production.
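To give a flavour of "extracting insights from unstructured data", here is a minimal from-scratch TF-IDF sketch; the toy corpus is invented, and production work would use proper NLP tooling rather than whitespace tokenisation:

```python
import math
from collections import Counter

# Toy corpus standing in for proprietary NLP data
docs = [
    "sanctions risk in the shipping sector",
    "election risk and political transition",
    "shipping delays and port congestion",
]
tokenized = [d.split() for d in docs]
n_docs = len(tokenized)
# Document frequency: how many documents contain each term
df = Counter(term for doc in tokenized for term in set(doc))

def tfidf(doc):
    """Score each term by frequency in this doc, discounted by corpus-wide frequency."""
    tf = Counter(doc)
    return {t: (tf[t] / len(doc)) * math.log(n_docs / df[t]) for t in tf}

# Terms unique to a document score highest, surfacing its distinctive topic
top_term = max(tfidf(tokenized[0]).items(), key=lambda kv: kv[1])[0]
print(top_term)  # → sanctions
```

Shared terms like "risk" and "shipping" are discounted because they appear in multiple documents, which is exactly the interpretability property that makes designs like this easy to explain to non-technical stakeholders.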

Who You Are:

  • Technical Expert: You have substantial experience with Python or R, and are skilled in querying and analyzing big data.
  • NLP Specialist: You have a proven track record of developing and refining NLP models.
  • Clear Communicator: You can explain complex ML/NLP methodologies to non-technical stakeholders with ease.
  • Methodical: You are familiar with experiment tracking (DVC, Weights & Biases) and model evaluation metrics.

Stand Out From the Crowd: Candidates with an advanced degree in ML/NLP, exposure to cloud platforms (AWS, Databricks, Snowflake), or experience in agile, fast-paced environments are highly encouraged to apply or to share an updated CV (email removed).

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Data Scientist
Spectrum IT Recruitment
London
Hybrid
Mid - Senior
£60,000 - £70,000

Our client is looking for an experienced Data Scientist to design, build, and optimise machine learning models and advanced analytics solutions that support institutional priorities across a large, complex network. The role blends hands-on data science with strategic impact, using AWS technologies to deliver predictive insights that drive proactive interventions and data-driven decision-making. This is a hybrid role with the expectation of working 2 days per week in the London office.

Skills and experience required:

  • Bachelor’s degree in Data Science, Statistics, Computer Science, Mathematics, or a similar field
  • Experience delivering predictive analytics or machine learning solutions
  • Strong skills in Python, SQL, and ML libraries (e.g. scikit-learn, XGBoost, PyTorch, TensorFlow)
  • Hands-on experience with AWS ML services (SageMaker, Lambda, Redshift)
  • Ability to clearly communicate insights to non-technical stakeholders
  • Strong analytical thinking, collaboration skills, and a results-driven mindset

Role responsibilities:

  • Build, tune, and maintain predictive and ML models using AWS SageMaker
  • Analyse large datasets and perform feature engineering to improve model performance
  • Run experiments, test hypotheses, and optimise models for accuracy and value
  • Monitor model performance and manage retraining over time
  • Collaborate with Data Engineers, BI Developers, and Analysts to integrate outputs into dashboards and reports
  • Partner with academic, operational, and IT stakeholders to translate insights into action
  • Document models and support knowledge sharing and scalability
  • Contribute to the expansion of predictive analytics into advanced ML/AI use cases
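The "monitor model performance and manage retraining" responsibility above can be sketched in a few lines of Python; the tolerance threshold and the hard-coded window below are illustrative assumptions, not the client's policy:

```python
# Minimal model-monitoring sketch with a retraining trigger
def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def needs_retraining(baseline_acc, recent_preds, recent_labels, tolerance=0.05):
    """Flag the model for retraining when live accuracy drifts below baseline."""
    live_acc = accuracy(recent_preds, recent_labels)
    return live_acc < baseline_acc - tolerance, live_acc

# Baseline measured at deployment; a recent window simulating live traffic
flag, live = needs_retraining(
    baseline_acc=0.90,
    recent_preds=[1, 0, 1, 1, 0, 1, 0, 0, 1, 1],
    recent_labels=[1, 0, 0, 1, 0, 1, 1, 0, 1, 0],
)
print(flag, live)  # → True 0.7
```

In an AWS setting this kind of check would typically run on a schedule (e.g. a Lambda reading SageMaker Model Monitor output), with the flag driving an automated retraining job.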

Spectrum IT Recruitment (South) Limited is acting as an Employment Agency in relation to this vacancy.

BI Data Analyst
Syntax Consultancy Ltd
London
Hybrid
Mid - Senior
£450/day - £475/day

Croydon (Hybrid)

6 Month Contract

£450 - £475/day (Outside IR35)

BI Data Analyst needed with both active SC Clearance and NPPV3 Security Clearance. 6 Month Contract based in Croydon (Hybrid).

Paying up to £475/day (Outside IR35). Start ASAP in Feb/March 2026.

Hybrid Working - 3 days/week remote (WFH), and 2 days/week working on-site in the Croydon office, plus occasional travel to the Birmingham office.

A chance to work with a leading global IT transformation business specialising in delivering large-scale Government / Public Sector projects.

Key experience + tasks will include:

BI Data Analyst needed to transform raw data into meaningful insights, create performance dashboards + support strategic decision-making.

In-depth experience of Power BI, Tableau + ETL tools for reporting dashboards + data visualisation.

Strong SQL + advanced MS Excel skills for data extraction, analysis + modelling.

Experience of AWS Cost Management tools including: Cost Explorer, Budgets, CUR, tagging strategies.

Analysing AWS cloud spend, producing AWS Cloud cost reporting, forecasting, usage analysis, cost insights + expenditure tracking.

Able to analyse + interpret complex datasets and translate them into meaningful insights and reports.

Creating programme-level dashboards, reports + performance packs for key stakeholders and governance boards.

Analysing delivery progress, risks, dependencies + KPI metrics to support sound decision-making.

Producing programme level reporting packs, KPI dashboards + performance summaries.

Providing ‘deep dive’ analysis on delivery performance, operational trends + financial impacts.

Understanding of AWS Cloud architecture concepts including: EC2, Lambda, VPC, S3, RDS, CloudWatch.

Government / Public Sector / (url removed) transformation project experience preferred, especially Cloud migration.

Lead Data Engineer
83zero Ltd
London
In office
Senior
£90,000 - £100,000

Lead Data Engineer (with Data Analytics Background)

Location: City Of London

Employment Type: Full-time

Salary: £90,000 - £100,000

Sector: Fintech / Payments

Overview

We are looking for a highly skilled Lead Data Engineer with a strong foundation in data analytics to join a growing team. The ideal candidate will have previously worked as a Data Analyst and since transitioned into a more engineering-focused role. You’ll help us scale our data infrastructure, design and build robust data models, and contribute directly to our data platform’s evolution.

This is a hands-on role where you’ll be expected to hit the ground running, contribute to ongoing projects with minimal hand-holding, and help us maintain (and improve) the current team’s velocity.

Key Responsibilities

  • Design, develop, and maintain data models to support analytical and operational use cases.
  • Write efficient, production-grade SQL to build data pipelines and transformations.
  • Develop and maintain data workflows and automation scripts in Python.
  • Collaborate with analysts, engineers, and stakeholders to deliver high-quality data solutions.
  • (Optional but highly valued) Contribute to our infrastructure as code efforts using tools like Terraform.
  • Work with modern data warehousing technologies such as Snowflake to ensure scalable and high-performing solutions.

Skills & Experience

  • 5+ years of experience in data roles, ideally transitioning from Data Analyst to Data Engineer.
  • Proven expertise in SQL and building complex data models.
  • Strong proficiency in Python for data processing, ETL, and workflow automation.
  • Experience with cloud data platforms (Snowflake experience highly desirable).
  • Exposure to or experience with Terraform or similar infrastructure-as-code tools is a strong plus.
  • Comfortable working in fast-paced environments and able to contribute quickly without extensive onboarding.

Nice to Have

  • Experience with modern data stack tools (e.g., dbt, Airflow).
  • Understanding of CI/CD pipelines and data infrastructure automation.
  • Familiarity with data governance, security, and best practices in a cloud environment.

Data Modeller
Cathcart Technology
Watford
Hybrid
Mid - Senior
£550/day - £600/day

Contract Data Modeller (Finance & HR)
Watford (Hybrid - 2-3 days onsite)
£550 - £600 per day - Outside IR35
3-month contract

A global organisation based in Watford is looking for an experienced Contract Data Modeller to deliver foundational Finance and HR data models.

This is a pure data modelling role - focused on structure, consistency, and correctness of enterprise data (no reporting or analytics delivery).

Key Responsibilities

  • Design and own global canonical data models for Finance and HR
  • Integrate data from multiple source systems (ERP, HRIS, payroll, planning tools)
  • Define conformed dimensions, facts, hierarchies, and reference data
  • Resolve data inconsistencies across regions and legacy systems
  • Work closely with Finance and HR stakeholders to validate definitions
  • Produce clear, long-term data model documentation
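As an illustration of the "conformed dimensions, facts, hierarchies" responsibility above, here is a toy star-schema fragment built with SQLite; all names are assumptions for the example, not the organisation's canonical model:

```python
import sqlite3

# Illustrative star-schema fragment (names are invented)
ddl = """
-- Conformed dimension shared by Finance and HR facts
CREATE TABLE dim_entity (
    entity_key  INTEGER PRIMARY KEY,
    entity_code TEXT NOT NULL UNIQUE,      -- legal entity from ERP/HRIS
    region      TEXT NOT NULL
);
CREATE TABLE dim_cost_centre (
    cost_centre_key  INTEGER PRIMARY KEY,
    cost_centre_code TEXT NOT NULL UNIQUE,
    parent_code      TEXT                  -- hierarchy via self-reference
);
-- Finance fact: GL actuals keyed to the shared dimensions
CREATE TABLE fact_gl_actuals (
    entity_key      INTEGER REFERENCES dim_entity(entity_key),
    cost_centre_key INTEGER REFERENCES dim_cost_centre(cost_centre_key),
    period          TEXT NOT NULL,         -- e.g. '2024-03'
    amount          REAL NOT NULL
);
-- HR fact: headcount reuses the same dim_entity, so Finance and HR agree
CREATE TABLE fact_headcount (
    entity_key INTEGER REFERENCES dim_entity(entity_key),
    period     TEXT NOT NULL,
    headcount  INTEGER NOT NULL
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)
```

Because both fact tables join to the one `dim_entity`, a cost-per-head figure can be computed without reconciling two competing entity lists, which is the practical payoff of conformed dimensions.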

Required Experience

  • Strong background as a Data Modeller in enterprise environments
  • Deep Finance data modelling experience (GL, cost centres, entities, actuals, budgets, forecasts)
  • Solid HR data modelling exposure (employee, organisation, payroll, rewards)
  • Experience with global, multi-source data landscapes
  • Strong stakeholder communication skills

Hybrid working (2-3 days onsite), plenty of parking, competitive rate, and outside IR35.

If this interests you, please apply immediately and call Andy Weir at Cathcart Technology.

Cathcart Technology is acting as an Employment Business in relation to this vacancy.

Lead Data Engineer
Hays Technology
London
In office
Senior
Private salary

Your new company
London-based travel company

Your new role
You will be responsible for leading the design, implementation and management of a digital solution. There is a focus on modernising their systems, automation, and leveraging data to enhance decision-making.

What you’ll need to succeed

  • Experience with Microsoft Power Platform (Power Apps, Power Automate, Power BI, Power Virtual Agents)
  • Strong Data Engineering background - specifically with Azure Data Factory ETL, Matillion ETL and SQL
  • Experience designing and managing Snowflake Data Warehouse solutions
  • Strong experience with Qliksense, data modelling and self-service analytics
  • Experience in API Development

What you’ll get in return
An exciting opportunity to join an international organisation in financial services. Furthermore, a competitive day rate inside IR35 for this role will be offered in addition to your own dedicated Hays Consultant to guide you through every step of the application process.

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)

Frequently asked questions
Haystack features a wide range of Data Engineer roles in London, including positions in startups, established tech companies, financial institutions, and more, covering junior to senior levels.
Most London-based Data Engineer jobs require valid work authorization or visa sponsorship. Job listings typically specify these requirements to help you apply accordingly.
Yes, many employers offer remote or hybrid working options for Data Engineers in London. You can filter job listings based on work location preferences on Haystack.
Key skills include proficiency in SQL, Python, ETL tools, cloud platforms like AWS or Azure, and experience with big data technologies such as Hadoop or Spark.
You can browse listings, create a profile, upload your CV, and apply directly through the platform. Many listings also provide contact details for recruiters if you prefer direct communication.