
Data Engineer Jobs

Overview

Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Business Intelligence Developer
James Andrews Recruitment
Bristol
In office
Mid
£40k - £45k
TECH-AGNOSTIC ROLE
James Andrews Technology Portishead (Agile Working)
£45,000 per annum, 6-month FTC
Our client, a leading housing provider, is seeking a Business Intelligence Analyst/Developer to join their Data & Insight team, delivering a project-focused deployment of reporting across core business applications.
The Opportunity
You'll work with the Head of Data and Insight and project teams across Assets and Home Repairs, designing and developing reporting that's accurate, timely, and insightful. This is a hands-on BI development role with a real focus on stakeholder engagement and delivering impact for housing services.
Key Responsibilities
Develop and maintain transparent, standardised reporting solutions aligned to business needs
Design interactive dashboards in Power BI (or similar BI tools)
Work with stakeholders to translate data into actionable insight for Exec, Board, and managers
Collaborate with Data Management to resolve quality issues and ensure robust reporting
Support colleagues with training, advice, and best practice in BI reporting
Drive adoption of BI principles and contribute to Alliance's BI maturity model
Essential Requirements
2-3+ years' experience with Power BI/Tableau/Qlik
Strong MS SQL skills
Experience in Agile development environments
Knowledge of BI principles and reporting best practice
Excellent communication skills, with the ability to engage technical and non-technical users
Personable, values-driven, and able to build strong relationships
Desirable
Experience within housing associations or asset management services
Exposure to commercial insight functions (qualitative and quantitative analysis)
James Andrews is acting as an employment agency and business in relation to this role.
At James Andrews Recruitment Solutions we try to respond to all applications personally, however, due to the high volume of applications this is not always possible. If you have not heard back from us within 72 hours, please assume that your application has been unsuccessful on this occasion.
Don’t forget our recommendation scheme: Recommend a friend or colleague to us and receive up to £100 each once they have completed 20 days in a role via James Andrews! Terms and conditions apply, contact us for details.
Databricks Engineer
Experis
London
Hybrid
Mid
£425/day - £430/day
aws
github
delta-lake
gitlab
sql
vault
London - hybrid - 3 days per week on-site
6 Months +
UMBRELLA only - Inside IR35
Key Responsibilities
Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling (see the illustrative sketch after this list).
Build and manage data transformation workflows in DBT running on Databricks.
Optimize data models in Delta Lake for performance, scalability, and cost efficiency.
Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets.
Implement data quality checks (dbt tests, monitoring) and ensure governance standards.
Manage and monitor Databricks clusters & SQL Warehouses to support workloads.
Contribute to CI/CD practices for data pipelines (version control, testing, deployments).
Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges.
Document workflows, transformations, and data models for knowledge sharing.
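For candidates less familiar with this stack, here is a minimal, illustrative sketch of the Airflow-plus-dbt pattern described above. The DAG name, schedule, and dbt project paths are assumptions, not details of this role.

```python
# Illustrative only: DAG name, schedule, and dbt paths are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_databricks",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",           # daily at 06:00
    catchup=False,
) as dag:
    # Run dbt models against Databricks (dbt-databricks adapter configured in
    # profiles.yml), then run dbt tests as a basic data-quality gate.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --profiles-dir /opt/dbt --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --profiles-dir /opt/dbt --target prod",
    )

    dbt_run >> dbt_test
```

In practice the dbt project would point at a Databricks SQL Warehouse or job cluster via the dbt-databricks adapter, with tests acting as the quality gate before downstream consumption.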
Required Skills & Qualifications
3-6 years of experience as a Data Engineer (or similar).
Hands-on expertise with:
DBT (dbt-core, dbt-databricks adapter, testing & documentation).
Apache Airflow (DAG design, operators, scheduling, dependencies).
Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses).
Strong SQL skills and understanding of data modeling (Kimball, Data Vault, or similar).
Proficiency in Python for scripting and pipeline development.
Experience with CI/CD tools (e.g., GitHub Actions, GitLab CI, Azure DevOps).
Familiarity with cloud platforms (AWS, Azure, or GCP).
Strong problem-solving skills and ability to work in cross-functional teams.
All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
SSIS Developer
Experis
London
Hybrid
Mid
£430/day - £450/day
ssis
git
sql
ssas
London - Hybrid - 3 days on-site per week
6 months +
UMBRELLA only - Inside IR35
We are seeking an experienced SSIS Developer to design, develop, and maintain robust ETL processes using SQL Server Integration Services (SSIS). The ideal candidate will work closely with business analysts, data architects, and stakeholders to ensure reliable data integration, transformation, and delivery across enterprise systems.
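As a purely illustrative example of the validation work this role involves (server, database, and table names below are assumptions, not the client's), a simple row-count reconciliation between a staging table loaded by an SSIS package and its warehouse target might look like this:

```python
# Illustrative only: server, database, and table names are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=warehouse-server;DATABASE=DW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Compare row counts between a staging table loaded by an SSIS package and
# its warehouse target - a simple post-load reconciliation check.
staging_rows, target_rows = cursor.execute(
    """
    SELECT
        (SELECT COUNT(*) FROM staging.Orders) AS staging_rows,
        (SELECT COUNT(*) FROM dw.FactOrders)  AS target_rows;
    """
).fetchone()

if staging_rows != target_rows:
    # In a real package this would write to an audit table or fail the job.
    raise ValueError(f"Row count mismatch: staging={staging_rows}, target={target_rows}")

conn.close()
```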
Key Responsibilities
Design, develop, test, and deploy ETL packages using SSIS to extract, transform, and load data from multiple sources into data warehouses, data marts, or operational systems.
Optimize and troubleshoot existing SSIS packages for performance, scalability, and reliability.
Develop and maintain SQL queries, stored procedures, functions, and views to support ETL and reporting processes.
Implement error handling, logging, and auditing within SSIS packages.
Schedule and automate ETL jobs using SQL Server Agent or equivalent job schedulers.
Collaborate with data architects and business analysts to understand business requirements and translate them into technical solutions.
Perform data validation and quality checks to ensure integrity and accuracy across systems.
Provide support for production deployments, monitoring, and issue resolution.
Maintain documentation for ETL processes, data flows, and technical designs.
Required Qualifications
Proven hands-on experience with Microsoft SQL Server Integration Services (SSIS).
Strong knowledge of T-SQL (queries, stored procedures, functions, indexing, performance tuning).
Experience working with relational databases (SQL Server, Oracle, etc.) and flat file/CSV/XML/JSON data sources.
Strong understanding of ETL design patterns, data warehousing concepts.
Experience with SQL Server Agent or other scheduling tools.
Strong troubleshooting and problem-solving skills.
Preferred Skills
Familiarity with Azure Data Factory (ADF), SSRS, SSAS, or Power BI.
Knowledge of data governance, security, and compliance practices.
Exposure to DevOps, CI/CD pipelines, and version control tools (e.g., Git, Azure DevOps).
Experience working in Agile/Scrum environments.
All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
Data Reporting Analyst
ACS Business Performance Ltd
Crawley
Hybrid
Mid
Private salary
python
sql
Data Reporting Analyst - Crawley (Hybrid, 3 Days in Office)
Sector: Technology / Manufacturing
Contract: Permanent, Hybrid (3 days in-office, 2 remote)
We’re recruiting for a Data Reporting Analyst to join a well-established technology manufacturing company based in Crawley. This is a hybrid role focused on delivering accurate and insightful data reporting across the business, supporting decision-making at all levels. Ideal for someone with strong SQL, Power BI, and Excel skills who enjoys working with complex data sets and engaging with stakeholders.
Key Responsibilities:
Work with internal stakeholders to gather reporting requirements and deliver tailored reports.
Manage and manipulate large, complex data sets from multiple sources.
Develop and maintain dashboards to support data quality improvements.
Support reporting and analytics delivery through corporate intranet tools.
Perform ETL (Extract, Transform, Load) processes on various data sources (see the illustrative sketch after this list).
Collaborate with the Applications team to ensure accurate information delivery.
Provide 1st line application/reporting support to internal users.
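By way of illustration only - the connection string, tables, and columns below are invented, not taken from this advert - a minimal extract-transform-load step of the kind referenced above could look like this in Python with pandas:

```python
# Illustrative only: connection string, tables, and columns are assumptions.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://reporting-server/SalesDB?driver=ODBC+Driver+17+for+SQL+Server"
)

# Extract: pull order lines from a relational source.
orders = pd.read_sql(
    "SELECT order_id, order_date, quantity, unit_price FROM dbo.Orders",
    engine,
    parse_dates=["order_date"],
)

# Transform: derive revenue and aggregate it by calendar month.
orders["revenue"] = orders["quantity"] * orders["unit_price"]
monthly = (
    orders.groupby(orders["order_date"].dt.to_period("M"))["revenue"]
          .sum()
          .reset_index()
)
monthly["order_date"] = monthly["order_date"].astype(str)

# Load: write the summary to a reporting table for Power BI to consume.
monthly.to_sql("MonthlyRevenue", engine, schema="reporting", if_exists="replace", index=False)
```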
Key Skills & Experience:
4+ years in data reporting or analytics.
Strong SQL skills for querying relational databases.
Proficient in Microsoft Power BI and advanced Excel.
Excellent analytical thinking and attention to detail.
Confident communicator - able to explain complex data to non-technical stakeholders.
Self-starter - comfortable working independently and proactively seeking data.
Desirable (Not Essential):
Experience with SAP Business One, Jira/Confluence.
Programming exposure (e.g., Python).
Experience in a commercial IT or tech manufacturing environment.
Additional languages, especially Mandarin, Italian, or Japanese.
This is a great opportunity for someone who thrives in a data-driven role and enjoys translating complex datasets into clear business insights.
ACS are recruiting for a Data Reporting Analyst. If you feel that you have the skills and experience required in this advertisement, submit your CV including an outline of your experience as a Data Reporting Analyst. Including a covering letter outlining your relevant experience will enhance your chances of selection and improve your prospects of landing the Data Reporting Analyst role you desire.
GIS Data Specialist
Bowerford Associates
Multiple locations
Remote or hybrid
Mid
£40k - £45k
python
postgresql
sql
postgis
We are searching for an experienced GIS Data Specialist / Geospatial Data Specialist for an extremely exciting, technology-focused business - someone with extensive experience of working with and managing geospatial data and geodata.
This role is offered on a hybrid or remote working basis. You MUST, however, be able to attend team meetings at offices located in either Exeter or Reading twice per month.
Ideally you will need to live within a commutable distance of one of the above offices, either Reading or Exeter.
We are searching for someone who is very passionate about data, automation, and driving change through smart processes. You will be leading the way with regards to transforming how data is managed, processed and delivered across our client’s business.
You will take ownership of complex ETL workflows, develop automation tools, and ensure data is accurate, delivered in a timely fashion, and fit for purpose.
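As a purely illustrative sketch of typical geospatial data work (the schema, table names, and boundary value are assumptions, not the client's), a PostGIS query run from Python might look like this:

```python
# Illustrative only: database, schema, and table names are assumptions.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="geodata", user="etl_user")
cur = conn.cursor()

# Select assets that fall within a named boundary and reproject their
# geometry to British National Grid (EPSG:27700) for onward delivery.
cur.execute(
    """
    SELECT a.asset_id,
           ST_AsText(ST_Transform(a.geom, 27700)) AS geom_bng
    FROM   assets a
    JOIN   boundaries b ON ST_Within(a.geom, b.geom)
    WHERE  b.name = %s;
    """,
    ("Exeter",),
)
for asset_id, geom_bng in cur.fetchall():
    print(asset_id, geom_bng)

cur.close()
conn.close()
```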
Key Duties: -
Lead and maintain ETL processes, improving automation and data quality.
Use tools like Power BI / Excel or similar to report on team metrics and performance.
Support data integration and provisioning across products and services.
Collaborate with internal teams and suppliers to resolve data issues and enhance workflows.
Manage bespoke data requests and provide technical support to internal teams.
Ensure metadata accuracy and drive improvements in data governance.
Mentor more junior staff and contribute to team KPIs and continuous improvement.
About You: -
You will have a qualification in GIS or a data-related discipline, or you will have significant equivalent professional/commercial experience.
Ideally you will have at least 5 years' commercial experience in data analysis or data curation.
Experience of Geospatial Data and/or Geodata is 100% required for this role.
You will be confident in designing and optimizing ETL/ELT processes, ideally using tools such as FME Form and/or FME Flow.
Experience of Data Governance Best Practices.
Skilled with database technologies such as Oracle, SQL Server, PostgreSQL or PostGIS.
Hands-on experience of SQL and/or Python.
Experience with cloud-based data tools and storage is a real bonus but this is not a prerequisite for the role.
You will be detail-oriented, inquisitive, and thrive on solving complex problems and finding efficiencies in existing data pipelines.
Our client is proud to be an equal opportunities employer. They celebrate diversity and are committed to creating an inclusive environment for all employees.
Please note, to be considered for this role you MUST have the Right to Work in the UK long-term without Company Sponsorship.
Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010.
Bowerford Associates Ltd is acting as an Employment Agency in relation to this vacancy.
Test SME - Data Migration
Michael Page Technology
London
Hybrid
Mid
£600
aws
mysql
talend
sql
ssis
The Test SME - Data Migration will be responsible for supporting and executing the testing phase of data migration projects to ensure accuracy, integrity, and consistency of data. This role demands extensive experience in data migration execution, data migration testing, data management, and a deep understanding of banking systems.
Client Details
Our client is a leading global financial services organisation undergoing a major transformation across its EMEA operations. They pride themselves on strong governance, innovative data practices, and a commitment to building cutting-edge capabilities within their growing Data Office.
Description
Test Strategy Development: Contribute to developing and documenting a comprehensive test strategy for data migration activities, ensuring data accuracy, consistency, integrity and performance. Collaborate with project managers, data migration teams, and stakeholders to align the test strategy with project objectives.
Test Planning and Execution: Design and execute data migration testing, including functional, system, and acceptance testing. Identify and document data migration risks and ensure mitigation strategies are in place. Contribute to defect tracking and resolution processes, ensuring all issues are addressed in a timely manner.
Data Management and Quality Assurance: Ensure data quality standards are adhered to during the migration process. Utilise data profiling tools to identify and rectify data anomalies and inconsistencies.
Team membership and Coordination: Adhere to best practices and standards. Foster a collaborative environment, ensuring clear communication among team members and stakeholders.
Reporting and Documentation: Generate detailed reports on testing activities, including defects discovered, resolution status, and recommendations for improvements. Maintain thorough documentation of all testing activities and findings for future reference and audits.
Stakeholder Engagement: Engage with stakeholders to ensure requirements are understood and met, and to report on testing progress and outcomes. Act as a liaison between technical teams and business stakeholders, ensuring alignment and understanding of data migration objectives and outcomes.
Data Lineage Identification and Data Mapping: map source data fields to target fields and validate transformation rules and logic (see the reconciliation sketch at the end of this section).
Data Quality Control, Collection and Cleansing: Identify and collect relevant data from multiple sources, including internal databases, external vendors, and APIs. Perform data cleaning, transformation, and standardization processes to ensure data accuracy and consistency.
Data Analysis, validation and Interpretation: Utilise tools to perform advanced data analytics, including data blending, analysis, and modelling. Collaborate with business stakeholders to understand their requirements and translate them into data analysis tasks.
Problem Solving and Testing: Technical skills to use data to support investigations and testing, understand how to get root cause of problems, working with SMEs and team members. Review and recommend improvements to processes, data, controls to support improving the Bank.
Continuous Improvement: Keep abreast of industry best practices and emerging technologies in data migration and testing. Recommend process improvements to enhance the efficiency and effectiveness of data migration and testing activities.
Test and validate - Semantic recognition, Data Rules and tool configuration: a number of our programmes of work have a configuration component. The ability to evaluate, test and improve these rules is key.
Work with operations teams to approve setup of reference/ static data for testing and go-live
Coordinate with business process stakeholders to ensure that the business context and impacts are considered
Post-Migration Review: Conduct a post-migration review to identify any lessons learned and areas for improvement in future migrations
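For illustration only - the file names, key, and columns below are invented - the field-level reconciliation between source and target systems described above often reduces to a keyed comparison such as this:

```python
# Illustrative only: file names, keys, and columns are invented.
import pandas as pd

source = pd.read_csv("source_accounts.csv")   # extract from the legacy system
target = pd.read_csv("target_accounts.csv")   # extract from the migrated system

# Full outer join on the business key so that gaps on either side are visible.
merged = source.merge(
    target, on="account_id", how="outer", suffixes=("_src", "_tgt"), indicator=True
)

missing_in_target = merged[merged["_merge"] == "left_only"]
unexpected_in_target = merged[merged["_merge"] == "right_only"]
balance_mismatch = merged[
    (merged["_merge"] == "both") & (merged["balance_src"] != merged["balance_tgt"])
]

print(f"Missing in target:    {len(missing_in_target)}")
print(f"Unexpected in target: {len(unexpected_in_target)}")
print(f"Balance mismatches:   {len(balance_mismatch)}")
```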
Profile
Experience in:
Data Migration execution and Testing
The use of any Data Migration Test tools - Datagaps ETL Validator, Testing Xperts, Tx tools, Dextrus, Azure Document DB, Beyond Compare, Informix, Integrate.io, Excel
Rule writing and testing
Working with Large and Small Data
Data Management
Proficiency in using ETL tools - Talend, Informatica, SSIS and with Migration tools - AWS DMS, Azure Data Migration Service
Strong SQL skills and familiarity with database technologies - Oracle, SQL Server, MySQL
Principles of Data Testing, Data Governance and Data Quality
Data Remediation
Data Security and Handling within a Bank.
Documentation and Requirements Gathering
The ability to solve problems, develop options and champion change
The ability to identify, manage and maintain Risk (RAID) Logs
The ability to define and realise benefits
You must be able to work on multiple projects at one time, and be able to organise and prioritise work
Job Offer
Rate: £600/day
IR35: Inside IR35
Location: London - 2 to 3 days per week onsite
Contract: Until February 2026
Start: ASAP (flexible for the right candidate)
Senior Data Engineer Databricks SQL Azure
Client Server Ltd.
Nottingham
Fully remote
Senior
£55k - £65k
sql
Senior Data Engineer (Databricks SQL Azure) Nottingham / WFH to £65k
Opportunity to progress your career in a senior, hands-on Data Engineer role at a SaaS tech company.
As a Senior Data Engineer you'll join a newly formed team that delivers customer-facing reporting on big data sets, processing 120 billion lines of data per day. You'll be primarily working with advanced SQL and Databricks in Azure, including data modelling and low-level data design work.
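As an illustrative sketch of the kind of advanced SQL work involved (the table and columns are assumptions, not the client's schema), a window-function query over a Delta table in Databricks might look like this:

```python
# Illustrative only: table and column names are assumptions, not the client's schema.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Keep the most recent usage record per customer using a window function,
# then persist the result as a Delta table for customer-facing reporting.
latest_per_customer = spark.sql("""
    SELECT customer_id, report_date, usage_units
    FROM (
        SELECT customer_id,
               report_date,
               usage_units,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY report_date DESC) AS rn
        FROM reporting.daily_usage
    ) ranked
    WHERE rn = 1
""")

latest_per_customer.write.format("delta").mode("overwrite").saveAsTable("reporting.latest_usage")
```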
As a senior member of the team you’ll also contribute to technical discussions, strategic decision making and help to mentor more mid-level data engineers.
Location / WFH: There's a remote interview and onboarding process and you'll be able to work from home most of the time, meeting up with the team for constructive meetings once a month / quarter in the Nottingham office.
About you:
You have advanced SQL and Databricks experience
You have experience in cloud based environments, Azure preferred
You have strong analysis and problem solving skills
You have experience of working in Agile development environments
You’re collaborative with great communication skills
Ideally you will have some experience within an accountancy or finance environment
What’s in it for you:
As a Senior Data Engineer (Databricks SQL) you will earn a competitive salary plus a range of benefits:
Salary to £65k
25 days holiday
Vitality health insurance
5% non-contributory pension
Death in Service
Travel allowance to the Nottingham office
Apply now to find out more about this Senior Data Engineer (Databricks SQL Azure) opportunity.
At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We’re an equal opportunities employer whose people come from all walks of life and will never discriminate based on race, colour, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. The clients we work with share our values.
Data Architect - eDV Cleared
Searchability NS&D
London
In office
Mid
£65k - £100k
kafka
hadoop
DATA ARCHITECT - DV CLEARED
NEW PERMANENT JOB OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A DATA ARCHITECT WITH ENHANCED DV CLEARANCE
Permanent job opportunity for a Data Architect
Leading National Security SME
Salary up to £100,000 + Bonus
London based organisation in an easily accessible location
To apply please call or email
WHO WE ARE?
We are recruiting a Data Architect to support urgent National Security & Defence projects in London. Due to the nature of these projects you must hold enhanced DV Security Clearance.
WHAT WILL THE DATA ARCHITECT BE DOING?
You will be joining a leading SME who is working hard to support National Security projects within UK Govt. Departments in London. As part of a team, you will be responsible for designing and implementing Data Solutions in Mission-Critical areas.
WE NEED THE DATA ARCHITECT TO HAVE .
Current DV clearance - Enhanced
Good at understanding complexity and abstracting that into a form that is consumable for a non-technical audience.
Experience of achieving data interoperability between 2 or more systems or organisations.
Experience of developing data standards and the negotiation and collaboration that this requires.
Good stakeholder engagement skills.
Self-starter, who can work autonomously within parameters set.
Experience with Data Modelling
Knowledge of Data Standards and writing technical specifications
Experience designing: data lakes, data warehouses, data lakehouses, data pipelines, data meshes or data marketplaces.
TO BE CONSIDERED .
Please either apply by clicking online or email me directly; for further information, please call. I can make myself available outside of normal working hours to suit, from 7am until 10pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search Dominic Barbet. I look forward to hearing from you.
DATA ARCHITECT - DV CLEARED
KEY SKILLS:
BIG DATA DEVELOPER / BIG DATA ENGINEER / SENIOR BIG DATA DEVELOPER / SENIOR BIG DATA ENGINEER / DATA ENGINEER / DATA DEVELOPER / SENIOR SOFTWARE DEVELOPER / LEAD SOFTWARE ENGINEER / LEAD SOFTWARE DEVELOPER / DV CLEARED / DV CLEARANCE / DEVELOPED VETTING / DEVELOPED VETTED / DEEP VETTING / DEEP VETTED / SC CLEARED / SC CLEARANCE / SECURITY CLEARED / SECURITY CLEARANCE / NIFI / CLOUDERA / HADOOP / KAFKA / ELASTIC SEARCH
Contract Palantir Foundry Engineer - DV Cleared
Searchability NS&D
London
In office
Mid
£500 - £650
python
typescript
CONTRACT PALANTIR DATA ENGINEER - DV CLEARED
NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A HIGHLY SECURE DELIVERY PROGRAMME FOR A DV CLEARED PALANTIR FOUNDRY DATA ENGINEER.
Contract opportunity for a Palantir Data Engineer to support cutting-edge National Security projects
Up to £650 per day (Outside IR35) DV clearance is essential
Based full-time onsite in London
Initial 12-month contract, with strong potential for long-term extension
To apply, please email.
WHO WE ARE?
We are growing our customer delivery team off the back of major success in the secure consultancy space. As part of a key programme of work, we are looking to build an additional Palantir Data Engineering team to help deliver mission-impacting solutions within a fast-paced, secure environment.
WE NEED THE PALANTIR DATA ENGINEER TO HAVE
Proven experience building and deploying applications using Palantir Foundry
Strong software/data engineering background - Computer Science, Physics, Maths, Data Science.
Proficiency in Python and JavaScript/TypeScript
Experience solving technical problems involving cloud, infrastructure, data pipelines, or full-stack apps
Ability to design and manage Ontologies and build applications using Foundry tools like Workshop
Strong communication skills to collaborate with technical and military/non-technical stakeholders
Experience integrating data from multiple sources into Foundry via Pipeline Builder and custom code
Comfortable working on-site within a secure environment and at pace with end-users
TO BE CONSIDERED. Please either apply by clicking online or email me directly; for further information, please call. I'm happy to speak outside of normal working hours (7am-10pm). If unavailable, leave a message and I'll get back to you ASAP.
By applying for this role you give express consent for us to process and submit your application to our client in conjunction with this vacancy only.
KEY SKILLS: PALANTIR / FOUNDRY / DATA ENGINEER / PIPELINE BUILDER / ONTOLOGY / PYTHON / TYPESCRIPT / FULL STACK / DV CLEARED / LONDON / CONTRACT
Big Data Engineer - DV Cleared
Searchability NS&D
London
In office
Mid
£40k - £80k
ansible
kafka
jenkins
elasticsearch
docker
apache-nifi
BIG DATA ENGINEER - DV CLEARED
NEW PERMANENT JOB OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A BIG DATA ENGINEER WITH DV CLEARANCE
Permanent job opportunity for a Big Data Engineer
Leading National Security & Defence SME
Salary up to £80,000 plus clearance bonus
London based organisation in an easily accessible location
To apply please call or email
WHO WE ARE?
We are recruiting multiple Big Data Engineers to support urgent National Security & Defence projects in London. Due to the nature of these projects you must hold DV or enhanced DV Security Clearance.
WHAT WILL THE BIG DATA ENGINEER BE DOING?
You will be joining a leading SME who is working hard to support National Security projects within UK Govt. Departments in London. As part of a team, you will be responsible for implementing Big Data Solutions in Mission-Critical areas.
WE NEED THE BIG DATA ENGINEER TO HAVE .
Current DV clearance - Standard or Enhanced
Must have experience with big data tools such as Hadoop, Cloudera or Elasticsearch
Experience with Palantir Foundry is preferred but not essential
Experience working in an Agile Scrum environment
Experience in design, development, test and integration of software
IT WOULD BE NICE FOR THE BIG DATA ENGINEER TO HAVE .
Cloud based architectures
Microservice architecture or server-less architecture
Messaging / routing technologies such as Apache Nifi / RabbitMQ
Experience of DevSecOps automated deployment tools such as Jenkins, Ansible, Docker
TO BE CONSIDERED .
Please either apply by clicking online or email me directly; for further information, please call. I can make myself available outside of normal working hours to suit, from 7am until 10pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search Dominic Barbet. I look forward to hearing from you.
BIG DATA ENGINEER - DV CLEARED
KEY SKILLS:
BIG DATA DEVELOPER / BIG DATA ENGINEER / SENIOR BIG DATA DEVELOPER / SENIOR BIG DATA ENGINEER / DATA ENGINEER / DATA DEVELOPER / SENIOR SOFTWARE DEVELOPER / LEAD SOFTWARE ENGINEER / LEAD SOFTWARE DEVELOPER / DV CLEARED / DV CLEARANCE / DEVELOPED VETTING / DEVELOPED VETTED / DEEP VETTING / DEEP VETTED / SC CLEARED / SC CLEARANCE / SECURITY CLEARED / SECURITY CLEARANCE / NIFI / CLOUDERA / HADOOP / KAFKA / ELASTIC SEARCH
Data Scientist - eDV Cleared
Searchability NS&D
London
In office
Mid
£45k - £95k
processing-js
DATA SCIENTIST - eDV CLEARED
NEW PERMANENT OPPORTUNITY AVAILABLE FOR A DATA SCIENTIST IN LONDON WITH ENHANCED DV CLEARANCE
Permanent opportunity for a Data Scientist
Enhanced DV security clearance is required
Salary up to £95,000 + Clearance Bonus + Company Bonus
Central London based organisation in an accessible location
To apply please call / or email
WHO WE ARE?
We are recruiting a permanent Data Scientist to work with a leading technology and engineering company that does excellent work in the National Security sector across a range of exciting projects. Due to the nature of our clients, you must hold enhanced DV security clearance.
WHAT WILL THE DATA SCIENCE ENGINEER BE DOING?
As a Data Science Engineer, you will be working alongside our team on a wide variety of projects with a special focus on innovation and AI. You could be working with all types of data (both structured and unstructured) using a variety of software - so we’re open to seeing engineers with different technology backgrounds!
THE DATA SCIENCE ENGINEER SHOULD HAVE .
Strong and demonstrable technical background in data science
Demonstrable experience developing & delivering data science strategies
Experience with big data ecosystems, cloud infrastructure and analytic frameworks
Techniques and toolkits for data cleansing, data processing and data preparation
Any experience with techniques and toolkits for combining data or analysing data streams would be desirable
Current enhanced DV security clearance
TO BE CONSIDERED .
Please either apply by clicking online or email me directly; for further information, please call. I can make myself available outside of normal working hours to suit, from 7am until 10pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search Dominic Barbet. I look forward to hearing from you.
DATA SCIENTIST
KEY SKILLS:
PRINCIPAL DATA SCIENTIST / LEAD DATA SCIENTIST / MACHINE LEARNING / ML / NLP / NATURAL LANGUAGE PROCESSING / DEEP LEARNING / DATA SCIENCE / LSTM / LONG SHORT TERM MEMORY / CNN / CONVOLUTIONAL NEURAL NETWORKS / CLOUD INFRASTRUCTURE / ARTIFICIAL INTELLIGENCE / INNOVATION / SECURITY CLEARANCE / DV CLEARED / SC CLEARED / SC CLEARANCE / DV CLEARANCE / DEVELOPED VETTING / DEEP VETTING / DEVELOPED VETTED
Contract Palantir Foundry Data Engineer - DV Cleared
Searchability NS&D
London
Hybrid
Mid
£550 - £800
python
sql
Palantir Foundry Data Engineer - DV Cleared
NEW CONTRACT OPPORTUNITY FOR A PALANTIR FOUNDRY DATA ENGINEER TO WORK ON A NATIONAL SECURITY PROJECT IN LONDON WITH DV CLEARANCE
Contract role in London for a Palantir Foundry Data Engineer
Must hold DV Security Clearance
Central London based
Daily rate up to £800
Hybrid position
To apply, please email or call.
Who we are
We are seeking an experienced Palantir Foundry Data Engineer with current DV clearance to join a high-profile programme. This is a contract position offering hybrid working and a daily rate of up to £800.
In this role, you will be responsible for designing, developing, and optimising data pipelines and integrations within Palantir Foundry, ensuring data is efficiently processed, transformed, and made available for analysis and operational use. You will collaborate closely with analysts, data scientists, and business stakeholders to deliver robust, secure, and scalable data solutions.
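By way of illustration only, a minimal Foundry-style Python transform - assuming the transforms.api decorator interface, with placeholder dataset paths and column names - might look like this:

```python
# Illustrative only: dataset paths and column names are placeholders.
from transforms.api import transform_df, Input, Output
from pyspark.sql import functions as F


@transform_df(
    Output("/project/clean/events"),
    raw=Input("/project/raw/events"),
)
def clean_events(raw):
    # Standardise timestamps and drop duplicate event records before the
    # dataset is published for downstream analysis in Foundry.
    return (
        raw.withColumn("event_ts", F.to_timestamp("event_ts"))
           .dropDuplicates(["event_id"])
    )
```

Real pipelines on such a programme would typically combine transforms like this with Pipeline Builder flows and Ontology-backed applications, but the shape of the code is similar.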
What we’re looking for
Key Responsibilities:
Develop and maintain data pipelines and workflows in Palantir Foundry
Integrate diverse data sources, ensuring data quality and integrity
Optimise performance of data ingestion, transformation, and visualisation
Collaborate with stakeholders to define requirements and deliver solutions
Ensure security and compliance with DV-level clearance standards
Skills & Experience:
Current DV clearance (essential)
Proven experience working with Palantir Foundry in complex environments
Strong skills in data engineering, ETL processes, and data modelling
Proficiency in relevant programming/scripting languages (e.g. Python, SQL)
Experience working with large-scale datasets in secure environments
Strong problem-solving skills and stakeholder engagement abilities
TO BE CONSIDERED .
Please either apply by clicking online or email me directly; for further information, please call. I can make myself available outside of normal working hours to suit, from 7am until 10pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search Dominic Barbet. I look forward to hearing from you.
PALANTIR FOUNDRY DATA ENGINEER - DV CLEARED
Contract Data Engineer - eDV Cleared
Searchability NS&D
London
In office
Mid
£450 - £650
confluence
kafka
elasticsearch
jira
apache-nifi
rabbitmq
CONTRACT DATA ENGINEER - eDV CLEARED
NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A DATA ENGINEER WITH eDV CLEARANCE
Contract job opportunity for a Data Engineer
National Security client
Palantir Foundry
Outside IR35
Central London based organisation in an easily accessible location
To apply please call / or email
WHO WE ARE?
We are recruiting a contract Data Engineer to work with a National Security SME in central London. Due to the nature of the work, you must hold enhanced DV Clearance.
WE NEED THE DATA ENGINEER TO HAVE .
Current enhanced DV Security Clearance
Experience with big data tools such as Hadoop, Cloudera or Elasticsearch
Experience With Palantir Foundry
Experience working in an Agile Scrum environment with tools such as Confluence / Jira
Experience in design, development, test and integration of software
Willingness to learn new technologies
IT WOULD BE NICE FOR THE DATA ENGINEER TO HAVE .
Cloud based architectures
Microservice architecture or server-less architecture
Messaging / routing technologies such as Apache Nifi / RabbitMQ
TO BE CONSIDERED .
Please either apply by clicking online or email me directly; for further information, please call. I can make myself available outside of normal working hours to suit, from 7am until 10pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search Dominic Barbet. I look forward to hearing from you.
DATA ENGINEER - DV CLEARED
KEY SKILLS:
BIG DATA DEVELOPER / BIG DATA ENGINEER / SENIOR BIG DATA DEVELOPER / SENIOR BIG DATA ENGINEER / DATA ENGINEER / DATA DEVELOPER / SENIOR SOFTWARE DEVELOPER / LEAD SOFTWARE ENGINEER / LEAD SOFTWARE DEVELOPER / DV CLEARED / DV CLEARANCE / DEVELOPED VETTING / DEVELOPED VETTED / DEEP VETTING / DEEP VETTED / SC CLEARED / SC CLEARANCE / SECURITY CLEARED / SECURITY CLEARANCE / NIFI / CLOUDERA / HADOOP / KAFKA / ELASTIC SEARCH / LEAD BIG DATA ENGINEER / LEAD BIG DATA DEVELOPER
Contract Python Engineer - DV Cleared
Searchability NS&D
London
In office
Mid
£500 - £650
python
confluence
kafka
elasticsearch
jira
hadoop
CONTRACT PYTHON / DATA ENGINEER - DV CLEARED
NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A PYTHON / DATA ENGINEER WITH DV CLEARANCE
Contract job opportunity for a Data Engineer
National Security & Defence client
Outside IR35
12 month rolling contract
Central London based organisation in an easily accessible location
To apply please call / or email
WHO WE ARE?
We are recruiting a contract Python / Data Engineer to work with a National Security & Defence SME in central London. Due to the nature of the work, you must hold UKSV/MOD or Enhanced DV Clearance.
WE NEED THE PYTHON/DATA ENGINEER TO HAVE .
Current DV Security Clearance (Standard or Enhanced)
Experience with big data tools such as Hadoop, Cloudera or Elasticsearch
Python / PySpark experience
Experience With Palantir Foundry is nice to have
Experience working in an Agile Scrum environment with tools such as Confluence / Jira
Experience in design, development, test and integration of software
Willingness to learn new technologies
TO BE CONSIDERED .
Please either apply by clicking online or email me directly; for further information, please call. I can make myself available outside of normal working hours to suit, from 7am until 10pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search Dominic Barbet. I look forward to hearing from you.
PYTHON/DATA ENGINEER - DV CLEARED
KEY SKILLS:
BIG DATA DEVELOPER / BIG DATA ENGINEER / SENIOR BIG DATA DEVELOPER / SENIOR BIG DATA ENGINEER / DATA ENGINEER / DATA DEVELOPER / SENIOR SOFTWARE DEVELOPER / LEAD SOFTWARE ENGINEER / LEAD SOFTWARE DEVELOPER / DV CLEARED / DV CLEARANCE / DEVELOPED VETTING / DEVELOPED VETTED / DEEP VETTING / DEEP VETTED / SC CLEARED / SC CLEARANCE / SECURITY CLEARED / SECURITY CLEARANCE / NIFI / CLOUDERA / HADOOP / KAFKA / ELASTIC SEARCH / LEAD BIG DATA ENGINEER / LEAD BIG DATA DEVELOPER
Data Engineer - PySpark, Databricks
Tenth Revolution Group
Multiple locations
Hybrid
Mid
£50k - £75k
pyspark
python
typescript
airflow
sql
dbt
A rapidly growing company in the B2B Software as a Service (SaaS) space are looking for a Deployed Engineer to join their expanding team in London (hybrid working - 2-3 days a week in their modern office space).
Their product is a platform that acts as a "digital twin" of their customers' businesses - integrating internal and external data from a variety of sources to act as a "single source of truth", powering actionable insights at scale. When combined with AI algorithms, the platform drives strategic decision-making and enables planning and effective execution, allowing businesses to achieve their targeted state. They are a true pioneer in their field!
They believe the future of B2B SaaS is about delivering tailored, dynamic solutions for their clients, rather than implementing static tools. This is where you come in - you’ll be working within a team who believe value is created not just in the codebase, but in the implementation layer - making this role ideal for someone who thrives in dynamic, customer-facing environments.
The role:
Adapt and deploy a powerful data platform to solve complex business problems
Design scalable generative AI workflows using modern platforms like Palantir AIP
Execute advanced data integration using PySpark and distributed technologies (see the illustrative sketch after this list)
Collaborate directly with clients to understand priorities and deliver outcomes
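As a purely illustrative sketch of the PySpark data integration mentioned above (the paths, join key, and formats are assumptions, not details of this role), a simple join-and-deduplicate step might look like this:

```python
# Illustrative only: paths, join key, and formats are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("integration-sketch").getOrCreate()

internal = spark.read.parquet("s3://bucket/internal/customers/")   # internal extract
external = spark.read.json("s3://bucket/external/enrichment/")     # external feed

# Enrich the internal records, stamp the load time, and resolve duplicates.
combined = (
    internal.join(external, on="customer_id", how="left")
            .withColumn("ingested_at", F.current_timestamp())
            .dropDuplicates(["customer_id"])
)

combined.write.mode("overwrite").parquet("s3://bucket/curated/customers/")
```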
What We’re Looking For:
Strong skills in PySpark, Python, and SQL
Ability to translate ambiguous requirements into clean, maintainable pipelines
Quick learner with a passion for new technologies
Experience in startups or top-tier consultancies is a plus
Nice to Have:
Familiarity with dashboarding tools, Typescript, and API development
Exposure to Airflow, DBT, Databricks
Experience with ERP (e.g. SAP, Oracle) and CRM systems
What’s On Offer:
Salary: £50,000 - £75,000 + share options
Hybrid working: 2-3 days per week in a vibrant Soho office
A highly social culture with regular team events and activities
Work alongside seasoned tech and business leaders
Be part of a mission-driven company with a strong social impact ethos
If you’re excited by the idea of working at the intersection of AI, data, and enterprise transformation - and want to be part of a fast-scaling, values-led team - we’d love to hear from you.
Please Note: This is a role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check.
Tenth Revolution Group / Nigel Frank are the go-to recruiter for Data and AI roles in the UK, offering more opportunities across the country than any other. We’re the proud sponsor and supporter of SQLBits, and the London Power BI User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at (url removed)
Data Engineer (ML/AI Focus) - Healthcare Analytics London Hybrid
Avanti
London
Hybrid
Mid
£40k - £55k
aws
r
python
sql
Data Engineer (ML/AI Focus) - Healthcare Analytics - London - Hybrid Working
The Opportunity
I’m working with a boutique analytics consultancy in London that’s transforming healthcare through data and AI. Working on both Data Consulting and Product delivery for the NHS and healthtech companies, you’ll build end-to-end machine learning solutions that make a tangible difference to patients and healthcare professionals. This isn’t just another engineering role - you’ll be client-facing, working alongside the people who use your solutions to understand their data needs, and see the real-world impact of your work.
What You’ll Be Doing
Building and deploying production ML pipelines in cloud environments (AWS, Azure, or GCP) - see the illustrative sketch after this list
Implementing MLOps best practices to maintain and scale AI applications
Working directly with NHS trusts and healthcare clients to understand their challenges
Contributing to both project delivery and the development of repeatable product propositions
Bringing innovation to how healthcare organisations use data and machine learning
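For illustration only - the dataset, features, and target below are invented, not NHS data - a small training pipeline of the kind that sits inside a production ML workflow might look like this in scikit-learn:

```python
# Illustrative only: the dataset, features, and target are invented.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("referrals.csv")            # hypothetical training extract
X, y = df.drop(columns=["did_not_attend"]), df["did_not_attend"]

# Preprocess numeric and categorical features, then fit a simple classifier.
preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["age", "wait_days"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["clinic", "referral_source"]),
])
model = Pipeline([("prep", preprocess), ("clf", GradientBoostingClassifier())])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model.fit(X_train, y_train)
print("Holdout accuracy:", model.score(X_test, y_test))
```

In a production setting this pipeline would then be versioned, containerised, and deployed with MLOps tooling rather than run ad hoc.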
Skills and experience required
Essential:
1-3 years of professional experience in data engineering, ML engineering, or similar
Strong Python or R skills, with experience in SQL
Understanding of machine learning algorithms and their practical application
Ability to deliver high-quality work to tight deadlines
Genuine interest in healthcare - you want your work to matter
Highly Desirable:
Experience with MLOps in cloud environments
Background in healthcare, consulting, or research settings
Exposure to modern cloud platforms (AWS/Azure/GCP)
Salary: £40,000 - £55,000 + good benefits + bonus
Location: Central London onsite with some hybrid working
The role is available now so please APPLY TODAY for immediate consideration.
They do not offer visa sponsorship so you must have the right to work in the UK.

Frequently asked questions

What qualifications do I need to become a Data Engineer?
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
What types of Data Engineer jobs can I find on Haystack?
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
How can I improve my chances of getting hired as a Data Engineer?
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Are entry-level Data Engineer jobs available on Haystack?
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Can I find remote Data Engineer positions on Haystack?
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.