Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Data Engineer
Randstad Technologies Recruitment
Multiple locations
Remote or hybrid
Senior - Leader
£60/hour - £80/hour
RECENTLY POSTED

Senior Data Engineer (Data Modernization)

We are looking for a proactive, goal-oriented Senior Data Engineer to build and scale high-performance big data pipelines. This role is ideal for a collaborative problem-solver who excels in distributed environments and values technical excellence through CI/CD and Agile.

The Essentials

  • Pipeline Tech: 5+ years of experience with Kafka, Spark, Hadoop, and DBT.
  • Modern Modeling: Expert skills in Data Vault and Dimensional modeling.
  • Cloud & DevOps: Strong AWS experience with a focus on CI/CD and writing secure code.
  • Data Governance: Proven ability to embed quality, lineage, and monitoring into every pipeline.
  • Leadership: A passion for guiding and coaching fellow team members both technically and procedurally.

The Mission

You will lead data modernization efforts, ensuring large-scale systems remain compliant and secure while maintaining a “right tool for the job” mindset. You’ll bridge the gap between complex data engineering and actionable analytics/ML concepts.

Apply

If you have a passion for building well-governed, scalable systems and want to lead a high-impact team, please apply with your CV

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Contract Principal Data Analyst - Hybrid - Reading
CBSbutler Holdings Limited trading as CBSbutler
Reading
Hybrid
Senior
£70/hour - £84/hour
RECENTLY POSTED

Principal Data Analyst
6-month Contract
£70 - £83 per hour inside IR35
Based in Reading - hybrid working - 2-3 days onsite per week
SC Clearance is essential for this role

Urgently hiring for a Contract Principal Data Analyst to drive advanced analytics solutions, shape enterprise data practices, and lead innovation across complex environments.

Responsibilities:
Lead the design and delivery of advanced analytics and reporting solutions
Own data modelling, dashboard design, and scalable reporting frameworks
Define governance, security, and quality standards
Partner with architects, engineers, and business leaders to influence data strategy
Mentor and develop analysts within a collaborative team

Skills and Experience:
Strong expertise in SQL and data warehousing
Advanced Power BI / Tableau dashboard development
Deep understanding of data lifecycle, governance, and quality
Experience in cloud or hybrid data environments
Proven ability to influence technical decisions and communicate insights clearly
Experience working in Agile / DevOps environments
Python and ML would be an added advantage.

Please apply for immediate interview.

CBSbutler is operating and advertising as an Employment Agency for permanent positions and as an Employment Business for interim / contract / temporary positions. CBSbutler is an Equal Opportunities employer and we encourage applicants from all backgrounds.

Data Migration Engineer
JAM Recruitment Ltd
Rochester
In office
Mid - Senior
£41/hour - £42/hour
RECENTLY POSTED

Join a world-renowned aerospace and defence organisation as a Data Migration Engineer in Rochester, full time onsite.

Would you like to join a global aerospace and defence organisation? Do you want to work as part of an organisation protecting people and maintaining maximum national security?

Due to a drive for greater success, this advanced manufacturing business is currently searching for a Data Migration Engineer to add to their talented, hardworking team in Rochester on a 12-month contract. Striving for innovation and creativity, you can be sure no two days will be the same.

Rate:

£42.11 per hour inside IR35 / Umbrella

The role:

The Data Migration Engineer is responsible for designing, developing, and executing data migration solutions to support ERP data transformation into Oracle Cloud.

The role ensures data is migrated accurately, securely, and efficiently, maintaining data integrity, quality, and compliance in accordance with business transformation rules.

You’ll work closely with business stakeholders, data architects, and the project team to deliver reliable and auditable data migration.

Knowledge & Skills:

  • Strong experience with Oracle SQL and PL/SQL
  • Solid understanding of Oracle ERP database architecture
  • Experience with joins, subqueries, analytic functions, and performance tuning of data extractions
  • Knowledge of ETL techniques and methods
  • Experience in Oracle 11i data structures or Oracle Cloud environments
  • Previous experience of data migration in a similar role
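
As a sketch of what the "analytic functions" requirement means in practice: a common data migration pattern uses ROW_NUMBER() to keep only the latest version of each record from a legacy staging table. The table and column names below are invented for illustration; the SQL is standard and works in Oracle as well as SQLite, which is used here only so the sketch is runnable.

```python
import sqlite3

# Hypothetical staging table: multiple versions of each customer record,
# of which only the most recent should be migrated.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_customers (customer_id INTEGER, name TEXT, updated_at TEXT);
INSERT INTO stg_customers VALUES
  (1, 'Acme Ltd (old)', '2023-01-01'),
  (1, 'Acme Ltd',       '2024-06-15'),
  (2, 'Globex',         '2024-03-02');
""")

# ROW_NUMBER() partitions by business key and orders by recency, so the
# outer filter keeps exactly one (the latest) row per customer_id.
rows = conn.execute("""
SELECT customer_id, name FROM (
  SELECT customer_id, name,
         ROW_NUMBER() OVER (PARTITION BY customer_id
                            ORDER BY updated_at DESC) AS rn
  FROM stg_customers
) WHERE rn = 1
ORDER BY customer_id
""").fetchall()

print(rows)  # [(1, 'Acme Ltd'), (2, 'Globex')]
```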

Background

This is an exciting opportunity to work for a global business that has developed a strategy of working closely with its contingent workforce. Within the role, you can expect to:

Work on complex, cutting-edge projects
Achieve a work/life balance
Develop your skills

APPLY NOW

If this sounds like the role for you, we’d love to hear from you! Send your CV to Stella today!

Lead PySpark Engineer - Data, SAS, AWS
Randstad Technologies Recruitment
London
Remote or hybrid
Senior
£350/day - £380/day
RECENTLY POSTED

Lead Data Engineer - PySpark / AWS / Python / SAS - Financial Sector

As a Lead PySpark Engineer, you will design, develop, and fix complex data processing solutions using PySpark on AWS. You will work hands-on with code, modernising legacy data workflows and supporting large-scale SAS-to-PySpark migrations. The role requires strong engineering discipline, deep data understanding, and the ability to deliver production-ready data pipelines in a financial services environment.

Essential Skills

PySpark & Data Engineering

  • Minimum 5+ years of hands-on PySpark experience.
  • SAS to PySpark migration experience.
  • Proven ability to write production-ready PySpark code.
  • Strong understanding of data and data warehousing concepts, including: ETL/ELT, Data models, Dimensions and facts, Data marts, SCDs
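
One of the warehousing concepts listed above, slowly changing dimensions (SCD Type 2), can be sketched as follows. All names are illustrative, and a real implementation would typically run as a PySpark or SQL merge rather than in-memory Python; the point is the mechanic of closing the current row and opening a new one.

```python
from datetime import date

# Hypothetical SCD Type 2 upsert: when a tracked attribute changes, close the
# current dimension row and open a new one, preserving full history.
def scd2_upsert(dim_rows, key, new_attrs, today):
    current = next((r for r in dim_rows
                    if r["key"] == key and r["end_date"] is None), None)
    if current and all(current[k] == v for k, v in new_attrs.items()):
        return dim_rows  # no change: keep the open row as-is
    if current:
        current["end_date"] = today  # close the old version
    dim_rows.append({"key": key, **new_attrs,
                     "start_date": today, "end_date": None})
    return dim_rows

dim = [{"key": "C1", "segment": "SME",
        "start_date": date(2023, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, "C1", {"segment": "Enterprise"}, date(2024, 6, 1))
# dim now holds two versions of C1: the closed SME row and the open Enterprise row.
print(len(dim))  # 2
```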

Spark Performance & Optimisation

  • Strong knowledge of Spark execution concepts, including partitioning, optimisation, and performance tuning.
  • Experience troubleshooting and improving distributed data processing pipelines.

Python & Engineering Quality

  • Strong Python coding skills with the ability to refactor, optimise, and stabilise existing codebases.
  • Experience implementing parameterisation, configuration, logging, exception handling, and modular design.
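
The parameterisation, logging, and exception-handling expectations above can be sketched generically. Everything here (the step runner, the filter function, the config dict) is a hypothetical illustration, not code from the role:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

# Hypothetical config-driven step runner: parameters come from a dict rather
# than being hard-coded, and failures are logged with context then re-raised.
def run_step(name, fn, config):
    try:
        log.info("starting step %s with config %s", name, config)
        result = fn(**config)
        log.info("step %s finished", name)
        return result
    except Exception:
        log.exception("step %s failed", name)
        raise

def filter_rows(rows, min_value):
    return [r for r in rows if r >= min_value]

out = run_step("filter", filter_rows,
               {"rows": [1, 5, 10], "min_value": 4})
print(out)  # [5, 10]
```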

SAS & Legacy Analytics

  • Strong foundation in SAS (Base SAS, SAS Macros, SAS DI Studio).
  • Experience understanding, debugging, and modernising legacy SAS code.

Data Engineering & Testing

  • Ability to understand end-to-end data flows, integrations, orchestration, and CDC.
  • Experience writing and executing data and ETL test cases.
  • Ability to build unit tests, comparative testing, and validate data pipelines.
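
One way to read the comparative-testing bullet above: run the legacy and migrated implementations over the same input and assert the outputs agree within tolerance. A toy, hypothetical sketch (the two functions stand in for SAS and PySpark logic, which the ad does not specify):

```python
# Hypothetical comparative test: run the legacy and migrated implementations
# of the same transformation over identical input and diff the results.
def legacy_total(rows):          # stand-in for the SAS logic
    return sum(round(r, 2) for r in rows)

def migrated_total(rows):        # stand-in for the PySpark port
    return round(sum(rows), 2)

def compare(rows, tol=0.01):
    a, b = legacy_total(rows), migrated_total(rows)
    return abs(a - b) <= tol, a, b

# Rounding order differs between the two implementations, so the comparison
# uses a tolerance rather than exact equality.
ok, legacy, migrated = compare([10.004, 19.999, 0.5])
print(ok, legacy, migrated)
```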

Engineering Practices

  • Proficiency in Git-based workflows, branching strategies, pull requests, and code reviews.
  • Ability to document code, data flows, and technical decisions clearly.
  • Exposure to CI/CD pipelines for data engineering workloads.

AWS & Platform Skills

  • Strong understanding of core AWS services, including: S3, EMR / Glue, Workflows, Athena, IAM
  • Experience building and operating data pipelines on AWS.
  • Big data processing on cloud platforms.

Desirable Skills

  • Experience in banking or financial services.
  • Experience working on SAS modernisation or cloud migration programmes.
  • Familiarity with DevOps practices and tools.
  • Experience working in Agile/Scrum delivery environments.

I have three roles available, all of which can be worked remotely, so don't delay and apply today. I have interview slots ready to be filled.

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Junior / Graduate Data Scientist
Adria Solutions Ltd
Manchester
Hybrid
Graduate - Junior
£25,000 - £35,000
RECENTLY POSTED

Our client, a fast-growing and innovative organisation, is looking for a Junior / Graduate Data Scientist to join their expanding data function.

This is an excellent opportunity for an early-career data professional to gain hands-on experience across the full machine learning lifecycle in a regulated, real-world environment. You'll work closely with Senior Data Scientists, Data Engineers, and Analysts to develop, test, and support production-ready models using modern cloud technologies.

The Role

You will work primarily with Python, SQL, and AWS (including Amazon SageMaker) to:

  • Extract, transform, and analyse data from AWS data platforms
  • Perform exploratory data analysis and communicate insights clearly
  • Build and evaluate baseline machine learning models (classification & regression)
  • Support model experimentation in Amazon SageMaker Unified Studio
  • Contribute to model deployment, monitoring, and safe rollout practices
  • Follow best practices in Git, code review, testing, and Agile delivery
  • Support data governance, documentation, and privacy-by-design principles

This role offers structured mentorship and exposure to data science beyond modelling, including productionisation, compliance, and engineering collaboration.

Essential Requirements

  • Degree in a quantitative discipline (Data Science, Computer Science, Maths, Statistics, Physics, Engineering) or equivalent experience
  • Early career stage (graduate, placement, bootcamp, or personal projects)
  • Strong Python skills (pandas, scikit-learn)
  • Solid SQL skills (joins, aggregations, relational data)
  • Understanding of ML fundamentals (train/test splits, overfitting, evaluation metrics)
  • Clear communication skills
  • Strong learning mindset and interest in AWS/cloud technologies
  • Comfortable working in a regulated environment
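
The ML fundamentals listed above (train/test splits, held-out evaluation) can be illustrated with a minimal, hypothetical sketch. A trivial majority-class "model" stands in for scikit-learn here so the example stays self-contained:

```python
import random

# Hypothetical illustration of an 80/20 train/test split and a held-out
# accuracy metric, using a majority-class baseline as the "model".
random.seed(0)
data = [(x, 1 if x > 50 else 0) for x in range(100)]
random.shuffle(data)

split = int(len(data) * 0.8)
train_rows, test_rows = data[:split], data[split:]

# "Train": learn the majority class on the training set only.
labels = [y for _, y in train_rows]
majority = max(set(labels), key=labels.count)

# Evaluate on the held-out test set, which the model never saw;
# this is what guards against an overfit-looking training score.
accuracy = sum(1 for _, y in test_rows if y == majority) / len(test_rows)
print(round(accuracy, 2))
```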

Desirable

  • Exposure to Amazon SageMaker
  • Experience with Jupyter Notebooks
  • Git and basic software engineering practices
  • Data visualisation tools (e.g., Power BI)
  • Financial services data exposure (risk, fraud, payments)

What's on Offer

  • Structured development and mentorship
  • Hybrid working model
  • Company pension
  • days holiday + bank holidays
  • Birthday leave, charity day, wellbeing day, wedding leave

Interested? Please click Apply Now!

DBA
Hyperloop Recruitment
Liverpool
Hybrid
Mid - Senior
£60,000
RECENTLY POSTED

£60,000 (DOE)
Liverpool

Hyperloop are working with a leading client based in Liverpool who are looking for a DBA to join their growing team.

The successful candidate will be responsible for designing and maintaining robust database solutions across Azure VM & cloud environments, whilst enforcing best practices in SQL development and source control.

The role would suit an experienced DBA with a strong background in SQL Server and Azure, and a proven track record of managing secure, high-performance databases and data warehouses.

Key skills required:

  • Proven record developing & supporting MI & Data Warehouse systems
  • SQL Server Management Studio
  • SSIS
  • MS Azure (Data Factory)
  • Visual Studio
  • Sharedo
  • Azure Synapse / Databricks
  • Python (desirable)

The DBA role is based in Liverpool city centre & will be hybrid working with 3 days per week in the office.

The role is paying up to £60,000 + benefits and is commutable from Bolton, Birkenhead, Chester, Deeside, Manchester, Runcorn, Warrington and Wirral. To apply, click here!

Senior SAS Data Engineer - SC Cleared
Sanderson Government and Defence
Telford
Hybrid
Senior
£525/day - £550/day
RECENTLY POSTED

Senior SAS Data Engineer - SC Cleared - Telford (2 Days per Week) - £550/day (Inside IR35) - 6-Month Contract

We are seeking an experienced Senior SAS Data Engineer to lead data engineering initiatives across multiple teams. This 6-month contract requires 2 days per week onsite in Telford, with the remainder remote. The role focuses on delivering secure, scalable SAS data solutions within a complex environment.

The Role

You will provide technical leadership across data engineering activities, ensuring best practice in architecture, security, and governance. Working closely with analytics, DevOps, and business teams, you will design and deliver robust SAS-based data platforms that enable reliable, insight-driven decision-making.

Key Responsibilities

  • Lead the design and delivery of SAS data platforms, pipelines, and data processes.
  • Develop, optimise, and maintain complex SQL-based data models and transformations.
  • Build and manage ETL processes and large-scale data integration workflows.
  • Implement strong data governance, security, and performance standards aligned to SC environments.
  • Collaborate with stakeholders to align technical solutions with business priorities.
  • Promote engineering best practices and continuous improvement.

Essential Experience

  • Proven experience in a senior data engineering role, with strong SAS expertise.
  • Excellent SQL skills with experience handling large, complex datasets.
  • Strong experience building and maintaining enterprise-scale SAS data solutions.
  • Experience working in secure or government environments.
  • Strong stakeholder management and Agile delivery experience.
  • Must hold active SC clearance.

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

Integration Engineer
JAM Recruitment Ltd
Rochester
In office
Mid - Senior
£40/hour - £42/hour
RECENTLY POSTED

Join a world-renowned aerospace and defence organisation as an Integration Engineer in Rochester, full time onsite.

Would you like to join a global aerospace and defence organisation? Do you want to work as part of an organisation protecting people and maintaining maximum national security?

Due to a drive for greater success, this advanced manufacturing business is currently searching for an Integration Engineer to add to their talented, hardworking team in Rochester on a 12-month contract. Striving for innovation and creativity, you can be sure no two days will be the same.

Rate:

£42.11 per hour inside IR35 / Umbrella

The role:

  • Design - Input into the definition of requirements for Integration Functional Specifications; document technical specifications.
  • Build & Configuration - Develop source to OICS integrations, configure non-Oracle adapters.
  • Quality & Robustness - Implement error handling, retries, idempotency, logging and alerts; align mappings to canonical models and maintain lookups.
  • Testing & Deployment - Unit test integrations, support SIT/E2E/UAT, package and promote artifacts.
  • Operations & Support - Monitor dashboards/alerts, troubleshoot incidents and reprocessing per run books, and provide Hypercare support.
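
The error-handling, retry, and idempotency pattern named under Quality & Robustness can be sketched generically. Everything below (the function names, the in-memory "processed" set) is a hypothetical illustration, not part of any Oracle/OICS API:

```python
import time

# Hypothetical retry wrapper with an idempotency key: a payload that has
# already been processed is skipped, and transient failures are retried.
_processed = set()

def send(payload, deliver, max_retries=3, delay=0.0):
    key = payload["id"]                 # idempotency key
    if key in _processed:
        return "skipped"                # safe to re-run: no double delivery
    for attempt in range(1, max_retries + 1):
        try:
            deliver(payload)
            _processed.add(key)
            return "delivered"
        except ConnectionError:
            if attempt == max_retries:
                raise
            time.sleep(delay)           # back off before retrying

calls = {"n": 0}
def flaky(payload):                     # fails once, then succeeds
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionError("transient")

first = send({"id": "order-1"}, flaky)
second = send({"id": "order-1"}, flaky)  # replayed message: skipped, not re-sent
print(first, second)  # delivered skipped
```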

Knowledge & Skills:

  • Strong experience with Oracle SQL and PL/SQL
  • Solid understanding of Oracle ERP database architecture and/or Oracle Cloud environments (EPM / Fusion)
  • Experience with joins, subqueries, analytic functions, and performance tuning of data extractions
  • Knowledge of ETL techniques and methods
  • PowerShell or other scripting/automation tools
  • Experience in Oracle 11i data structures or Oracle Cloud environments
  • Previous experience of data migration in a similar role
  • FBDI experience

Background

This is an exciting opportunity to work for a global business that has developed a strategy of working closely with its contingent workforce. Within the role, you can expect to:

Work on complex, cutting-edge projects
Achieve a work/life balance
Develop your skills

APPLY NOW

If this sounds like the role for you, we’d love to hear from you! Send your CV to Stella today!

IT - Business System Engineer
MorePeople
Hereford
Hybrid
Mid - Senior
£40,000 - £45,000

Job Description

We are looking for an experienced Business Support Engineer to manage, maintain, and improve our client's SQL databases and business systems, ensuring they are secure, high-performing, and aligned with business needs. The role also supports reporting, data delivery, and ongoing system development, while working closely with IT and wider business teams.

Main Responsibilities

  • Manage and maintain SQL Server databases in line with Microsoft best practices.
  • Monitor, optimise, and troubleshoot database performance, availability, and security.
  • Extract, manipulate, and deliver data to meet business and customer requirements.
  • Maintain and enhance existing SQL code, stored procedures, and database objects that support core systems.
  • Provide support and bug fixes for C# and VB.NET applications when required, with opportunities to further develop these skills.
  • Deliver 3rd line support and participate in on-call support for system and data-related issues when needed.
  • Stay up to date with developments in business systems and best practices, advising senior management and supporting the adoption of improvements where relevant.
  • Support the business by producing reports and management information as required.

Key Performance Indicators

  • Proactively manage and plan work relating to business systems data maintenance and development.
  • Provide an approachable, responsive, and professional service to end users, working within agreed timescales and minimising business disruption.
  • Identify and escalate risks, issues, and opportunities appropriately, communicating clearly and in a timely manner.
  • Support the wider IT operations team as required.
  • Build and maintain strong working relationships across IT and the wider business, including involvement in change initiatives.

Key Skills & Behaviours

  • Technical knowledge: deployable hands-on knowledge of SQL management and development; good knowledge of development techniques and tools; good understanding of integration methods and technologies; reporting & BI tools and implementations.
  • Development skills: knowledge of the Microsoft stack languages and frameworks, as well as experience with WPF and MVVM technologies. This is a lesser part of the role; however, a basic level is required due to the small team.
  • Problem solving and solution focus: the ability to interpret user requirements and issues into solutions that make best use of existing systems, implementing new technology and processes thoughtfully for extended benefits.
  • Working style: confidence to work under your own supervision and directly with business teams in a fast-paced environment demanding rapid change, with a strong customer delivery focus.
  • Systems thinking, approach and understanding: strong understanding of the core technologies deployed by the business, with a structured approach to the management of systems and utilisation of 3rd parties.
  • Communication: selects the appropriate communication method based on the needs and sophistication of the audience; able to communicate effectively, remain calm under pressure, and stay courteous and helpful.

If you are interested, please apply below; alternatively, contact Angus on (phone number removed) or (url removed).

Data Engineer - Minerva - Contract - SC Clearance
CBSbutler Holdings Limited
Telford
Hybrid
Mid - Senior
£480/day - £515/day

Data Engineer - Minerva VAT TxR
Active SC Clearance required
Telford
£480 - £515 per day inside IR35
Hybrid - 2 days per week

Hiring for an experienced Data Engineer to join the Minerva Platform Data Team, supporting the consolidation of key tax data services into a single, scalable calculations platform.

Responsibilities:
* Design and build scalable data pipelines and integration solutions
* Develop ETL processes using Pentaho and Talend
* Work with Denodo (data virtualisation) and SAS
* Drive Agile delivery and DevOps CI/CD practices
* Ensure high standards of data quality, governance, and security

Skills and Experience:
* Strong ETL expertise (Pentaho, Talend)
* Experience with Denodo and SAS
* Strong SQL and data modelling skills
* Agile/Scrum and DevOps experience (Jenkins, Git, Docker, Kubernetes)
* Ability to solve complex data challenges in secure environments

Please apply for immediate interview!

CBSbutler is operating and advertising as an Employment Agency for permanent positions and as an Employment Business for interim / contract / temporary positions. CBSbutler is an Equal Opportunities employer and we encourage applicants from all backgrounds.

Senior Software Engineer
Hyre AI Limited
London
Hybrid
Senior
£65,000 - £75,000

Stealth AI Startup | London | Hybrid

We're hiring for a fast-growing AI startup building a cloud-native analytics platform that transforms large volumes of LLM output into structured, queryable intelligence for businesses and brands. The platform is fully AWS-native, heavily event-driven, and built for scale from day one. The team actively uses AI agents and automation across both engineering and product workflows, not as a gimmick, but as a core operating principle. We're looking for a Senior Software Engineer to help shape, build, and run the core cloud and data foundation of the platform.

What You'll Be Doing

You'll take ownership of significant parts of the cloud and data layer, including:

  • Designing and operating AWS infrastructure (ECS Fargate, Lambda, Step Functions, EventBridge, S3, Athena, DynamoDB)
  • Building robust ETL / ELT pipelines that transform raw LLM outputs into structured, analytics-ready datasets
  • Designing event-driven workflows and distributed processing systems
  • Managing containerised services using Docker and ECS
  • Implementing CI/CD pipelines and infrastructure-as-code practices
  • Establishing monitoring, alerting, and reliability standards across production workloads
  • Optimising system performance, cost efficiency, and scalability
  • Embedding security, IAM design, and least-privilege principles from the ground up
  • Actively leveraging AI agents and automation tools to improve engineering workflows and system design

This is a hands-on role. You'll be designing, building, debugging, and shipping, not just overseeing.

What We're Looking For

  • 4+ years of experience
  • Strong production experience with AWS (especially ECS, Lambda, Step Functions, S3, EventBridge)
  • Solid Docker and container-native deployment experience
  • Python for data processing, automation, or backend services
  • Experience building and maintaining data pipelines or data lake architectures
  • Infrastructure-as-code (Terraform, CDK, or CloudFormation)
  • CI/CD and production engineering mindset
  • Comfort owning distributed systems and running real workloads in production
  • Genuine curiosity about AI tools, LLM APIs, and automation-driven engineering

Why Join

  • Real ownership of core cloud and data architecture
  • Work at the intersection of cloud infrastructure, data engineering, and applied AI
  • Modern AWS-native, event-driven system with meaningful scale
  • Small, senior team operating in a high-trust, low-bureaucracy environment
  • Direct impact on a fast-growing AI product

Databricks SME and AWS Data Engineer
Experis
Northampton
Hybrid
Mid - Senior
£460/day - £480/day

Location: UK - Northampton (Hybrid) 6 months UMBRELLA ONLY Project Objective: A key initiative involves migrating from Aerospike to Postgres and leveraging Databricks for back-testing fraud detection models. This role will contribute to the development and integration of Proof of Concepts (PoCs) from the detection backlog. Key Responsibilities: Collaborate with cross-functional teams to architect, design, develop and deliver PoCs related to fraud detection. Lead the ETL & Data manipulation/engineering work on AWS . Integrate Databricks-based back-testing into the fraud detection pipeline. Work closely with architects and other developers to ensure seamless integration with existing systems. Participate in weekly stand-up calls to demonstrate progress and align on deliverables. Take ownership of tasks from the backlog, ensuring timely and high-quality delivery. Required Skills & Experience: Strong AWS Data Engineering expertise . Proficiency in Kafka for real-time data streaming and integration. Proficiency with Databricks for data processing and analytics. Working knowledge of NodeJS would be an added advantage. Solid programming skills in Python , PySpark, Spark . Candidate must be adept at working with large-scale datasets , S3 , Python , and data cataloguing tools . Familiarity with data engineering best practices is essential. Performance and Code Quality - candidate should demonstrate a strong commitment to building high-performance, scalable, and resilient distributed systems , with an emphasis on clean, maintainable, and testable code . CI/CD Proficiency - Hands-on experience with CI/CD pipelines and tools (e.g., Jenkins, GitHub Actions, GitLab CI, etc.) for automated build, test, and deployment processes. Secure Development Practices - Awareness of secure coding standards , data protection principles, and experience working in regulated environments (especially relevant for fraud detection and financial services). 
Testing Rigor - Strong understanding of unit testing, integration testing, and test automation frameworks (e.g., PyTest, etc.) to ensure code quality and reliability. Familiarity with cloud-native development and CI/CD practices. Agile mindset with a proactive, developer-driven approach to problem-solving. Ideal Candidate Profile: A hands-on architect & developer with a strong sense of ownership and accountability. Comfortable working in a collaborative, fast-paced environment . Able to pick up tasks independently , contribute to design discussions, and deliver integrated solutions. Strong communication skills to engage with both technical and non-technical stakeholders. All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply! TPBN1\_UKTJ

Senior Data Engineer
83zero Limited
London
In office
Senior
£80,000

Location: London

Employment Type: Full-time

Salary: £80k per annum + 10% bonus & benefits

Sector: Fintech / Payments

Overview

We are looking for a skilled Senior Data Engineer with a strong foundation in data analytics to join our growing team. The ideal candidate will have previously worked as a Data Analyst and since transitioned into a more engineering-focused role. You’ll help us scale our data infrastructure, design and build robust data models, and contribute directly to our data platform’s evolution.

This is a hands-on role where you’ll be expected to hit the ground running, contribute to ongoing projects with minimal hand-holding, and help us maintain (and improve) the current team’s velocity.

Key Responsibilities

Design, develop, and maintain data models to support analytical and operational use cases.

Write efficient, production-grade SQL to build data pipelines and transformations.

Develop and maintain data workflows and automation scripts in Python.

Collaborate with analysts, engineers, and stakeholders to deliver high-quality data solutions.

(Optional but highly valued) Contribute to our infrastructure-as-code efforts using tools like Terraform.

Work with modern data warehousing technologies such as Snowflake to ensure scalable and high-performing solutions.

Skills & Experience

5+ years of experience in data roles, ideally transitioning from Data Analyst to Data Engineer.

Proven expertise in SQL and building complex data models.

Strong proficiency in Python for data processing, ETL, and workflow automation.

Experience with cloud data platforms (Snowflake experience highly desirable).

Exposure to or experience with Terraform or similar infrastructure-as-code tools is a strong plus.

Comfortable working in fast-paced environments and able to contribute quickly without extensive onboarding.

Nice to Have

Experience with modern data stack tools (e.g., dbt, Airflow, etc.).

Understanding of CI/CD pipelines and data infrastructure automation.

Familiarity with data governance, security, and best practices in a cloud environment.


Data Platform Engineer
Matchtech
Preston
Fully remote
Mid - Senior
£74/hour

Data Platform Engineer - 12-month contract - Warton Aerodrome (Remote) - £74.26 per hour Umbrella or £55 per hour PAYE (Inside IR35)

The Umbrella rate quoted above is the Gross Umbrella rate (i.e. the rate we pay to the Umbrella Company inclusive of ALL employment costs). Please note, the rate paid by the Umbrella will be less, as will a Limited Deemed rate or Agency PAYE rate. Please get in touch to discuss the rates via these different payment vehicles.

The Role

As a Data Platform Engineer in a highly regulated environment, you will be responsible for designing, building, and maintaining secure and scalable data infrastructure that supports both cloud and on-premises platforms.

You will play a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams.

A strong command of Apache NiFi is essential for this role.

You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation, and delivery.

You should be adept at identifying and resolving issues within NiFi flows, managing performance bottlenecks, and implementing robust error handling strategies.

You’ll work closely with cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms.

Role Responsibilities (not limited to):

  • Design, develop, and maintain robust and secure data pipelines using NiFi and related big data technologies.
  • Troubleshoot and optimize NiFi flows, including performance tuning, error resolution, and flow control.
  • Integrate NiFi with cloud platforms, databases (SQL & NoSQL), APIs, and third-party systems.
  • Ensure compliance with regulatory and security requirements across data storage, transfer, and access layers.
  • Support data migration efforts and implement disaster recovery protocols.
  • Continuously monitor data infrastructure performance and recommend improvements.
  • Collaborate with cross-functional teams to align data platform capabilities with business needs and compliance requirements.
  • Maintain documentation of data flows and processes, ensuring knowledge sharing and operational transparency.

What are BAE Systems looking for from you?

  • Over 3 years of relevant experience in data engineering, platform engineering, or a related field.

  • Proficiency in Java and SQL.

  • Deep understanding of core NiFi concepts: FlowFiles, Processors, Controller Services, Schedulers, Web UI.

  • Knowledge of data modeling, replication, and query optimization.

  • Experience designing and optimizing data flows for batch, real-time streaming, and event-driven architectures.

Security Requirements: SC & UK EYES ONLY

This role will require the person to hold full Security Clearance (SC) prior to working onsite. You will need to obtain a BPSS check as part of this process. You must currently hold or be eligible and willing to obtain SC and you must be eligible to work in the UK without sponsorship and have lived and worked in the UK for a minimum 5 year period. If you are unsure as to whether you are eligible, please contact me to discuss.

This role also requires you to be a sole British national and therefore hold no other nationalities.

Data Analyst
i-Jobs
London
In office
Junior - Mid
Private salary
TECH-AGNOSTIC ROLE

Location: Hornton Street, W8 7NX
Start Date: ASAP
Contract Duration: 3+ Months
Working Hours: Mon - Fri, 09:00
Pay Rate: £331.36 per day
Job Ref: (phone number removed)

Job Responsibilities

  • Analyse and interpret data to support decision-making across departments.
  • Maintain and manage datasets using the CCIS database.
  • Produce clear reports and visualisations for internal stakeholders.
  • Ensure data accuracy, consistency, and compliance with council standards.
  • Support process improvements through data-driven insights.
  • Collaborate with teams to deliver timely and accurate information.
  • Maintain confidentiality and adhere to council policies.

Person Specification
Must-Have Requirements

  • Proven experience as a data analyst or in a similar analytical role.
  • Proficiency in MS Office and other relevant ICT tools.
  • Experience working with the CCIS database.
  • Strong numeracy and analytical skills.
  • Excellent written and verbal communication skills.
  • Eligible to work in the UK.

Nice-to-Have Requirements

  • Experience in a local government or public sector environment.
  • Familiarity with data visualisation tools and reporting software.
  • Knowledge of council policies, procedures, and compliance standards.

DISCLAIMER: By applying for this vacancy, you consent to your personal information being shared with our client and any relevant third parties we engage with for the purpose of assessing your suitability for this role. Please let us know of any specific organisations or hirers to whom you do not wish your details to be disclosed.

Data Engineer - Minerva - Contract - SC Clearance
CBSbutler Holdings Limited trading as CBSbutler
Shropshire
Hybrid
Mid - Senior
£480/day - £515/day

Data Engineer - Minerva VAT TxR
Active SC Clearance required
Telford
£480 - £515 per day inside IR35
Hybrid - 2 days per week

Hiring for an experienced Data Engineer to join the Minerva Platform Data Team, supporting the consolidation of key tax data services into a single, scalable calculations platform.

Responsibilities:

  • Design and build scalable data pipelines and integration solutions
  • Develop ETL processes using Pentaho and Talend
  • Work with Denodo (data virtualisation) and SAS
  • Drive Agile delivery and DevOps CI/CD practices
  • Ensure high standards of data quality, governance, and security

Skills and Experience:

  • Strong ETL expertise (Pentaho, Talend)
  • Experience with Denodo and SAS
  • Strong SQL and data modelling skills
  • Agile/Scrum and DevOps experience (Jenkins, Git, Docker, Kubernetes)
  • Ability to solve complex data challenges in secure environments
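
As a sketch of what the "strong SQL and data modelling" requirement means in practice (the table and column names below are hypothetical, chosen only to echo the tax-calculation domain of this posting), here is a toy star-schema query using Python's built-in sqlite3 module:

```python
import sqlite3

# Illustrative only: a minimal dimensional model with one fact table
# and one dimension table, then an aggregation joining the two.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_service (service_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_calculation (
        calc_id INTEGER PRIMARY KEY,
        service_id INTEGER REFERENCES dim_service(service_id),
        amount REAL
    );
    INSERT INTO dim_service VALUES (1, 'VAT'), (2, 'TxR');
    INSERT INTO fact_calculation VALUES
        (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Roll the fact rows up by dimension attribute.
rows = conn.execute("""
    SELECT s.name, SUM(f.amount) AS total
    FROM fact_calculation f
    JOIN dim_service s USING (service_id)
    GROUP BY s.name
    ORDER BY s.name
""").fetchall()
# rows == [('TxR', 75.0), ('VAT', 150.0)]
```

The same fact/dimension pattern scales from this toy example to the consolidated calculations platform the role describes, regardless of which engine (Pentaho, Talend, Denodo) feeds the tables.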

Please apply for immediate interview!

CBSbutler is operating and advertising as an Employment Agency for permanent positions and as an Employment Business for interim / contract / temporary positions. CBSbutler is an Equal Opportunities employer and we encourage applicants from all backgrounds.

Lead Data & AI Scientist
Experis
London
Hybrid
Senior
£100,000

Hybrid: 1-2 days per week in the office (London)
Permanent
£100k-£150k DOE + Bonus + Benefits

Experis are delighted to be partnering with a leading and well-respected organisation as they continue to invest heavily in Data Science, Artificial Intelligence and Generative AI. We are supporting them in the search for an experienced Lead Data Scientist to shape, scale and lead their enterprise-wide AI capability.

This is a high-impact, strategic role with responsibility for defining the direction of Data Science and AI across the organisation. You will lead the delivery of production-grade ML and GenAI solutions, build a high-performing specialist team, and partner closely with senior stakeholders to drive measurable business value.

What You’ll Be Doing

  • Leading the design, development and productionisation of Machine Learning, AI and GenAI solutions across the business.
  • Defining technical standards, best practices and governance for Data Science and MLOps.
  • Identifying and prioritising high-impact use cases in partnership with Wealth, Enablement and Technology teams.
  • Building, mentoring and developing a high-performing Data Science team.
  • Embedding modern MLOps practices including CI/CD, model monitoring, feature stores and version control.
  • Working closely with Data Engineering, Architecture and Operations teams to ensure scalable, secure deployment.
  • Acting as a senior technical authority and thought leader in advanced analytics and AI.
  • Championing innovation, experimentation and continuous improvement in AI and analytics delivery.

Experience Required

  • Proven experience in Head of or lead-level Data Science, AI or Machine Learning roles within enterprise environments.
  • Strong Python and SQL skills, with experience developing production-grade analytical solutions.
  • Hands-on experience with cloud platforms (Azure, AWS and/or Snowflake).
  • Expertise in modern machine learning frameworks (scikit-learn, XGBoost, PyTorch, TensorFlow).
  • Demonstrable experience delivering end-to-end ML and GenAI solutions (including RAG, LLMs, embeddings and vector databases).
  • Strong knowledge of MLOps tooling and practices (MLflow, CI/CD, containerisation, monitoring).
  • Experience leading, mentoring and developing technical teams.

If you’d like to learn more, please contact Jacob Ferdinand at

Principal Data Engineer
Uniting Ambition
London
Hybrid
Senior
£90,000 - £95,000

£90,000 - £95,000 + Benefits

Remote - Once or twice a month in the office

The Business

A large, UK-based property organisation operating at a national scale. They are focused on delivering and managing long-term community services and initiatives and have a strong emphasis on social impact, sustainability and modern digital transformation.

The Role

The Principal Data Engineer will be a senior technical leader within a growing data engineering team.

You'll help shape engineering strategy, architecture, and best practices across multiple product areas, supporting a data mesh approach while ensuring consistency and good governance.

This is a hands-on role that blends technical expertise with mentoring, guidance, and collaboration across teams.

Core technologies include:

Google Cloud Platform (BigQuery, Dataflow, Dataproc, Data Fusion, Data Streams, Cloud Functions, Airflow/Composer), SQL, Python, CI/CD tooling

About you

You’ll have

  • 10+ years experience in Data Engineering
  • Experience operating in a Lead or Principal Engineer capacity
  • Strong understanding of Data Governance principles.
  • Significant experience with cloud data platforms (specifically GCP) and possibly exposure to Azure / AWS
  • Strong proficiency in SQL and Python
  • Strong track record of mentoring and technical leadership
  • Background working in Agile / Scrum / SDLC environments

Requirements added by the job poster

  • No need for visa sponsorship: authorised to work in the United Kingdom (ILR, Tier 2 dependent visa and EU Settlement Scheme considered)
  • Happy to consider travel to the Lancashire region once or twice a month

Data Analyst - Sc cleared
CBSbutler Holdings Limited trading as CBSbutler
Reading
Hybrid
Senior - Leader
£80/hour - £83/hour

Data Analyst

+Hybrid working in Reading

+SC cleared role

+Inside IR35

+£83 per hour

Skills:

+Data Analyst

+MOD experience

+SC cleared

Are you an experienced SC cleared Data Analyst ready to lead impactful, data-driven innovation?
We’re looking for a Principal Data Analyst to drive the design and delivery of advanced analytics solutions that inform strategic decisions and unlock business value.

In this role, you’ll be at the forefront of our data journey - leading technical initiatives, mentoring colleagues, and influencing how the organisation harnesses data.

What You’ll Do

  • Lead the design and delivery of complex analytics and insight solutions across multiple projects
  • Oversee dashboarding, reporting, and automation workflows using SQL and cutting-edge tools
  • Define and implement best-practice data governance, security, and compliance standards
  • Collaborate with architects, engineers, and business leaders to shape enterprise data strategy
  • Inspire and guide stakeholders to make informed, data-driven decisions
  • Mentor and develop junior analysts, enhancing team capability
  • Innovate - adopting emerging technologies and driving continuous improvement in data practices

What You’ll Bring

  • Deep understanding of data lifecycle, governance, and quality principles
  • Advanced experience with SQL and data warehousing tools
  • Expertise in data visualisation (Power BI, Tableau) and report automation
  • Proven track record in translating complex datasets into powerful business insights
  • Strong leadership, mentoring, and communication skills
  • Ability to shape technical direction and lead data-driven initiatives

Core Expertise (Must-Have)

  • MOD Experience
  • Skilled in data modelling and dashboard design for future-proofed solutions
  • Strong knowledge of data validation and QA processes
  • Proven expertise in data warehousing and cloud/hybrid data environments
  • Deep understanding of data security, ethics, and privacy
  • Experience managing multidisciplinary team workloads and work packages
  • Familiarity with Agile/DevOps methodologies

Data Analyst
Deerfoot Recruitment Solutions Limited
London
Hybrid
Mid - Senior
Private salary
TECH-AGNOSTIC ROLE

Investment Data Analyst
Hybrid / London, SE1

Are you a detail-oriented data professional looking to make a visible impact within the investment management sector? This is a unique opportunity to step into a pivotal role where you won’t just be managing data, you’ll be helping to build the very foundations of a new data capability within an asset management environment.

In this role, you will sit at the heart of Investment, Operations, and Technology workflows. You will be the guardian of data integrity, ensuring that high-quality information underpins critical pension fund reporting, portfolio management, and NAV calculations. If you enjoy solving complex data puzzles and building strong relationships with both internal teams and external suppliers, this is the perfect career move for you.

Your Responsibilities

  • Data Integrity & Flow: You will monitor and validate time-critical data feeds from custodians, partner funds, and fund managers, identifying and resolving any anomalies or “data breaks” to ensure seamless downstream processes.
  • Supplier & Stakeholder Management: You will act as the key bridge between data suppliers and internal teams, resolving quality issues and communicating data trends clearly to stakeholders at all levels.
  • Quality Assurance & Root-Cause Analysis: Beyond just fixing errors, you will take ownership of recurring issues, driving root-cause analysis and helping to develop data quality metrics and dashboards.
  • Building the Future: You will contribute directly to designing scalable data management processes and embedding a new data quality framework.

What You Bring to the Team

  • Investment Management Expertise: Proven experience within an Investment Manager, Pension Fund, or Fund Administration environment, featuring a deep understanding of investment and operations data.
  • Strong Domain Knowledge: A clear understanding of how data quality impacts Performance Reporting, Net Asset Value (NAV) and Investment Outcomes.
  • Technical Proficiency: Hands-on experience performing data quality checks, reconciliations, and investigating complex data discrepancies.
  • Outcome-Focused Mindset: A clear understanding of how data quality impacts Performance/NAV and a proactive approach to reducing operational burdens.
  • Communication Skills: The ability to own relationships with external service providers and collaborate effectively with internal investment and tech teams.
  • Bonus Points: We would love to hear from you if you have experience with Microsoft Purview, Azure API Management, or implementing data governance frameworks.

Flexible Contract Options
Our client is looking for the best talent and is open to two engagement routes:

  • 12-Month Fixed Term Contract (FTC): £65,000 - £75,000 per annum + Benefits (e.g. 26-28 days holiday allowance)
  • 12-Month Day Rate Contract: Competitive daily rates considered on an individual basis (Inside IR35).

Ready to Apply? If you are a delivery-focused Data Analyst ready to help shape a new function in the Pension Investment space and ensure data excellence, we want to hear from you.

If you’ve held any of these roles or used these technologies/skills, this role could be a great fit: Investment Data Analyst, Pension Fund Analyst, Asset Management Data Specialist, Operations Data Analyst, Fund Data Controller, LGPS Data Analyst, Microsoft Purview, NAV Data Analyst, or Investment Operations Specialist.

Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn’t right for you, explore our referral reward program with payouts at interview and placement milestones. Visit our website for details. Deerfoot Recruitment Solutions Ltd acts as an Employment Business in relation to this vacancy.

Frequently asked questions

What qualifications does a Data Engineer typically need?
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.

What types of Data Engineer jobs are available on Haystack?
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.

How can I improve my chances of landing a Data Engineer role?
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.

Does Haystack list entry-level Data Engineer roles?
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.

Can I find remote or flexible Data Engineer jobs?
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.