Make yourself visible and let companies apply to you.
Roles
Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Senior Data Engineer
Careerwise
Not Specified
Remote or hybrid
Senior
£450/day - £625/day
RECENTLY POSTED

Data Engineer - Python/Spark/Databricks - Security Cleared

UK (Remote/Hybrid)

Long-term Contract

Active SC Clearance Required

Paying £450 - £625 per day (Inside IR35)

We are hiring for a specialist UK data consultancy, supporting a large UK Government body on a new, long-term data and IT transformation programme.

This programme focuses on modernising enterprise data platforms and delivering secure, scalable data solutions using modern lakehouse technologies.

Key Skills

  • Python, Apache Spark, Databricks
  • SQL and data modelling
  • Delta Lake/Lakehouse architectures
  • Building production-grade data pipelines

Requirements

  • Experience on large-scale data or transformation programmes
  • Comfortable working in secure/regulated environments
  • Active SC clearance
  • UK-based

Why Join

  • Long-term, stable government programme
  • Modern data stack
  • Opportunity to shape a new platform from the ground up

Apply with your CV

Further details will be shared during the interview process due to NDA.

Data and Insights Manager
Erin Associates
Multiple locations
Hybrid
Mid - Senior
£40,000 - £45,000
RECENTLY POSTED

Data & Insights Manager
Lytham St Annes, Lancashire
£45,000 + great benefits (BUPA private healthcare, pension, life assurance, bonus & more)

Are you a data-driven professional who enjoys turning customer data into meaningful insight? We’re working with a well-established and growing organisation in Lytham St Annes that’s expanding its Digital and IT function and is now looking for a Data & Insights Manager to join the team.

This is a varied and rewarding role where you’ll take ownership of customer data, helping the business better understand its customers and deliver more personalised, engaging experiences.

What you’ll be doing
Making sure customer data is collected accurately and consistently across all channels
Managing the flow of customer data between systems to ensure it’s secure, reliable, and accessible
Cleaning and analysing data to uncover trends, patterns, and insights
Creating reports for CRM and Marketing teams to support segmentation and targeted campaigns
Working closely with technical teams to improve data integration and quality
Building dashboards and reports so stakeholders can easily track customer KPIs

What we’re looking for
Experience in a data, insights, analytics, or data management role
Knowledge of CRM and CDP platforms, ideally within a retail or customer-focused environment
Strong SQL skills and experience with segmentation and data modelling
Confidence working with analytics and BI tools
A good understanding of data protection and privacy regulations
Nice to have (but not essential): experience with Bloomreach CDP, A/B testing, or eCommerce platforms
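As a purely illustrative sketch of the SQL-driven segmentation work this role describes (the schema, names and spend thresholds below are invented for the example, not taken from the employer):

```python
# Illustrative only: customer segmentation by lifetime spend using SQL.
# Table and column names ("orders", "customer", "total") and the 150
# threshold are hypothetical, not the organisation's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 120.0), ("alice", 80.0), ("bob", 30.0), ("cara", 300.0)],
)

# Aggregate per customer and assign a segment label in one query.
rows = conn.execute(
    """
    SELECT customer,
           SUM(total) AS lifetime_spend,
           CASE WHEN SUM(total) > 150 THEN 'high' ELSE 'standard' END AS segment
    FROM orders
    GROUP BY customer
    ORDER BY lifetime_spend DESC
    """
).fetchall()

for customer, spend, segment in rows:
    print(customer, spend, segment)
```

In practice the same GROUP BY / CASE pattern would run against a CRM or CDP database rather than an in-memory SQLite table.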
Why this role? You’ll be joining a friendly, collaborative Digital team where your work will genuinely make an impact. The business values insight-led decision making and will give you the opportunity to shape how customer data is used across the organisation.

This is primarily an office-based role in Lytham St Annes, with occasional flexibility to work from home.

Keywords: Data & Insights Manager, Customer Data, CRM, Analytics, SQL, Excel, Lancashire, Lytham St Annes, Blackpool, Preston, Fleetwood

Send your CV to Alex or call (phone number removed) to find out more.

Erin Associates is an equal opportunities employer and welcomes applications from all backgrounds. If you need any reasonable adjustments during the application process, please let us know.

DV Cleared Data Engineers Needed – Consultancy – AWS
Avanti Recruitment
Bristol
Hybrid
Mid - Senior
£50,000 - £70,000
RECENTLY POSTED

I’m looking for DV Cleared Data Engineers at all career levels to join a successful, multinational Consultancy across the UK in either their Bristol, Manchester or Belfast offices working on high profile client projects.

As a Data Engineer, you’ll design and implement cutting-edge data solutions that transform the clients’ businesses. You’ll work with cross-functional teams to create scalable, efficient architectures that turn complex data challenges into opportunities for innovation.

Your role

  • Design end-to-end data architectures that align with business objectives

  • Create cloud-native solutions leveraging PaaS, serverless, and container technologies

  • Build robust data pipelines for both batch and streaming processes

  • Collaborate with clients to understand their data landscape and requirements

To be considered you will be able to demonstrate skills and experience in many of the following:

  • Expertise in designing production-grade data pipelines using Python, Scala, Spark, and SQL

  • Deep knowledge of AWS Cloud Platforms (EMR, Glue, Redshift, Kinesis, Lambda, DynamoDB)

  • Experience with data processing across structured and unstructured sources

  • Strong scripting abilities and API integration skills

  • Knowledge of data visualization and reporting best practices

Desirable but not essential:

  • Experience with data mining and machine learning

  • Natural language processing expertise

  • Multi-cloud platform experience

They work within an Agile environment with Scrum practices and cross-functional, collaborative teams, and need someone who can work from the office 2 days a week.

Salary: £50,000 - £70,000 + 25 days holiday (option to buy 5 more) + pension + Performance Bonus + share options

Location: Hybrid working – 2 days a week in the office (Bristol, Manchester or Belfast).

We are looking for Data Engineers who are DV Cleared.

Data Migration Specialist
etiCloud
Sheffield
In office
Mid - Senior
£60,000
RECENTLY POSTED

Location: Sheffield
Contract Type: Permanent
Hours: Full time
Salary: £45,000 - £60,000 p/a depending on experience

The Data Migration Specialist is responsible for managing and delivering high-quality data migrations from legacy Case/Practice Management Systems (CMS/PMS) into our platform. This role owns the end-to-end migration lifecycle: extraction, transformation, validation, reconciliation, and loading of data using API-led and ETL approaches.

In addition, this role will advise on and help shape an internal data warehouse that will act as the foundation for migration preparation, cleansing, auditability, and as a single source of truth for database migrations.

Key Responsibilities

Lead the data migration workstream, including planning, scoping, mapping, cleansing, testing, and loading.
Extract data from legacy CMS/PMS and related systems and transform datasets to meet target data models.
Use API-based tools and integration frameworks to perform data imports, updates, and reconciliations.
Deliver migrations across core data domains including: Accounts / financial data, Matter / case data, Contacts, Documents and related metadata
Identify data quality issues, perform remediation, and work closely with clients to validate migrated data.
Execute trial migrations, delta migrations, and final production cutover loads.
Advise on and contribute to the design and development of an internal data warehouse to: Support migration data preparation and cleansing, Provide auditability and reconciliation reporting, Act as a single source of truth for migration datasets
Collaborate closely with the Implementation Lead and project team to align migration activities with overall delivery timelines.
Produce and maintain clear documentation for: Data mapping and transformation rules, Migration processes and runbooks, Data validation and reconciliation procedures
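By way of illustration only (this is not the employer's actual tooling), a minimal key-based reconciliation check of the kind described in the responsibilities above, comparing a legacy extract against migrated target data, might look like:

```python
# Illustrative sketch: reconcile a legacy extract against migrated data
# by primary key. Field names ("matter_id", "balance") are hypothetical
# examples, not the real CMS/PMS schema.

def reconcile(source_rows, target_rows, key="matter_id"):
    """Report row counts, keys missing from the target, and value mismatches."""
    source = {r[key]: r for r in source_rows}
    target = {r[key]: r for r in target_rows}

    missing = [k for k in source if k not in target]
    mismatched = [k for k in source if k in target and source[k] != target[k]]
    return {
        "source_count": len(source),
        "target_count": len(target),
        "missing_in_target": missing,
        "mismatched": mismatched,
    }

legacy = [
    {"matter_id": "M001", "balance": 1200.50},
    {"matter_id": "M002", "balance": 80.00},
    {"matter_id": "M003", "balance": 410.25},
]
migrated = [
    {"matter_id": "M001", "balance": 1200.50},
    {"matter_id": "M002", "balance": 95.00},  # value drifted during transformation
]

report = reconcile(legacy, migrated)
print(report)
```

Real migrations would run such checks per data domain (accounts, matters, contacts, documents) and feed the results into reconciliation reports ahead of trial and cutover loads.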

Skills & Experience Required

Proven experience delivering data migrations in legal, financial, accounts, or SaaS environments.
Strong understanding of ETL processes, data mapping, data quality, and validation techniques.
Hands-on experience using APIs for data import/export and system integrations.
Experience with scripting or development languages such as Python and/or C#.
Strong SQL skills, including querying, data analysis, and validation.
Experience working with relational databases and structured data models.
Ability to analyse, troubleshoot, and resolve data discrepancies efficiently and methodically.
Excellent organisational, documentation, and stakeholder communication skills.
Understanding of data governance, auditability, and reconciliation best practices.

Desirable

Experience working in professional services, legal tech, or enterprise SaaS implementations.

Joining etiCloud isn't just about the job, it's so much more. We want you to forge a successful and rewarding career in the IT industry. You'll be supported every step of the way in a friendly, professional environment where you and your future matter.

Here's a quick overview of what you can expect when you become part of our team:

Competitive salary with regular reviews to reward your progress
Annual company bonus recognising your hard work
Career development through ongoing training, support, and progression opportunities
28 days annual leave
Company pension scheme to support your future
Supportive, friendly team with a down-to-earth culture
Health & wellbeing benefits, including private medical insurance, health cash plan, and mental health support
Modern, secure Sheffield office with kitchen facilities and a coffee machine
Weekly fresh fruit as part of our wellbeing initiatives
Free onsite parking

Apply now and take your next step in the world of tech with etiCloud!

You may also have experience in the following: Data Migration Specialist, Data Migration Consultant, ETL Developer, Data Engineer, SQL Developer, Database Migration Consultant, Technical Implementation Consultant, Data Integration Specialist, Systems Integration Engineer, Data Warehouse Developer, Migration Engineer, Data Conversion Specialist, Application Data Consultant, Legal Tech Data Specialist, SaaS Implementation Data Consultant

REF-224 949

Integration and Data Engineer
Pilgrims Europe
Chippenham
In office
Mid - Senior
Private salary
RECENTLY POSTED

Location: Chippenham, UK (on-site)
Hours: Monday to Friday, office hours

At Oakhouse Foods, part of Pilgrim's Europe, data drives everything we do - from operational performance to customer experience. We're looking for a technically strong and commercially aware Integrations & Data Engineer to design, build and support resilient data pipelines and system integrations that power our business. This is a hands-on role where you'll combine technical expertise with problem-solving capability to reduce manual processes, improve automation, and deliver reliable, future-proof solutions.

The Role
You will ensure business data is collected, transformed, validated, stored and distributed through secure, scalable automated pipelines. Supporting both Business as Usual (BAU) and project delivery, you'll maintain and enhance existing integrations while developing innovative solutions aligned to architecture standards and modern best practices. This is an on-site role in Chippenham, working closely with IT, operational teams, suppliers and project stakeholders.

Key Responsibilities
Design, develop and support system integrations between platforms
Build and optimise scalable ETL/ELT pipelines
Ensure structured, reliable and validated data availability
Maintain documentation of data schemas, conventions and definitions
Implement robust validation, logging and error-handling mechanisms
Design and build automated workflows to reduce manual intervention
Select and implement appropriate tools for large-scale automation
Implement scheduling, retry mechanisms and resilience patterns
Identify automation opportunities through user engagement and process analysis
Resolve escalated data and integration support tickets with clear root cause analysis
Review and improve existing workflows and processes
Collaborate with suppliers and partners on support and project delivery
Maintain knowledge base documentation and technical guides
Provide technical input, effort estimation and risk assessment
Communicate clearly with both technical and non-technical stakeholders
Provide regular updates on progress, risks and blockers
Ensure delivered solutions are secure, scalable and maintainable

What We're Looking For
Microsoft SQL Server: experience managing SQL Server instances (security, users, monitoring, maintenance); advanced knowledge of Microsoft SQL Server, including writing DML queries, stored procedures, and functions.
SAP Business One: understanding of core SAP B1 processes and objects (business partners, documents, etc); experience customising the platform using add-ons such as Boyum B1UP; familiarity with SAP B1's underlying data structures.
Microsoft Fabric: experience ingesting data from APIs or databases into Fabric; skilled in using Dataflows, Power Query, and Data Warehouse tooling to cleanse, transform, and prepare data; ability to publish structured, reliable datasets for analytical and reporting use cases.
Microsoft Azure: hands-on experience with Azure Logic Apps for workflow automation; experience using Azure Function Apps for high-speed data manipulation and transfer operations; familiarity with additional Azure data and compute services beneficial to integration workloads.

Personal & Professional Skills
Strong analytical and critical thinking ability
Customer-focused approach to supporting internal teams and franchise partners
Ability to explain technical concepts clearly
Confidence mentoring support staff
Commercial awareness and understanding of how data drives business performance

Why Join Us?
Play a key role in modernising and automating business systems
Work in a collaborative environment where ideas are valued
Be part of a business backed by the strength of Pilgrim's Europe
Contribute to projects that directly impact operational performance and customer service
Opportunities to shape data architecture and influence technical direction
Competitive salary and benefits package

Ready to Build the Data Foundations That Power the Business?
If you're passionate about integration engineering, automation and delivering scalable data solutions that make a real impact, we'd love to hear from you. Apply now and help us build robust, future-ready systems at Oakhouse Foods - Pilgrim's Europe.

Data Lead - Insurance
High Finance Limited T/A HFG
London
Hybrid
Senior
Private salary
RECENTLY POSTED

A well-respected Insurance firm is looking to hire a Data Lead to play a key role in developing the company's enterprise data governance and master data capabilities. You will serve as the subject matter expert in data governance, data quality, and MDM tooling. You will oversee the master data across Finance, Actuarial, Claims, and Investments, ensuring that critical data is trusted, compliant, and aligned to regulatory frameworks.

KEY REQUIREMENTS:

  • Strong experience in data management, data governance, or MDM delivery, ideally within financial services; (re)insurance experience is preferred
  • Demonstrable experience embedding data ownership and stewardship frameworks within business operations
  • Proven track record implementing or managing enterprise MDM platforms
  • Hands-on experience in data quality management, including rule creation, measurement, and remediation workflows
  • Strong understanding of data governance frameworks (e.g., DAMA-DMBOK, DCAM) and regulatory expectations (GDPR, Solvency II, DORA)
  • Excellent knowledge of data integration and architecture patterns, particularly between MDM systems and data warehouses (e.g., Snowflake, Azure Data Services)
  • Understanding of the change management and adoption practices required to implement data governance successfully

Solution Architect - AI & Automation
Fruition Group
UK
Remote or hybrid
Mid - Senior
£75,000 - £85,000
RECENTLY POSTED

Job Title: Solutions Architect (AI and Automation)
Location: Remote (occasional travel to London required)
Salary: £75,000 - £85,000 (depending on location)

Would you like to work for an organisation that plays a pivotal role in safeguarding and improving the quality of health and social care services across England? My client improves outcomes across health and social care by using data, technology, and insight to drive meaningful change. With a strong focus on innovation and digital transformation, they are investing in modern cloud platforms, artificial intelligence, and advanced analytics to become a truly intelligence-led regulator.

Solutions Architect Responsibilities

  • Design scalable, secure, and ethical AI and data solutions using Microsoft Azure technologies.
  • Translate business requirements into end-to-end solution architecture across AI, machine learning, and data platforms.
  • Maintain architectural blueprints ensuring interoperability, resilience, governance, and regulatory compliance.
  • Lead architecture design reviews and provide technical leadership to delivery teams using Agile, DevOps, and CI/CD practices.
  • Mentor technical teams and promote responsible AI practices, including transparency, fairness, and security.

Solutions Architect Requirements

  • Extensive experience delivering solutions on the Microsoft Azure Data Platform (Synapse Analytics, Data Lake, Databricks, Azure ML, Power BI).
  • Expertise in modern data engineering approaches such as ETL/ELT pipelines, streaming, APIs, and event-driven architecture.
  • Experience with cloud-native architectures, microservices, and SaaS integrations.
  • Knowledge of data governance, information security, and AI ethics within regulated industries.
  • Excellent stakeholder engagement and leadership skills.

We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.

Senior/Lead Data Engineer
Catalyst
Newcastle upon Tyne
Hybrid
Senior
£100,000
RECENTLY POSTED

Our client is a successful tech scale-up, a cash-generative SME currently at c.£10m turnover and c.80 staff, looking to double in size in the next five years under the guidance of its experienced, highly charismatic and driven CEO, supported by a first-class senior leadership team. We are looking to strengthen the company's Data capability, working closely with a world-class CTO who leads a high-performing technology function, by appointing a new Senior/Lead Data Engineer.

As Senior/Lead Data Engineer you'll take ownership of a strategically critical data-warehouse build, working hands-on to shape, implement and evolve a modern AWS-based data platform. You'll act as the internal technical lead for a major project currently in discovery with an external provider, guiding technology choices, ensuring high-quality delivery and preparing the business for knowledge transfer and long-term ownership. Reporting directly to the CTO, you'll also help define and grow the internal data engineering function over time, setting standards, mentoring colleagues and ensuring the platform scales with the company's ambitions.

Your responsibilities as Senior/Lead Data Engineer will include:

Leading the build of a new cloud-based data warehouse, working hands-on with AWS technologies and modern data-engineering tooling (e.g. Snowflake, Redshift or equivalent)
Acting as the internal technical owner for the outsourced data-warehouse project, ensuring alignment between business needs, architectural decisions and delivery outcomes
Evaluating and selecting appropriate technologies, tools and patterns to support a scalable, secure and high-performing data platform
Managing knowledge-transfer activity from the external provider, embedding best practice and ensuring the business can confidently own and extend the platform
Designing and implementing robust data pipelines, modelling approaches and integration patterns to support analytics, reporting and operational use cases
Establishing engineering standards, documentation and processes to support long-term maintainability and future team growth
Providing guidance, coaching and leadership to colleagues as the function expands, fostering a positive, collaborative and delivery-focused culture
Working closely with product, engineering, finance and operational stakeholders to understand data needs, prioritise work and ensure timely delivery

As Senior/Lead Data Engineer you'll need:

Strong commercial experience as a Data Engineer, Senior Data Engineer or Lead Data Engineer, ideally gained in a fast-paced, growing technology-driven business
Hands-on expertise with AWS data services and modern data-warehouse technologies (e.g. Snowflake, Redshift, Glue, Lambda, S3, Step Functions)
Proven experience designing and building scalable data pipelines, data models and ETL/ELT processes
Strong understanding of data architecture, cloud engineering principles, security, governance and best-practice engineering standards
Ability to self-manage, take ownership of complex technical work and operate confidently as the internal lead on a major data-platform build
Experience collaborating with external suppliers or delivery partners, ensuring quality, alignment and successful handover
Excellent communication, problem-solving and stakeholder-management skills
A positive, pragmatic approach, balancing strategic thinking with hands-on delivery

Rewards and Benefits:
Highly negotiable salary, likely between £50,000 and £100,000 (plus benefits) with potential stretch to c.£130,000 for exceptional candidates
To secure maximum pay/rewards, you must possess all outlined experience, skills, knowledge and qualifications
Full-time, permanent role based in Newcastle upon Tyne with excellent hybrid and flexible working
Free parking
On-site health facilities, staff events and discounts, training and more

Please note: High levels of interest mean we will only contact you if your application is shortlisted and this will happen within five working days. You must also be eligible to work in the UK. Immediate or near-term availability (within two months) is strongly preferred.

Senior/Lead Data Engineer, Newcastle upon Tyne

Integration and Data Engineer
Pilgrims Europe
Chippenham
In office
Mid - Senior
Private salary
RECENTLY POSTED

Location: Chippenham, UK (on-site)
Hours: Monday to Friday, office hours

At Oakhouse Foods, part of Pilgrim's Europe, data drives everything we do - from operational performance to customer experience. We're looking for a technically strong and commercially aware Integrations & Data Engineer to design, build and support resilient data pipelines and system integrations that power our business. This is a hands-on role where you'll combine technical expertise with problem-solving capability to reduce manual processes, improve automation, and deliver reliable, future-proof solutions.

The Role
You will ensure business data is collected, transformed, validated, stored and distributed through secure, scalable automated pipelines. Supporting both Business as Usual (BAU) and project delivery, you'll maintain and enhance existing integrations while developing innovative solutions aligned to architecture standards and modern best practices. This is an on-site role in Chippenham, working closely with IT, operational teams, suppliers and project stakeholders.

Key Responsibilities
Design, develop and support system integrations between platforms
Build and optimise scalable ETL/ELT pipelines
Ensure structured, reliable and validated data availability
Maintain documentation of data schemas, conventions and definitions
Implement robust validation, logging and error-handling mechanisms
Design and build automated workflows to reduce manual intervention
Select and implement appropriate tools for large-scale automation
Implement scheduling, retry mechanisms and resilience patterns
Identify automation opportunities through user engagement and process analysis
Resolve escalated data and integration support tickets with clear root cause analysis
Review and improve existing workflows and processes
Collaborate with suppliers and partners on support and project delivery
Maintain knowledge base documentation and technical guides
Provide technical input, effort estimation and risk assessment
Communicate clearly with both technical and non-technical stakeholders
Provide regular updates on progress, risks and blockers
Ensure delivered solutions are secure, scalable and maintainable

What We're Looking For
Microsoft SQL Server: experience managing SQL Server instances (security, users, monitoring, maintenance); advanced knowledge of Microsoft SQL Server, including writing DML queries, stored procedures, and functions.
SAP Business One: understanding of core SAP B1 processes and objects (business partners, documents, etc); experience customising the platform using add-ons such as Boyum B1UP; familiarity with SAP B1's underlying data structures.
Microsoft Fabric: experience ingesting data from APIs or databases into Fabric; skilled in using Dataflows, Power Query, and Data Warehouse tooling to cleanse, transform, and prepare data; ability to publish structured, reliable datasets for analytical and reporting use cases.
Microsoft Azure: hands-on experience with Azure Logic Apps for workflow automation; experience using Azure Function Apps for high-speed data manipulation and transfer operations; familiarity with additional Azure data and compute services beneficial to integration workloads.

Personal & Professional Skills
Strong analytical and critical thinking ability
Customer-focused approach to supporting internal teams and franchise partners
Ability to explain technical concepts clearly
Confidence mentoring support staff
Commercial awareness and understanding of how data drives business performance

Why Join Us?
Play a key role in modernising and automating business systems
Work in a collaborative environment where ideas are valued
Be part of a business backed by the strength of Pilgrim's Europe
Contribute to projects that directly impact operational performance and customer service
Opportunities to shape data architecture and influence technical direction
Competitive salary and benefits package

Ready to Build the Data Foundations That Power the Business?
If you're passionate about integration engineering, automation and delivering scalable data solutions that make a real impact, we'd love to hear from you. Apply now and help us build robust, future-ready systems at Oakhouse Foods - Pilgrim's Europe.

The company
Pilgrim's Europe produces some of the best-known and most iconic brands in the UK and Ireland, including Fridge Raiders, Rollover, Denny, Richmond, Oakhouse and Moy Park, alongside a diverse range of industry-leading own-label products in categories including fresh pork, lamb and chicken, working with all the major retailers and food service outlets. Our portfolio extends to authentic chilled and frozen ready meals, snacking ranges, added value and food service products. Across Pilgrim's Europe we combine 20,000 of the best people in the industry, united by a shared set of core values and a passion for producing the highest quality, most delicious and innovative food, which is enjoyed by millions of people in the UK, Ireland and Europe every day. Our Pilgrim's Europe team are based in our Pilgrim's UK, Moy Park, Pilgrim's Food Masters and Pilgrim's Shared Services businesses.

What we'll bring to the table
Competitive salary
Competitive holiday entitlement
Pension contribution
Family-friendly policies
Learning and development opportunities
Life assurance

Our values
Determination, Simplicity, Availability, Humility, Discipline, Sincerity, Ownership

Azure Cosmos DB Developer
Stackstudio Digital Ltd.
London
Hybrid
Mid - Senior
£450,000 - £500,000

Role / Job Title: Azure Cosmos DB Developer
Work Location: London
Mode of Working: Hybrid

  • Office Requirement: 1 day a week mandatory in office

The Role
The role will be integral to realising the customer’s vision and strategy in transforming some of their critical application and data engineering components. As a global financial markets infrastructure and data provider, the customer keeps abreast of the latest cutting-edge technologies enabling their core services and business requirements. The role is critical in this endeavour by means of enabling the technical and quality-assurance thought leadership and excellence required for the purpose.

Your Responsibilities

  • Develop cloud-native applications using Azure Cosmos DB (SQL API, Mongo API).
  • Build reusable libraries and frameworks in C#/.NET or Node.js
  • Establish robust CI/CD pipelines through GitLab DevOps to streamline delivery and reduce operational overhead.
  • Design efficient data models tailored to NoSQL principles and application requirements.
  • Write and tune queries to minimize RU consumption and improve response times.
  • Implement indexing strategies and partitioning schemes for optimal performance.
  • Develop efficient queries using Cosmos DB SQL syntax.
  • Implement automated testing and unit test frameworks.
  • Collaborate with solution architects and DevOps teams to integrate Cosmos DB into microservices
  • Ensure compliance with security, governance, and data protection standards

Your Profile

Essential Skills / Knowledge / Experience

  • Hands-on experience with Azure Cosmos DB including query optimisation and throughput management
  • Thorough understanding of Cosmos DB Change Data Feed (CDF) and integration and debugging of Spark with Cosmos CDF.
  • High technical proficiency with Cosmos DB SDKs (e.g., .NET, Java, Python, Node.js)
  • Thorough understanding of Spark distributed computing concepts.
  • High technical proficiency in PySpark and Spark Concepts.
  • Experience with concurrency patterns, CLR, and scalable application design.
  • Deep understanding of Azure services including Azure Functions, App Services, AKS, and Logic Apps.
  • Experience with SQL API and familiarity with other APIs.
  • Strong understanding of partitioning, indexing, and consistency levels.
  • Experience with Git, version control, and continuous integration tools.
  • Strong goal-oriented outlook, problem solving capabilities, written and verbal communication skills, and collaborative mindset to work with internal and external stakeholders.

Data Engineer
Opus Recruitment Solutions
London
Hybrid
Mid - Senior
£450/day - £500/day

Data Engineer | Outside IR35 | £450 - £500 per day | 6 months | Hybrid, London

We’re supporting a company who are looking for a Data Engineer to build and enhance the data processing capabilities within their Databricks environment. You’ll be responsible for developing the code that drives their data pipelines, using Python, Spark, and Databricks Workflows to deliver new platform functionality and ensure efficient execution.

Key Responsibilities
Develop reliable Python and PySpark code to support data ingestion, transformation, and end-to-end processing.
Deliver new technical features and components aligned to approved solution designs and business requirements.
Enhance, extend, and tune existing data frameworks to support additional use cases and improved performance.
Create, manage, and optimise Databricks Workflows, including orchestration logic and operational behaviours.
Carry out testing, performance tuning, and provide day-to-day operational support for data pipelines.
Work closely with Solution Designers / Architects and Configuration Analysts to ensure consistent and effective delivery.

If this role suits your skillset, you can work onsite 2 days per month, and you are immediately available, please apply to the job advert directly or reach out to me at (url removed).

Data Engineer
SR2
Exeter
Hybrid
Mid
ÂŁ50,000

Data Engineer - Python / SQL / Databricks - Healthcare Tech - Hybrid (2 Days Onsite) - ÂŁ50,000

The Role

We’re working with a healthcare technology company investing in their data capability and hiring a Data Engineer to work directly alongside Product and Engineering.

This is not a “report factory” role - you’ll analyse data, challenge assumptions and influence product direction.

What You’ll Be Doing

Designing and developing scalable data solutions
Python & SQL development
ETL and data transformation
Working with Databricks (certification desirable, not essential)
Power BI reporting
Writing unit tests and mocking
Presenting insights to internal stakeholders
Collaborating closely with Product Owners and Developers

Tech Stack

Python
T-SQL
Databricks
Power BI
ETL pipelines
Zoho Creator/Analytics (desirable, not required)

What They’re Looking For

Strong communicator - confident presenting insights
Curious mindset - challenges assumptions
Experience analysing data, not just visualising it
Comfortable working directly with Product
Unit testing & mocking experience
Proactive and collaborative personality
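The unit-testing-and-mocking skill listed above can be illustrated with Python's standard `unittest.mock`; `summarise` and `fetch_orders` are hypothetical names used only for this sketch, not part of the role:

```python
# Mocking an external dependency so business logic can be tested without a network.
from unittest.mock import Mock

def summarise(client) -> float:
    """Sum order totals fetched from an external API client."""
    return sum(order["total"] for order in client.fetch_orders())

# In a test, the real client is replaced by a Mock returning canned data:
fake_client = Mock()
fake_client.fetch_orders.return_value = [{"total": 10.0}, {"total": 2.5}]
result = summarise(fake_client)          # computed purely from the mocked data
fake_client.fetch_orders.assert_called_once()
```

The pattern keeps the test fast and deterministic: the assertion checks both the computed value and that the dependency was actually called.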

Location

Hybrid - 2 days per week onsite (1 fixed team day + 1 flexible).

Easy access from M5/A30.

Process

30 min intro
Python & SQL tech test
Technical deep dive
Final 90 min interview

Business Intelligence Developer
Michael Page
Birmingham
Hybrid
Mid - Senior
ÂŁ45,123 - ÂŁ49,046

Join a forward-thinking public sector organisation as a Business Intelligence Developer in Birmingham, Cardiff or Leeds. This role focuses on utilising analytics to drive data-driven decisions and enhance operational efficiency.

Client Details

This public sector organisation operates within the analytics field and is committed to improving processes and decision-making through data insights. As a large organisation, it offers the opportunity to make a meaningful impact while working on diverse and impactful projects.

Description

  • Develop and maintain business intelligence solutions to support organisational goals.
  • Create Power BI dashboards and reports to present data insights effectively.
  • Work closely with the Performance Analysts, lead BI Developer, CRM and SharePoint specialists to ensure that new and changed reporting requirements are properly captured prior to analysis and development.
  • Lead in the design and support of robust routines for the production and delivery of reliable, accurate, agreed Management Information from key systems (case management, telephony, finance and HR)
  • Assess requirements, design solutions, document models, and deliver ETL solutions using SSIS and Azure Data Factory.
  • Develop and modify existing ETL models to support changes to the business process or emerging business needs.
  • Collaborate with stakeholders to gather and understand data requirements.
  • Ensure data quality, accuracy, and integrity across all reporting systems.
  • Analyse complex datasets to identify trends and opportunities for improvement.
  • Provide technical expertise in the development of data models and visualisation tools.
  • Support the implementation of new analytics technologies and methodologies.
  • Train and support team members on the use of business intelligence tools.

Profile

A successful Business Intelligence Developer should have:

  • Proficiency in Power BI / business intelligence tools and data visualisation techniques.
  • A strong background in data analysis and reporting.
  • Strong knowledge of ETL / Integration processes
  • Demonstrable skills in SQL and SSIS
  • Knowledge of Azure environments / Azure Data Factory is highly desirable
  • Experience in managing and processing large datasets.
  • Knowledge of database management systems and query languages.
  • The ability to work collaboratively with cross-functional teams.
  • Strong problem-solving skills and attention to detail.
  • A degree or equivalent qualification in a relevant field, such as computer science or analytics.

Job Offer

  • Competitive salary ranging from ÂŁ45,123 to ÂŁ49,046.
  • Generous 10% pension match scheme.
  • Hybrid working arrangement with one day per week in the office.
  • Permanent role in a reputable public sector organisation.
  • Opportunities for professional growth and development within analytics.
  • Work in a central Birmingham location, with accessible transport links.

If you are ready to take the next step in your career as a Business Intelligence Developer in the public sector, apply today to join this impactful organisation in Birmingham.

Backend Software Engineer (Junior–Mid Level) – Programmatic
Mobkoi
City of London
Hybrid
Junior - Mid
Private salary

Location: Hybrid – UK
Contract Type: Permanent
Hours: Full time
Salary: Competitive

COMPANY OVERVIEW

MOBKOI is a fast-growing mobile company headquartered in London, with offices across central Europe, the US and Asia. We use the latest in mobile ad technology to help premium brands effectively reach and engage with their clients' audiences. We bring to market a highly selective site list of premium global and local publishers, enabling brands to choose where their ads will run and thus ensuring brand safety. MOBKOI prides itself on offering full transparency, bespoke creative builds and local market coordination. We are part of The Brandtech Group, formerly known as You & Mr Jones, working with partners developing the best technology across the globe.

ROLE OVERVIEW

As a Backend Software Engineer (Junior–Mid Level) – Programmatic, you will contribute to the development and evolution of MOBKOI’s programmatic advertising platform. You will work closely with experienced engineers on backend systems that power real-time bidding (RTB), campaign delivery, and large-scale integrations with SSPs and DSPs. This role is well suited to engineers early in their career who want to deepen their expertise in distributed systems, low-latency services, and cloud-based infrastructure, while gaining exposure to the programmatic advertising ecosystem. You will be supported through mentorship, code reviews, and progressive ownership of backend components.

RESPONSIBILITIES

  • Contribute to the development and improvement of MOBKOI’s programmatic backend systems, including components of the real-time bidding (RTB) pipeline.
  • Support the design and implementation of scalable, reliable, and high-performance services handling large volumes of traffic.
  • Work alongside senior engineers to help evolve system architecture while balancing performance, cost efficiency, and reliability.
  • Assist in monitoring system performance and contribute to initiatives that optimise infrastructure usage and operational cost.
  • Participate in diagnosing and resolving issues related to distributed systems, including latency, availability, and scaling challenges.
  • Implement backend components following best practices in fault tolerance, load balancing, and resilient design.
  • Develop an understanding of the programmatic advertising ecosystem, including SSPs, DSPs, bidding flows, and ad exchanges.
  • Contribute to discussions around technology selection, tooling, and build-vs-buy decisions.
  • Collaborate with product, data, and business teams to ensure technical solutions align with business needs.
  • Actively participate in code reviews, technical discussions, and continuous improvement initiatives.

KEY SKILLS & COMPETENCIES

  • Experience in backend software development, gained through a first professional role, internship, or apprenticeship.
  • Solid engineering fundamentals with Go (or another backend language with a strong willingness to learn Go).
  • Interest in or exposure to distributed systems and high-traffic environments.
  • Understanding of core concepts around scalability, reliability, and high availability.
  • Familiarity with cloud platforms such as AWS or GCP.
  • Awareness of cost-efficient engineering practices and infrastructure optimisation.
  • Curiosity about the programmatic advertising landscape and how technical systems support business outcomes.
  • Strong collaboration skills and the ability to communicate clearly within a cross-functional team.

BEHAVIOURS

This role operates in a fast-moving, high-standards environment and requires someone who is motivated to learn, take ownership, and deliver quality work. The role holder is expected to manage their time effectively, adapt to change, and contribute positively to team and business outcomes. The role offers autonomy and is suited to someone who is comfortable balancing independent work with cross-functional collaboration.

What we’re looking for:

  • Motivation to contribute to the growth and success of the business
  • A proactive, adaptable mindset with the ability to learn quickly
  • Strong ownership and accountability for outcomes
  • Clear and professional communication with a range of stakeholders

You may also have experience in the following: Backend Software Engineer, Junior Backend Engineer, Mid-Level Backend Developer, Go Developer (Golang Engineer), Software Engineer – Backend, Programmatic Engineer, AdTech Engineer, RTB Engineer (Real-Time Bidding Engineer), Distributed Systems Engineer, Cloud Backend Engineer (AWS / GCP), Platform Engineer – AdTech, API Developer (Backend), High-Performance Systems Engineer, Programmatic Advertising Developer, Software Engineer – AdTech

REF-(Apply online only)

Data Engineer
SR2
Exeter
Hybrid
Junior - Mid
ÂŁ50,000

Data Engineer - Python / SQL / Databricks - Healthcare Tech - Hybrid (2 Days Onsite) - ÂŁ50,000

The Role

We’re working with a healthcare technology company investing in their data capability and hiring a Data Engineer to work directly alongside Product and Engineering.

This is not a “report factory” role - you’ll analyse data, challenge assumptions and influence product direction.

What You’ll Be Doing

  • Designing and developing scalable data solutions
  • Python & SQL development
  • ETL and data transformation
  • Working with Databricks (certification desirable, not essential)
  • Power BI reporting
  • Writing unit tests and mocking
  • Presenting insights to internal stakeholders
  • Collaborating closely with Product Owners and Developers

Tech Stack

  • Python
  • T-SQL
  • Databricks
  • Power BI
  • ETL pipelines
  • Zoho Creator/Analytics (desirable, not required)

What They’re Looking For

  • Strong communicator - confident presenting insights
  • Curious mindset - challenges assumptions
  • Experience analysing data, not just visualising it
  • Comfortable working directly with Product
  • Unit testing & mocking experience
  • Proactive and collaborative personality

Location

Hybrid - 2 days per week onsite (1 fixed team day + 1 flexible).

Easy access from M5/A30.

Process

  • 30 min intro
  • Python & SQL tech test
  • Technical deep dive
  • Final 90 min interview
Data Architect
Michael Page
Not Specified
Fully remote
Mid - Senior
Private salary
TECH-AGNOSTIC ROLE

This 3-6 month contract for a Data Architect focuses on shaping and managing data architecture to support analytics initiatives. The role requires expertise in designing and implementing scalable data solutions.

Client Details

They are a not-for-profit organisation based in the North West.

Description

  • Design and implement robust data architecture frameworks to support analytics projects.
  • Collaborate with cross-functional teams to ensure data solutions align with business objectives.
  • Develop scalable data models and optimise database performance.
  • Ensure data governance and compliance with relevant regulations.
  • Provide technical guidance on data integration and transformation processes.
  • Evaluate and recommend tools and technologies for data management.
  • Document data architecture and maintain up-to-date records.
  • Monitor and troubleshoot data-related issues to ensure seamless operations.

Profile

A successful Data Architect should have:

  • Proven experience in data architecture and analytics within the life science industry.
  • Strong knowledge of data modelling, database design, and data integration techniques.
  • Expertise in relevant tools and technologies for data management and analytics.
  • Understanding of data governance and regulatory compliance requirements.
  • Ability to collaborate effectively with multidisciplinary teams.
  • Strong problem-solving skills and attention to detail.

Job Offer

Daily rate of ÂŁ450 to ÂŁ600 - outside IR35

3 month contract, with likely extension

Fully remote role

Power BI Developer
Fusion People Ltd
London
Fully remote
Mid - Senior
ÂŁ500/day - ÂŁ600/day

Power BI Developer - Construction, Rail & Civil Engineering

Department:

Commercial / Project Controls / Digital & Data

Reports To:

Head of Project Controls / Digital Transformation Manager

Location:

Working from home

Employment Type:

Contract - (Outside IR35)

Role Overview

We are seeking an experienced Power BI Developer to support major infrastructure, rail, and civil engineering projects by delivering high-quality business intelligence and data analytics solutions.

The successful candidate will work closely with Project Managers, Commercial Managers, Planners, and Senior Leadership teams to transform complex cost, programme, and operational data into clear, actionable dashboards that support performance improvement, cost control, and strategic decision-making.

Key Responsibilities

  1. Reporting & Dashboard Development
  • Design, develop, and maintain interactive dashboards and reports
  • Produce reporting for:
  • Cost Value Reconciliation (CVR)
  • Earned Value Management (EVM)
  • Programme performance (SPI / CPI)
  • Resource and plant utilisation
  • Risk and opportunity registers
  • Health & Safety KPIs
  • Develop executive-level portfolio dashboards across multiple projects
  • Automate monthly reporting packs and board reports
  • Ensure dashboards are visually clear, accurate, and aligned with business KPIs
  2. Data Integration & Modelling
  • Integrate data from ERP, planning, commercial, and site systems
  • Develop and maintain robust data models
  • Create advanced DAX measures and calculations
  • Optimise report performance and data refresh processes
  • Ensure data accuracy, governance, and consistency across systems
  3. Project Controls & Commercial Support
  • Support cost forecasting and trend analysis
  • Monitor project margins, cash flow, and cost-to-complete
  • Provide scenario modelling and performance insights
  • Support change management and commercial reporting requirements
  • Assist in developing standardised reporting frameworks across projects
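For context on the Earned Value Management metrics mentioned above, SPI and CPI reduce to two standard ratios; the figures below are invented purely for illustration:

```python
# Standard Earned Value Management ratios behind the SPI / CPI reporting above.
def evm_metrics(ev: float, pv: float, ac: float) -> dict:
    """SPI = EV / PV (schedule performance); CPI = EV / AC (cost performance)."""
    return {"SPI": ev / pv, "CPI": ev / ac}

# Hypothetical project: ÂŁ80k earned value, ÂŁ100k planned, ÂŁ90k actual cost.
m = evm_metrics(ev=80_000, pv=100_000, ac=90_000)
# SPI < 1 indicates the project is behind schedule; CPI < 1 indicates over budget.
```

In a Power BI model these would typically be implemented as DAX measures over the cost and programme tables, but the underlying arithmetic is exactly this.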

Technical Skills & Experience

  • Advanced Power BI development experience (Desktop & Service)
  • Strong knowledge of DAX and Power Query (M language)
  • Proficiency in SQL and relational databases
  • Experience working with construction, rail, or civil engineering datasets
  • Understanding of:
  • Project controls processes
  • Earned Value Management principles
  • Programme performance metrics
  • Commercial reporting structures
  • Experience integrating with Excel and enterprise systems
  • Exposure to cloud-based data environments (desirable)

Qualifications

  • Degree in Data Analytics, Engineering, Construction Management, or related discipline
  • A minimum of 3 years’ experience in Business Intelligence within construction, rail, or infrastructure sectors
  • Relevant Microsoft certification (e.g., Power BI Data Analyst) desirable

Key Competencies

  • Strong commercial awareness
  • Analytical and problem-solving mindset
  • Ability to interpret engineering and programme data
  • Excellent stakeholder engagement skills
  • High attention to detail and data accuracy
  • Ability to manage multiple project deadlines

Desirable Experience

  • Experience on major infrastructure frameworks
  • Familiarity with NEC or JCT contracts
  • Knowledge of project lifecycle reporting (tender through to handover)
  • Experience supporting multi-project or portfolio-level reporting

If you are interested in hearing more please contact John Baker or Kat Oxlade

Fusion People are committed to promoting equal opportunities to people regardless of age, gender, religion, belief, race, sexuality or disability. We operate as an employment agency and employment business. You’ll find a wide selection of vacancies on our website.

Murex Datamart Developer
Arm
London
Hybrid
Mid - Senior
ÂŁ600/day - ÂŁ675/day

7-8 Month contract
3 days per week on site in Central London
(Apply online only) per day (Inside IR35)

Job Summary:
We are seeking an experienced Murex consultant based onshore (London) with a strong background in the MX3.1 Datamart module. The ideal candidate will possess a very good technical understanding of the Murex platform and its reporting functionality, and ideally have a sound working knowledge of rates, FX, credit and commodities products.
The candidate will be responsible for working on Murex reporting changes, enhancements and bug fixes for business users of the application. The will cover development, testing and preparation for release to production.

Responsibilities

  • Handle the end-to-end design, development and deployment packaging of Murex Datamart solutions, aligning with user requirements
  • Configure and implement different Datamart objects with an optimised solution as per Murex best practices
  • Work on reporting-related bugs and issues, and collaborate with users directly to understand the issues and requirements
  • Implement the agreed solutions and gather approval on the solution from users and downstream stakeholders
  • Handle the testing of the solution, documentation of the detailed technical specifications and runbook creation
  • Reconcile report outputs and explain any differences
  • Participate in peer reviews of technical specifications and testing documentation

Technical Skills-

  1. Murex MX3.1 Datamart (required)
  2. SQL, Unix (required)
  3. GIT, Jenkins, JIRA, MxTest, Control-M (desirable)

Required Qualifications

  • Bachelor’s degree in Finance, Computer Science, or a related field
  • Strong experience with the Murex Datamart module
  • Strong technical skills, including proficiency in SQL, Unix/Linux and other relevant programming languages
  • Understanding of financial products including rates & commodities
  • Excellent problem-solving and analytical skills
  • Strong communication and interpersonal skills, with the ability to work effectively in a team environment

Disclaimer:

This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited (“ARM”). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission. Where the role is marked as Outside IR35 in the advertisement this is subject to receipt of a final Status Determination Statement from the end Client and may be subject to change.

Data Governance & Quality Analyst
JLR Search Ltd
London
In office
Mid - Senior
ÂŁ450/day - ÂŁ500/day

A leading financial services company has an urgent 6-month+ (inside IR35) requirement for a Data Governance & Quality Analyst. Working under the direction of the business, you will provide hands-on support in executing data stewardship and governance activities, maintaining data quality, metadata and lineage, and supporting the implementation of governance standards, processes and tools, ensuring the organisation can rely on accurate, well-managed data for regulatory compliance, analytics and operational decision making.

Key Responsibilities

Support the execution of strategic priorities for developing Data Governance capabilities, ensuring alignment with the data strategy, Data Protection Policy, SII data policy and the enterprise governance framework.

Key Skills / Experience

* Expertise in Data Governance concepts and best practice

* Demonstrable skills in Data Quality Analysis.

* Solid understanding of GDPR and The Data Protection Act 2018

* Experience in Microsoft Purview Data Governance is essential

* Working knowledge of Profisee (MDM) tooling is required

* Understanding of financial regulations and regulatory reporting

* Auditing experience

* Knowledge of or skills in Data warehousing, Data Lake and Big Data solutions (understanding SQL would be useful)

* Knowledge of cloud-based big data frameworks such as data lakes, relational, graph and other NoSQL databases

* Familiar with Cloud and Data Management trends, including open source projects, methodologies (connect and collect, hub and spoke, data fabrics, etc.) and leading commercial vendors that relate to data acquisition, management and the semantic web

* Microsoft Server technologies (Azure, T-SQL, SSIS, SSRS, Power BI) is desirable

* Understanding of Master Data Management technology landscape, processes and design principles

* Operational familiarity with metadata management, data quality, and data stewardship tools and platforms. Experience of Microsoft Purview is desirable.

* Data lineage knowledge - ability to perform root cause analysis

* Proven track record in operating large Data Governance programs and managing enterprise data assets in a complex organisation

* Creating and implementing Data Governance frameworks and policies

* Experience using Data Governance & Data Quality systems and tools

* Experience querying databases using SQL is essential

* Experience with SQL Server (T-SQL, SSIS, SSRS, MDS) is desirable.

* Experience with Power BI

* Knowledge of data sources, transformation rules, and use of the data for the area of Data Stewardship

* Experience in the use of data catalogues and data quality technologies

* Experience of working within the financial sector

Data Engineer (BD&A - DAPM Live Service Support) - Hybrid
CBSbutler Holdings Limited trading as CBSbutler
Telford
Hybrid
Mid - Senior
ÂŁ400/day - ÂŁ430/day

Job Title: Data Engineer (BD&A - DAPM Live Service Support)

Max Rate: ÂŁ430 per day inside IR35

Duration: 6 months

Location: Telford/hybrid (2 days per week onsite)

Active SC security clearance is required for this role.

Job Description:

We are seeking an SC Cleared Live Support & Monitoring Engineer to provide operational support across a suite of data integration and analytics platforms. This role focuses on maintaining stability, enhancing monitoring capability, and improving service visibility through consolidated dashboards and intelligent alerting.

Responsibilities

Live Service Support

Provide ongoing live support across platforms including:
Denodo
Talend
Pentaho Data Integration (PDI)
Git
MySQL
Amazon Redshift
Investigate, diagnose and resolve incidents across data and integration services
Work closely with technical teams to maintain service availability and performance

Grafana Monitoring & Alerting

Design, create and consolidate Grafana dashboards
Transform multiple independent dashboards into a unified Live Service view with drill-down capability by service
Gather monitoring requirements from stakeholders
Configure and implement alerting for legacy services that currently lack monitoring
Deliver fit-for-purpose alert thresholds and notifications aligned to operational needs
Improve visibility, observability and proactive incident management

Experience & Skills

Essential

Active SC Clearance
Experience supporting live production environments
Exposure to data platforms such as Denodo, Talend, PDI, MySQL or Redshift
Experience creating or maintaining Grafana dashboards
Understanding of monitoring, alerting and service observability principles
Strong troubleshooting and analytical skills
Ability to gather requirements and translate them into monitoring solutions

Desirable

Experience configuring Grafana alerting
Experience working in a client-side environment
Knowledge of legacy system monitoring uplift
Familiarity with Git version control

Lead Data Modeller
Tria Recruitment
West Midlands
Hybrid
Senior
ÂŁ80,000
TECH-AGNOSTIC ROLE

Hybrid - Worcester - 3 days a week

ÂŁ70,000 - ÂŁ80,000

Our client is looking for a Lead Data Modeller to take ownership of developing their data architecture and modelling capability. In this role you will focus on establishing frameworks, standards and scalable data models that align with strategic business objectives, alongside continuous improvement across data governance and architecture.

You’ll need proven experience leading the end-to-end delivery of data modelling initiatives, including building and implementing common data models, while managing internal teams and collaborating closely with third-party suppliers. Strong stakeholder engagement is essential, along with the ability to translate between technical architecture and business requirements.

We’re looking to speak with candidates who have;

  • Experience in delivering data architecture and data modelling programmes
  • Experience creating a common data model (preferably on Datasphere)
  • Strong understanding of data governance, metadata management and regulatory frameworks
  • Exceptionally strong stakeholder management skills
  • Experience leading a team is beneficial

Please apply for more information.

Frequently asked questions
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.