Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
SAS Data Engineer
Deerfoot Recruitment Solutions Limited
Telford
Hybrid
Mid - Senior
£50,000 - £70,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

SAS Consultant/Data Engineer
Location: Telford or Worthing (hybrid working 2 days onsite)
Type: Full Time, Permanent
Salary: £50,000 - £70,000 DOE + comprehensive benefits package

Deerfoot Recruitment is working with a major consultancy partner on a long-term public sector engagement and is seeking experienced SAS Consultants/Data Engineers to join a growing data team.
This is a key role within a large-scale data portfolio, supporting critical programmes focused on revenue optimisation, fraud detection, data management and analytics. The successful candidate will play a hands-on role in designing, building and supporting robust SAS-based data solutions, working closely with product owners, architects, engineers and senior client stakeholders.

Key responsibilities include:

  • Designing and delivering secure, high-performance SAS solutions for data integration and analytics
  • Building and enhancing data pipelines covering ingestion, transformation, reporting and fraud detection
  • Supporting live services, incident resolution and continuous improvement
  • Collaborating in Agile delivery teams and contributing to engineering best practice

Key skills and experience:

  • A minimum of 5 years' experience as a data engineer or in a similar role
  • Strong background as a Data Engineer on complex, large-scale data platforms
  • Proven expertise with SAS 9.x
  • SAS Viya (3.x/4) experience is a bonus
  • Solid ETL, data modelling and batch scheduling experience
  • Understanding of performance optimisation, CI/CD and scalable solution design
  • Confident client-facing and consultancy skills

Security Clearance:
This role requires SC clearance, or eligibility to obtain it. Applicants must have lived in the UK continuously for the past 5 years. Some restrictions may apply based on nationality and residency.
This is an excellent opportunity to work on high-impact public sector systems within a collaborative, technically strong environment.

Apply today to find out more.

SAS Consultant/Senior SAS Consultant/SAS Developer/SAS Data Engineer/SAS Programmer/Lead SAS Programmer/SAS Analytics Consultant/SAS Technical Consultant/SAS Solutions Consultant/Data Engineer/Senior Data Engineer/Analytics Engineer/Data Platform Engineer/Data Integration Engineer/Data Pipeline Engineer/Data Solutions Engineer/Enterprise Data Engineer

Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn’t right for you, explore our referral reward program with payouts at interview and placement milestones. Visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.

Lead Data Engineer - Databricks
JLA Resourcing Ltd
Basingstoke
Hybrid
Senior
£75,000
RECENTLY POSTED

Lead Data Engineer - Databricks - £75-80k + bonus + benefits - Basingstoke, 3 days a week

The Opportunity:
We are looking for a Lead Data Engineer with strong Databricks experience to join a Basingstoke-based organisation that is investing heavily in its Digital Transformation Programme.

The Role:
You’ll play a proactive role in the delivery of next-generation data platforms, manage and mentor the existing engineer, and drive the design, development and governance of the data pipelines. You’ll work closely with stakeholders across the technology function and the wider business, and will ensure the availability, integrity and compliance of the systems. You’ll play a key role in owning the core architecture and engineering of the new Azure Databricks ecosystem, including incorporating AI/ML capability.

They are currently working with a 3rd Party Data Partner who have recommended a number of improvements. You’ll work closely with them on selecting, implementing and managing technology, so it’s a great opportunity to really make a difference.

The Person:
Key to this is proactivity - they’re really looking for someone who is always asking “what’s next” but is also able to deliver and engineer “the now”.

Skills / attributes to include:

  • In depth experience of modern data solution architecture design and delivery in a hybrid cloud environment but predominantly Azure / Databricks
  • The ability to lead a small team whilst contributing personally
  • Be able to drive agile, modern engineering practices
  • Champion quality, observability and good engineering discipline, ensuring pipelines and models run cleanly and predictably

IT Systems Engineer
Manufacturing Recruitment Limited
Diss
Remote or hybrid
Mid - Senior
£60,000
RECENTLY POSTED

Someone who can own 3rd-line support, build Power Apps, work confidently with ERP (Enterprise Resource Planning) systems, ideally Epicor ERP data, improve cyber security, and help move toward a data platform using Power BI/Fabric.

3rd Line Support & IT Operations Ownership

  • Strong Windows Server / Microsoft 365 / Azure AD troubleshooting
  • Confident owning escalations (not just passing tickets)
  • Fortinet Firewall
  • Comfortable managing MSPs: SLAs, ticket quality, challenge bad advice

Cyber Security & Risk Reduction

  • Baseline cyber frameworks (Cyber Essentials / ISO-aware)
  • Microsoft security stack familiarity (Defender, MFA, Conditional Access)
  • Patch management & vulnerability awareness
  • Can implement controls, not just write policies

Security must be baked into delivery not an afterthought.

Power Platform, Fabric & Data Modelling

  • Power BI (data modelling, DAX fundamentals)
  • Power Query / ETL thinking
  • Familiarity with Microsoft Fabric or modern data platforms
  • Can design a single source of truth, not just reports

Someone who moves from spreadsheets → models → insight.

Application Development (Internal + Customer-Facing)

  • Power Apps (Canvas + Dataverse) or equivalent low-code platform
  • REST API integration mindset
  • UX pragmatism (build usable tools, not demos)
  • Understanding of security boundaries (internal vs customer apps)

Low-code is acceptable. Cowboy coding is not.

Epicor / ERP + Manufacturing Systems Capability

  • Epicor (or similar ERP) experience useful: BAQs, REST/API, upgrades
  • SQL literacy (views, joins, performance awareness)
  • Understanding of manufacturing concepts: BOMs, routings, work centres
  • Ability to extract ERP data cleanly for reporting & forecasting
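As a concrete sketch of the SQL-literacy and "extract ERP data cleanly" bullets above, here is a minimal, self-contained example joining a bill of materials to part master data. It uses Python's built-in sqlite3 with invented, heavily simplified table and part names (real ERP schemas such as Epicor's are far wider); it is an illustration, not an Epicor query.

```python
import sqlite3

# Hypothetical, simplified tables for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE part (part_num TEXT PRIMARY KEY, description TEXT);
CREATE TABLE bom  (parent_part TEXT, child_part TEXT, qty_per REAL);
INSERT INTO part VALUES ('ASM-100', 'Pump assembly'),
                        ('CMP-001', 'Impeller'),
                        ('CMP-002', 'Seal kit');
INSERT INTO bom  VALUES ('ASM-100', 'CMP-001', 1),
                        ('ASM-100', 'CMP-002', 2);
""")

# One clean row per BOM line, with descriptions joined in -- the sort
# of extract that feeds reporting and forecasting.
rows = conn.execute("""
    SELECT b.parent_part, b.child_part, p.description, b.qty_per
    FROM bom AS b
    JOIN part AS p ON p.part_num = b.child_part
    ORDER BY b.child_part
""").fetchall()
```

The same join-then-flatten shape is what "extracting ERP data cleanly" usually means in practice: one tidy row per fact, ready for Power BI or a forecast model.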

Process, Documentation & Knowledge Capture

  • Comfortable documenting systems and processes
  • Can create repeatable runbooks, not Word documents nobody reads
  • Thinks in terms of bus factor reduction

Oracle Software Engineer
Spectrum It Recruitment Limited
Southampton
Hybrid
Junior - Mid
£40,000
RECENTLY POSTED

Solving real problems with PL/SQL precision and designing ETL processes that unlock the true value of enterprise data.

Work on complex data challenges, modern tooling, and create meaningful technical impact.

Oracle Software Engineer
Circa £38k + up to 15% bonus
Twice a month in the Southampton-area office

The Role

You’ll join a cross-functional product team that thrives on pace, collaboration, and innovation. Every day, you’ll help shape the data foundations that keep our products and services running seamlessly for thousands of users.

You’ll design and develop clean, efficient, and fully tested PL/SQL code, contributing your expertise to build reliable features that teams rely on.
Working with our established data structures, you’ll turn complex requirements into robust, performant database solutions.

As a key participant in our sprint cycle, you’ll embrace agile ways of working, bringing transparency, adaptability, and continuous improvement to everything you do.

This role would suit

Someone with commercial PL/SQL experience using ANSI syntax and a strong approach to unit testing.

Experience with ETL processes, handling data with confidence and turning it into reports and transformations that power the business.

Experience with tools and frameworks such as Bitbucket, JSON, REST, Confluence, Jira, TOAD, and Scrum.

Skills in Python, Azure DevOps, APIs, or other databases would be an advantage.

Current tech stack includes:

  • Oracle, PL/SQL, ETL, TOAD
  • Git, Jira, Bitbucket, Agile
  • utPLSQL (unit testing framework)
  • Azure DevOps, API-first
  • MongoDB, Postgres
  • Python

This Oracle Software Engineer role is paying circa £38k, with benefits including a 15% bonus, 25 days holiday, enhanced pension, an onsite gym, a car scheme, a healthcare scheme and more.

Apply now or contact Chris Lynes at Spectrum IT for more information.

Spectrum IT Recruitment (South) Limited is acting as an Employment Agency in relation to this vacancy.

Data Engineer
Birchwell Associates Ltd
Wallingford
Hybrid
Mid - Senior
£50,000
RECENTLY POSTED

Birchwell Associates is recruiting on behalf of a client seeking an experienced Data Warehouse Specialist to join a growing data function based in Benson. This role requires onsite presence at least three days per week.

Reporting to senior leadership, the successful candidate will take ownership of an existing data warehouse while leading its evolution, including the design and delivery of a new architecture and the structured management of its migration. Working closely with technical and non-technical stakeholders, you will deliver reliable, high-quality data solutions that support informed decision-making across the business.

Key Responsibilities

  • Manage, optimise, and enhance the current data warehouse, ensuring strong data quality, performance, and governance.
  • Design and implement a scalable data warehouse architecture, integrating multiple internal and external data sources via APIs and connectors.
  • Maintain data integrity, security, and consistency across all reporting and analytics environments.
  • Identify and resolve data model and performance issues, producing clear technical documentation.
  • Collaborate with business stakeholders to translate requirements into effective data solutions.
  • Deliver data initiatives to agreed timelines and standards.
  • Support additional data-related projects as required.

Key Requirements

  • At least two years' experience in data warehouse or database architecture roles, including semantic model or cube development.
  • Strong SQL skills with proven experience in data modelling and transformation.
  • Ability to analyse complex data and clearly communicate technical concepts to non-technical audiences.
  • Experience with Microsoft data platforms, including Microsoft Fabric.
  • Knowledge of DAX, Jet Analytics, or NAV is beneficial but not essential.

Data Manager
Oscar Associates Limited
Norwich
Hybrid
Mid - Senior
£55,000
RECENTLY POSTED

Job Title: Data Manager
Location: Norwich - Hybrid (4 days per week in office)
Salary: £45,000 - £55,000 DOE

About the Business
Our client is a growing company that is expanding its commercial data capabilities. The team is responsible for turning complex data into clear, actionable insights that support decision-making across the business.

About the Role
They are seeking a hands-on Data Manager to join their commercial data function. You will be responsible for managing and maintaining reporting systems, supporting forecasting and performance tracking, and ensuring data is accurate, consistent and structured for decision-making. The role is ideal for someone who enjoys working with data, solving problems and providing insight that drives business performance.

Key Responsibilities

  • Own and maintain core commercial reporting across sales, campaigns, events, and audiences.
  • Build and maintain advanced Excel models, trackers, and dashboards.
  • Translate raw data into clear, actionable insights for leadership and teams.
  • Support forecasting, performance tracking, and ROI analysis.
  • Ensure data is consistently structured, accurate, and reliable.
  • Maintain clear documentation explaining data usage and definitions.
  • Develop an understanding of audience engagement and campaign performance.
  • Act as a point of control for reporting standards and data governance.
  • Communicate insights effectively to technical and non-technical stakeholders.

Essential Skills & Experience

  • Advanced Excel skills, including formulas, pivots, dashboards, and modelling.
  • Strong commercial awareness and understanding of performance metrics.
  • Experience working with sales, campaign, or performance data.
  • Exceptional attention to detail, organisation, and accuracy.
  • Ability to communicate complex data insights clearly to stakeholders.
  • Experience with data visualisation and SQL.

Desirable

  • Experience with CRM or marketing platforms (e.g., Salesforce).
  • Interest in automation or AI.

Personal Attributes

  • Highly analytical, commercially minded and detail-oriented.
  • Proactive and able to work autonomously while delivering accurate results.
  • Strong communicator, confident across technical and non-technical audiences.
  • Organised, adaptable, and capable of managing multiple priorities.

Benefits

  • Competitive salary of £45,000 - £55,000 DOE.
  • Hybrid working with 4 days per week in the office.
  • Opportunity to influence how commercial performance is measured, understood and improved.
  • Work with a rapidly growing, innovative team in a collaborative environment.

Oscar Associates (UK) Limited is acting as an Employment Agency in relation to this vacancy.

To understand more about what we do with your data please review our privacy policy in the privacy section of the Oscar website.

Azure Data Architect
ARC IT Recruitment
Brighton
Hybrid
Mid - Senior
£80,000 - £90,000
RECENTLY POSTED

Brighton, East Sussex, £80k - £90k

An Azure Data Architect is required by our client, a fast-growing, technically advanced, multi-award-winning company going through an extended period of growth.

Key responsibilities:

  • Design, develop, and maintain data architectures using Azure Databricks and Synapse.
  • Collaborate with cross-functional teams to understand data requirements and translate them into robust architectural solutions.
  • Optimize data workflows and ensure seamless integration with various data sources.
  • Implement data governance and security best practices.
  • Provide technical leadership and guidance to the development team.
  • Conduct performance tuning and optimization of data processes.

Skills required:

  • Proven experience as a Data Architect working within Azure.
  • Expertise in Azure Databricks and Azure Synapse.
  • Strong understanding of data modelling, ETL processes, and data warehousing concepts.
  • Proficiency in SQL, Python, and other relevant programming languages.
  • Experience with data governance, data quality, and data security best practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and stakeholder management skills.

Worthing-based hybrid opportunity, easily commutable from Portsmouth (Hampshire), Guildford (Surrey) or Brighton (East Sussex).
Azure, Synapse, Databricks
Brighton, Hybrid (3 days in the office)

Configuration Engineer
Rullion Limited
Bridgwater
In office
Junior - Mid
£220/day - £300/day
RECENTLY POSTED

Role: Configuration Engineer
Position: Contract
Location: Hinkley Point C and SDC, Somerset
Duration: 12 Months Rolling
Rate: Circa £220 p/d PAYE + 36 days annual leave // Circa £300 p/d Umbrella

Job Purpose / Overview

As a Configuration Engineer, you will be part of a growing multidisciplinary team responsible for delivering Work Management Support and maintaining the digital As-Built configuration required to build, commission, and operate Hinkley Point C Power Station.

This role focuses on data extraction, validation, and assembly into datasets aligned with business rules and ready for submission to Asset Suite 9 (HPC’s chosen Enterprise Asset Management (EAM) system). You will support the population of the Project’s Master Equipment List and ensure accurate attribute data within HPC’s EAM tool.

Principal Accountabilities

  • Population of the Equipment module in Asset Suite 9 with accurate asset identifiers and attributes.
  • Performing data quality assurance for equipment installation and configuration references.
  • Maintaining the asset/system schedules and resolving data anomalies.
  • Producing weekly performance reports for line-manager review and upward reporting.
  • Supporting the digital configuration through work management processes.
  • Collaboration with Construction Contract Partners, Completions, and Handover teams to ensure consistent data across platforms.

Essential Skills:

  • Strong experience in asset data analysis and validation.
  • Proficiency in Microsoft Excel, Word, and Power BI.
  • Ability to work independently and manage data integrity.
  • Experience with SAP, EDRMS or other CMMS systems.

Desirable Skills:

  • Familiarity with Asset Suite/Passport or other EAM tools.
  • Background/experience in engineering disciplines or interpreting engineering drawings.
  • Previous experience of working in a construction, completions, and/or data management related industry.

Rullion celebrates and supports diversity and is committed to ensuring equal opportunities for both employees and applicants.

PySpark Developer
Randstad Digital
London
Fully remote
Senior - Leader
£300/day - £350/day
RECENTLY POSTED

Lead PySpark Engineer (Cloud Migration)

Role Type: 5-Month Contract

Location: Remote (UK-Based)

Experience Level: Lead / Senior (5+ years PySpark)

Role Overview

We are seeking a Lead PySpark Engineer to drive a large-scale data modernisation project, transitioning legacy data workflows into a high-performance AWS cloud environment. This is a hands-on technical role focused on converting legacy SAS code into production-ready PySpark pipelines within a complex financial services landscape.

Key Responsibilities

  • Code Conversion: Lead the end-to-end migration of SAS code (Base SAS, Macros, DI Studio) to PySpark using automated tools (SAS2PY) and manual refactoring.
  • Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS.
  • Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness.
  • Quality Assurance: Implement clean coding principles, modular design, and robust unit/comparative testing to ensure data accuracy throughout the migration.
  • Engineering Excellence: Maintain Git-based workflows, CI/CD integration, and comprehensive technical documentation.

Technical Requirements

  • PySpark (P3): 5+ years of hands-on experience writing scalable, production-grade PySpark/Spark SQL.
  • AWS Data Stack (P3): Strong proficiency in EMR, Glue, S3, Athena, and Glue Workflows.
  • SAS Knowledge (P1): Solid foundation in SAS to enable the understanding and debugging of legacy logic for conversion.
  • Data Modeling: Expertise in ETL/ELT, dimensions, facts, SCDs, and data mart architecture.
  • Engineering Quality: Experience with parameterisation, exception handling, and modular Python design.
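As a rough illustration of the SCD expertise listed above, here is a pure-Python sketch of Type 2 slowly-changing-dimension logic: expire the current row and append a new version when a tracked attribute changes. In the actual migration this would be expressed as a PySpark/Delta MERGE; the function name and row shape here are invented for the example.

```python
from datetime import date

def apply_scd2(dimension, updates, today):
    """Type 2 SCD: when a tracked attribute changes, expire the current
    row (set valid_to, clear the current flag) and append a new version."""
    out = [dict(r) for r in dimension]             # don't mutate the input
    current = {r["key"]: r for r in out if r["current"]}
    for upd in updates:
        row = current.get(upd["key"])
        if row is None:                            # brand-new key
            out.append({"key": upd["key"], "attr": upd["attr"],
                        "valid_from": today, "valid_to": None, "current": True})
        elif row["attr"] != upd["attr"]:           # attribute changed
            row["valid_to"] = today
            row["current"] = False
            out.append({"key": upd["key"], "attr": upd["attr"],
                        "valid_from": today, "valid_to": None, "current": True})
    return out

dim = [{"key": "C1", "attr": "Bronze", "valid_from": date(2023, 1, 1),
        "valid_to": None, "current": True}]
dim = apply_scd2(dim, [{"key": "C1", "attr": "Gold"}], date(2024, 6, 1))
# dim now holds the expired Bronze row plus a current Gold row.
```

The comparative testing the role mentions typically asserts exactly this shape: one closed-out historical row per change, and a single current row per key.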

Additional Details

  • Industry: Financial Services experience is highly desirable.
  • Working Pattern: Fully remote with internal team collaboration days.
  • Benefits: 33 days holiday entitlement (pro-rata).

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Data Engineer
MBDA UK
Bolton
Hybrid
Mid - Senior
£45,000 - £55,000
RECENTLY POSTED

Bolton

As a data engineer specialising in generative AI, this role will see you working in a developing international and transversal structure. You will have the responsibility to evaluate, build and maintain data sets for internal customers whilst ensuring they can be maintained.
Salary: Circa £45,000 - £55,000 depending on experience
Dynamic (hybrid) working: 2-3 days per week on-site due to workload classification
Security Clearance: British Citizen
Restrictions and/or limitations relating to nationality and/or rights to work may apply. As a minimum and after offer stage, all successful candidates will need to undergo HMG Basic Personnel Security Standard checks (BPSS), which are managed by the MBDA Personnel Security Team.
What we can offer you:

  • Company bonus: Up to £2,500 (based on company performance and will vary year to year)
  • Pension: maximum total (employer and employee) contribution of up to 14%
  • Overtime: opportunity for paid overtime
  • Flexi Leave: Up to 15 additional days
  • Flexible working: We welcome applicants who are looking for flexible working arrangements
  • Enhanced parental leave: up to 26 weeks for maternity, adoption and shared parental leave; enhancements are available for paternity leave, neonatal leave and fertility testing and treatments
  • Facilities: Fantastic site facilities including subsidised meals, free car parking and much more

The opportunity:
The MBDA IM GenAI Delivery Office department is looking for an experienced data engineer able to evaluate, design, deploy, improve and support MBDA data sets.

You will ensure MBDA data pipelines are designed to be resilient, secure and responsive. You will use your data engineering expertise to collaborate with different internal customers regarding their data, ensuring they are optimised and secured for their needs. You will provide your knowledge in data management and data quality to guarantee compliance with MBDA data governance.

A key part of this role is keeping up to date with new technology, where you will provide insight on our technology roadmap and deliver cutting-edge solutions to our internal customers.
What we’re looking for from you:

  • SQL technologies skills (e.g. MS SQL, Oracle)
  • NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4j)
  • Data exchange and processing skills (e.g. ETL, ESB, API)
  • Development (e.g. Python) skills
  • Big data technologies knowledge (e.g. Hadoop stack)
  • Knowledge of NLP (Natural Language Processing)
  • Knowledge of OCR (Optical Character Recognition)
  • Knowledge of Generative AI would be advantageous
  • Experience with containerisation technologies (e.g. Docker) would be advantageous
  • Knowledge of the industrial and/or defence sector would be advantageous

Our company: Peace is not a given, Freedom is not a given, Sovereignty is not a given

MBDA is a leading defence organisation. We are proud of the role we play in supporting the Armed Forces who protect our nations. We partner with governments to work together towards a common goal, defending our freedom.

We are proud of our employee-led networks; examples include Gender Equality, Pride, Menopause Matters, Parents and Carers, Armed Forces, Ethnic Diversity, Neurodiversity and more.

We recognise that everyone is unique, and we encourage you to speak to us should you require any advice, support or adjustments throughout our recruitment process.

Follow us on LinkedIn (MBDA), X (@MBDA_UK), Instagram (MBDA_UK) and Glassdoor, or visit our MBDA Careers website for more information. #LI-RM1

Data Engineer
MBDA UK
Manchester
Hybrid
Mid - Senior
£45,000 - £55,000
RECENTLY POSTED

Bolton

As a data engineer specialising in generative AI, this role will see you working in a developing international and transversal structure. You will have the responsibility to evaluate, build and maintain data sets for internal customers whilst ensuring they can be maintained.
Salary: Circa £45,000 - £55,000 depending on experience
Dynamic (hybrid) working: 2-3 days per week on-site due to workload classification
Security Clearance: British Citizen
Restrictions and/or limitations relating to nationality and/or rights to work may apply. As a minimum and after offer stage, all successful candidates will need to undergo HMG Basic Personnel Security Standard checks (BPSS), which are managed by the MBDA Personnel Security Team.
What we can offer you:

  • Company bonus: Up to £2,500 (based on company performance and will vary year to year)
  • Pension: maximum total (employer and employee) contribution of up to 14%
  • Overtime: opportunity for paid overtime
  • Flexi Leave: Up to 15 additional days
  • Flexible working: We welcome applicants who are looking for flexible working arrangements
  • Enhanced parental leave: up to 26 weeks for maternity, adoption and shared parental leave; enhancements are available for paternity leave, neonatal leave and fertility testing and treatments
  • Facilities: Fantastic site facilities including subsidised meals, free car parking and much more

The opportunity:
The MBDA IM GenAI Delivery Office department is looking for an experienced data engineer able to evaluate, design, deploy, improve and support MBDA data sets.

You will ensure MBDA data pipelines are designed to be resilient, secure and responsive. You will use your data engineering expertise to collaborate with different internal customers regarding their data, ensuring they are optimised and secured for their needs.

You will provide your knowledge in data management and data quality to guarantee compliance with MBDA data governance. A key part of this role is keeping up to date with new technology, where you will provide insight on our technology roadmap and deliver cutting-edge solutions to our internal customers.
What we’re looking for from you:

  • SQL technologies skills (e.g. MS SQL, Oracle)
  • NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4j)
  • Data exchange and processing skills (e.g. ETL, ESB, API)
  • Development (e.g. Python) skills
  • Big data technologies knowledge (e.g. Hadoop stack)
  • Knowledge of NLP (Natural Language Processing)
  • Knowledge of OCR (Optical Character Recognition)
  • Knowledge of Generative AI would be advantageous
  • Experience with containerisation technologies (e.g. Docker) would be advantageous
  • Knowledge of the industrial and/or defence sector would be advantageous

Our company: Peace is not a given, Freedom is not a given, Sovereignty is not a given

MBDA is a leading defence organisation. We are proud of the role we play in supporting the Armed Forces who protect our nations. We partner with governments to work together towards a common goal, defending our freedom.

We are proud of our employee-led networks; examples include Gender Equality, Pride, Menopause Matters, Parents and Carers, Armed Forces, Ethnic Diversity, Neurodiversity and more.

We recognise that everyone is unique, and we encourage you to speak to us should you require any advice, support or adjustments throughout our recruitment process.

Follow us on LinkedIn (MBDA), X (@MBDA_UK), Instagram (MBDA_UK) and Glassdoor, or visit our MBDA Careers website for more information.

Lead Enterprise AI Engineer
SMS
Cardiff
Fully remote
Senior
Private salary
RECENTLY POSTED

Why choose us?

Choosing to work for SMS means choosing to make a difference. We are changing how businesses and consumers use energy for the better, helping achieve a greener, more sustainable, and more affordable energy system for everyone. Through our range of innovative energy solutions, we are delivering the future of smart energy. Working closely with private and public sector partners, we are playing a critical role in transforming and decarbonising the UK economy by 2050.

What’s in it for you?

  • 25 personal holiday days per year (plus 8 public holidays), increasing to 30 personal days after 5 years of service (includes options to buy and sell)
  • Hybrid working options (for some positions).
  • Enhanced Maternity leave. Paternity and Adoption leave.
  • 24/7 free and confidential employee assistance service.
  • Simply Health plan offering cashback on everyday healthcare treatments such as optical, dental and physio, plus discounted gym memberships and a free 24/7 online GP.
  • Life Insurance (4 x annual salary)
  • Pension matching scheme (up to 5% of salary)

Visit Our People page

What’s the role?

Step into a role where strategy meets engineering excellence. As our Lead Enterprise AI Engineer, you’ll be the driving force turning bold business opportunities into production-ready AI solutions. You’ll shape intelligent agents, craft next-generation conversational experiences using Microsoft Copilot and Databricks Mosaic AI, and architect the Semantic Layer that ensures our models deliver accurate, trusted insights every time.

If you’re ready to take ownership of an organisation’s AI future, and to build the systems that make it real, this is your stage.

You will report to the Data, Analytics and AI Director and work remotely on a full-time, 40-hour contract.

Please note: travel is required for face-to-face meetings with your line manager.

Key responsibilities:

AI Solution Development & Agent Building

  • Design, build, and deploy low-code and pro-code AI agents using Microsoft Copilot tools to automate business workflows (e.g., HR queries, IT support, operational data retrieval).
  • Develop custom RAG (Retrieval-Augmented Generation) solutions within Databricks to allow LLMs to reason over proprietary SMS documents and data.
  • Integrate AI agents with enterprise systems (Dynamics 365, SharePoint, etc.) via APIs and Power Automate connectors.
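For readers unfamiliar with RAG, the retrieval step mentioned above can be sketched in a few lines: score candidate documents against the question and inject the best match into the prompt. This toy uses simple term overlap; a production build (e.g. on Databricks Mosaic AI) would use embeddings and a vector index, and every document and name below is made up for illustration.

```python
def retrieve(question, documents, top_k=1):
    """Rank documents by naive term overlap with the question."""
    q_terms = set(question.lower().split())
    return sorted(documents,
                  key=lambda d: len(q_terms & set(d.lower().split())),
                  reverse=True)[:top_k]

# Invented mini knowledge base standing in for proprietary documents.
docs = [
    "Holiday entitlement is 25 days plus public holidays.",
    "Expense claims must be submitted within 30 days.",
]
question = "How many holiday days do I get?"
context = retrieve(question, docs)

# The retrieved passage is injected into the prompt sent to the LLM,
# grounding the answer in company data rather than model memory.
prompt = f"Answer using this context: {context[0]}\n\nQuestion: {question}"
```

The point of the pattern is the grounding: the LLM answers from the retrieved passage, which is what lets it reason over proprietary data it was never trained on.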

Semantic Modelling & Databricks Genie Curation

  • Own the creation and maintenance of Databricks Genie Spaces. This involves translating complex database schemas into business-friendly semantic models.
  • Define and govern standard metrics, dimensions, and synonyms within Unity Catalog to ensure the AI “speaks the language of the business.”
  • Continuously monitor Genie performance, reviewing “human feedback” on answers to refine the semantic model and improve accuracy over time.

Business Engagement & Prototyping

  • Partner with business stakeholders (Finance, Operations, Commercial) to decompose high-level use cases into technical requirements.
  • Rapidly prototype AI solutions to demonstrate value and gain quick traction within business units.
  • Act as a technical evangelist, demonstrating to non-technical teams how to interact with Genie spaces and Copilots effectively.

AI Operations (LLMOps) & Governance

  • Implement monitoring frameworks to track the cost, latency, and quality of AI model outputs.
  • Ensure all AI solutions adhere to the organisation’s data governance and security standards (e.g., preventing data leakage via LLMs).
  • Manage the lifecycle of AI models and agents from development through to production and retirement.

To be considered for this role, we would love you to have:

  • Degree in Computer Science, Data Engineering, Artificial Intelligence, or related field, or equivalent industry experience.
  • Deep hands-on experience with Azure/Microsoft Fabric ecosystem (specifically Copilot Studio & Power Platform) and Databricks (SQL, Unity Catalog, Mosaic AI).
  • Strong ability to design data models for analytics. Experience with defining metrics layers (e.g., DBX semantic layer or Databricks Genie) is essential.
  • Practical experience with Large Language Models (LLMs), Prompt Engineering, and RAG architectures.
  • Proficient in Python (for data manipulation and API interaction) and SQL (for data modelling).
  • Experience working with REST APIs and connecting disparate business systems.
  • Ability to explain technical AI concepts to non-technical business users and translate their feedback into code.

#LI-Remote

Data & Integration Solution Architect
Agincare Group
Portland
Hybrid
Senior - Leader
ÂŁ40,000

Package Description:

At Agincare, we believe technology should enable exceptional care. As we continue our exciting journey of growth and digital transformation, we're investing in our systems, data, and infrastructure to support our expanding services across the UK.

We're now looking for a Data & Integration Solution Architect to play a pivotal role in shaping our evolving digital landscape.

This is a newly created role within our IT team: a rare opportunity to build, influence, and future-proof our data and integration ecosystem from the ground up. With scope to lead a small team in the future, this role offers real impact, visibility, and progression.

Why join us?

As Agincare's data and integration specialist, you'll be at the heart of our digital transformation. You'll design and deploy integration solutions, build robust data pipelines, and support the development of Power BI reporting that helps our teams make smarter, faster decisions.

You'll work closely with stakeholders across the business, influence technical direction, and help create a future-proof architecture that supports our ambition. And as the role grows, there's potential to lead a small team, giving you the chance to shape not just our systems, but our capability.

What You'll Be Doing

Integration Solutions
Design and implement robust integration solutions (APIs, ETL, microservices, ESBs), developing reusable patterns and frameworks to ensure secure, reliable, and scalable data exchange across enterprise systems.

Middleware & Platforms
Select, configure, and manage middleware/iPaaS platforms, optimising performance, availability, and security with strong monitoring and error-handling practices.

Data Architecture & Governance
Define data architectures, schemas, and ETL pipelines, embedding governance, quality, security, and lifecycle management to support both operational and analytical needs.

Power BI & Analytics
Collaborate with business teams to deliver high-quality Power BI dashboards and reports, designing optimised datasets that enable accurate, timely, data-driven decisions.

Stakeholder Engagement & Standards
Work with internal teams and vendors to align integration strategy with business goals, clearly communicating technical solutions. Produce robust documentation and establish best-practice standards for integration and data management.

Future Leadership
Mentor junior colleagues, with scope to line manage and grow a small team as the function develops.

What We're Looking For

  • Proven experience designing and implementing integration solutions (APIs, ETL, microservices, ESBs) in complex enterprise environments.
  • Hands-on expertise with middleware or iPaaS platforms.
  • Strong programming skills (Java, C#, Python or similar), with solid understanding of JSON, XML, REST, and SOAP.
  • Experience architecting and maintaining data pipelines, ETL processes, and data warehouse/schema design.
  • Skilled in Power BI (dataset design, modelling, report/dashboard delivery).
  • Strong understanding of data governance, security, compliance, and performance tuning.
  • Excellent communication skills, able to engage confidently with both technical and business stakeholders.

About Agincare

We're a family-run business that's been caring for and supporting people since 1986. With over 4,500 team members, we're one of the UK's largest care providers and are continuing to grow. We have over 100 locations across England, including care and nursing homes, home care branches, extra care schemes, supported living properties and live-in offices.

Agincare are signatories of the Care Leaver Covenant and are committed to supporting care leavers to live independently. We are proud to be able to offer a guaranteed interview to care leavers, or an informal conversation about our career opportunities.

All of our care services are regulated by the Care Quality Commission (CQC).

Equal opportunities are important to us at Agincare and we welcome applications from all.

#LI-KD1

Lead Data Engineer
Rebel Recruitment
Sheffield
Hybrid
Senior
ÂŁ85,000

Role: Lead Data Engineer - Azure/Databricks

Location: Sheffield

Working Model: Hybrid - 2 days per week in person

Salary: Up to ÂŁ85k depending on experience

Own the data platform. Shape the future architecture.

This role is for a hands-on data engineering leader who wants to build something properly, not babysit legacy pipelines.

You'll take full ownership of the data engineering domain, leading the design and build of a modern, highly scalable data platform that handles complex, high-frequency industrial data. You'll work closely with data scientists, software engineers, and senior leadership, with the trust and mandate to make meaningful architectural decisions.

If you enjoy balancing strong engineering principles with real-world business needs, and you like mentoring others while staying deeply technical, this role is built for you.

What you'll be working on

  • Architecting and rebuilding the data transformation layer in Databricks
  • Designing robust data flows that support both real-time operational views and deep historical analysis
  • Moving pipelines from ad-hoc scripts to software-engineering standards (CI/CD, testing, modular design)
  • Defining clear data models, schemas, and standards across a complex data estate
  • Establishing a stable, high-performance serving layer for analytics, visualisation, and data science workloads
  • Working closely with data scientists to remove bottlenecks and enable better modelling and experimentation
  • Pair-programming, mentoring, and raising the engineering bar for a small, capable team

You wont just be delivering features; youll be setting direction.

The tech you'll work with

  • Core: Azure, Databricks, Python, SQL, dbt, MQTT
  • Storage & Serving: Delta Lake, Postgres, TimescaleDB
  • Modelling & ML: MLflow
  • Visualisation: Grafana

What makes you a great fit

You're someone who can zoom out to architecture and zoom in to code, comfortably.

  • Proven experience building and owning data platforms on Azure
  • Deep, hands-on knowledge of Databricks (lakehouse architecture, cluster management, performance and cost optimisation)
  • Strong opinions on data modelling, schema design, and standardisation
  • You treat data pipelines as software: version control, CI/CD, automated testing
  • Comfortable challenging architectural decisions, and explaining why
  • Able to translate technical trade-offs into business impact for non-technical stakeholders
  • A mentor by nature: you raise the team through pairing, guidance, and example

Industry experience

  • Experience with industrial, sensor-driven, or time-series data is highly desirable
  • Alternatively, background in high-volume or highly variable data environments (missing data, duplicates, schema drift, spiky load) will transfer well

Why you'll want this role

  • Real autonomy: You're trusted to make architectural calls; that's why you're being hired
  • Visible impact: Your work directly unlocks better analytics and machine learning outcomes
  • Modern problems: You'll work on advanced data patterns and architectures, not cosmetic refactors
  • Still technical: This is a leadership role without stepping away from code

What you'll get

  • 5 weeks paid holiday plus bank holidays
  • Tax-efficient stock options
  • Company pension scheme
  • Salary sacrifice EV scheme
  • Training and professional development support
  • Regular all-hands sessions with real transparency
  • Hybrid working (2 days in person)
  • Quarterly employee recognition awards
  • Access to discounts via BrightHR

We welcome diverse applicants and are dedicated to treating all applicants with dignity and respect, regardless of background.

MI Analyst
Elliott Recruitment Solutions Limited
Redditch
In office
Mid - Senior
ÂŁ40,000

Redditch | Office-based
ÂŁ40,000 | Permanent

Elliott Recruitment are working with a national, market-leading organisation to recruit an experienced MI Analyst to join their Redditch-based team.

This role is ideal for someone who enjoys producing meaningful management information that supports operational performance and senior decision-making. You'll work closely with key stakeholders, including the Operations Director, providing clear insight into KPIs, trends, and performance across the business.

If you're confident turning complex data into simple, actionable MI and enjoy seeing your work directly influence decisions, this could be a great next step.

The role

As MI Analyst, you'll be responsible for delivering accurate, timely, and insightful management information. You'll develop and maintain reporting packs and dashboards, monitor performance against KPIs, and provide analysis that helps the business understand what's happening, why it's happening, and where improvements can be made.

What you'll be doing

  • Producing regular and ad-hoc MI reports to support operational and strategic decision-making
  • Monitoring KPIs and identifying trends, risks, and opportunities
  • Building and maintaining dashboards and performance reporting packs
  • Extracting, cleansing, and validating data from Salesforce and other systems
  • Working with stakeholders to define KPIs and reporting requirements
  • Providing clear commentary and insight alongside MI outputs
  • Managing multiple data sources, including complex or imperfect datasets

What we're looking for

  • Proven experience in an MI, reporting, or performance analysis role
  • Strong numerical and analytical skills
  • Advanced Excel capability (pivot tables, formulas, data manipulation)
  • Knowledge of Power BI
  • SQL / MySQL knowledge
  • Experience working with multiple data sources
  • Salesforce experience (beneficial, not essential)
  • Strong communication skills, with the ability to explain MI clearly to non-technical audiences

What's on offer

  • Competitive salary of £40,000
  • Permanent role within a stable, national organisation
  • High visibility role with exposure to senior stakeholders
  • Opportunity to influence performance through insight and reporting
  • Collaborative, professional working environment

If you're an MI professional who enjoys making data meaningful and wants a role where insight genuinely drives action, we'd love to hear from you.

Apply online today; immediate interviews are available.

Lead Data Engineer
Fruition Group
Leeds
Hybrid
Senior
ÂŁ80,000

Job Title: Lead Data Engineer
Location: Leeds, 2x per week
Salary: Up to ÂŁ80,000 per annum

Why Apply?
This is an exciting opportunity to work as a Lead Data Engineer delivering scalable, high-quality data solutions for a leading client in the technology sector. This position offers professional growth, challenging projects, and access to cutting-edge cloud data technologies.

Lead Data Engineer Responsibilities:

  • Design, develop, and optimise robust, scalable data pipelines and architectures to support Business Intelligence and analytics initiatives.
  • Manage and maintain cloud-based data platforms (AWS, Azure, or Google Cloud) including data lakes, warehouses, and lakehouse solutions.
  • Transform and process structured and unstructured data using modern ETL/ELT frameworks (Apache Spark, Airflow, dbt).
  • Collaborate closely with product managers, analysts, and software developers to ensure seamless integration and high-quality data availability.
  • Develop, maintain, and enhance reporting and analytics capabilities through tools such as PowerBI, Tableau, or QuickSight.
  • Apply best practices in data governance, data quality, and performance optimisation.
  • Operate in an agile environment, contributing to technical discussions and problem-solving initiatives.

Lead Data Engineer Requirements:

  • Proven experience in building and managing cloud-based data platforms (AWS Redshift/Glue, Azure Data Factory/Synapse, Google BigQuery/Dataflow).
  • Strong programming skills in Python, SQL, and Java for data engineering tasks.
  • Experience designing reliable, maintainable, and high-performance data pipelines and architectures.
  • Broad understanding of data warehousing, data lakes, and lakehouse architectures.
  • Familiarity with Business Intelligence and data visualisation tools.
  • Excellent analytical thinking, attention to detail, and problem-solving skills.
  • Strong collaboration and communication skills, able to work with both technical and non-technical stakeholders.
  • Comfortable with complexity, ambiguity, and working independently or as part of a team in a fast-paced environment.

We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.

SAS Data Engineer
Deerfoot Recruitment Solutions
Telford
Hybrid
Mid - Senior
ÂŁ70,000
TECH-AGNOSTIC ROLE

SAS Consultant / Data Engineer
Location: Telford or Worthing (hybrid working 2 days onsite)
Type: Full Time, Permanent
Salary: ÂŁ50,000 - ÂŁ70,000 DOE + comprehensive benefits package

Deerfoot Recruitment is working with a major consultancy partner on a long-term public sector engagement and is seeking experienced SAS Consultants / Data Engineers to join a growing data team.
This is a key role within a large-scale data portfolio, supporting critical programmes focused on revenue optimisation, fraud detection, data management and analytics. The successful candidate will play a hands-on role in designing, building and supporting robust SAS-based data solutions, working closely with product owners, architects, engineers and senior client stakeholders.

Key responsibilities include:

  • Designing and delivering secure, high-performance SAS solutions for data integration and analytics
  • Building and enhancing data pipelines covering ingestion, transformation, reporting and fraud detection
  • Supporting live services, incident resolution and continuous improvement
  • Collaborating in Agile delivery teams and contributing to engineering best practice

Key skills and experience:

  • Minimum 5 years' experience as a data engineer or similar role
  • Strong background as a Data Engineer on complex, large-scale data platforms
  • Proven expertise with SAS 9.x
  • SAS Viya (3.x/4) experience is a bonus
  • Solid ETL, data modelling and batch scheduling experience
  • Understanding of performance optimisation, CI/CD and scalable solution design
  • Confident client-facing and consultancy skills

Security Clearance:
This role requires SC clearance, or eligibility to obtain it. Applicants must have lived in the UK continuously for the past 5 years. Some restrictions may apply based on nationality and residency.
This is an excellent opportunity to work on high-impact public sector systems within a collaborative, technically strong environment.

Apply today to find out more.

SAS Consultant / Senior SAS Consultant / SAS Developer / SAS Data Engineer / SAS Programmer / Lead SAS Programmer / SAS Analytics Consultant / SAS Technical Consultant / SAS Solutions Consultant /Data Engineer / Senior Data Engineer / Analytics Engineer / Data Platform Engineer / Data Integration Engineer / Data Pipeline Engineer / Data Solutions Engineer / Enterprise Data Engineer

Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn’t right for you, explore our referral reward program with payouts at interview and placement milestones. Visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.

Principal GCP Data Engineer
Anson McCade
Multiple locations
Hybrid
Senior
ÂŁ95,000

Up to ÂŁ95,000 GBP
Hybrid working
Location: Bristol; Gloucester; Cardiff; Corsham; Cheltenham (South West, United Kingdom)
Type: Permanent

Principal GCP Data Engineer
Join an award-winning innovation and transformation consultancy recognised for its cutting-edge work in data engineering, cloud solutions, and enterprise transformation. This organisation is known for bringing ingenuity to life, helping clients turn complexity into opportunity, and fostering a culture where technical specialists thrive and grow.

An opportunity has arisen for a Principal GCP Data Engineer to join the London-based data and analytics practice. This Principal GCP Data Engineer role offers the chance to lead the design and delivery of end-to-end data solutions on Google Cloud Platform for high-profile clients, shaping data strategy and driving technical excellence across complex programmes.

With a reputation for combining breakthrough technologies with pragmatic delivery, the organisation empowers senior data engineers to influence architecture, mentor teams, and deliver production-ready solutions that create lasting impact.

The Role - Principal GCP Data Engineer
The Principal GCP Data Engineer is a senior technical role responsible for leading data engineering solutions, guiding teams, and acting as a subject matter expert in Google Cloud Platform. As a Principal GCP Data Engineer, you will define end-to-end solution architectures, implement best practices, and lead the development of robust, scalable data pipelines.

This role combines hands-on technical leadership with coaching, mentorship, and client engagement, making it ideal for a Principal GCP Data Engineer who enjoys delivering complex solutions while shaping the capabilities of their team and influencing enterprise-wide data strategy.

What You’ll Be Doing as a Principal GCP Data Engineer
As a Principal GCP Data Engineer, you will:

  • Lead the design, development, and delivery of data processing solutions using GCP tools such as Dataflow, Dataproc, and BigQuery
  • Design automated data pipelines using orchestration tools like Cloud Composer
  • Contribute to architecture discussions and design end-to-end data solutions
  • Own development processes for your team, establishing robust principles and methods across architecture, code quality, and deployments
  • Shape team behaviours around specifications, acceptance criteria, sprint planning, and documentation
  • Define and evolve data engineering standards and practices across the organisation
  • Lead technical discussions with client stakeholders, achieving buy-in for solutions
  • Mentor and coach team members, building technical expertise and capability

Key Responsibilities

  • Develop production-ready data pipelines and processing jobs using batch and streaming frameworks such as Apache Spark and Apache Beam
  • Apply expertise in data storage technologies including relational, columnar, document, NoSQL, data warehouses, and data lakes
  • Implement modern data pipeline patterns, event-driven architectures, ETL/ELT processes, and stream processing solutions
  • Translate business requirements into technical specifications and actionable solution designs
  • Work with metadata management and data governance tools such as Cloud Data Catalog, Collibra, or Dataplex
  • Build data quality alerting and data quarantine solutions to ensure downstream reliability
  • Implement CI/CD pipelines with version control, automated tests, and automated deployments
  • Collaborate in Agile teams, using Scrum or Kanban methodologies

Key Requirements
The successful Principal GCP Data Engineer will bring deep technical expertise, client-facing experience, and leadership skills. You will have:

  • Proven experience delivering production-ready data solutions on Google Cloud Platform
  • Strong knowledge of batch and streaming frameworks, data pipelines, and orchestration tools
  • Expertise in designing and managing structured and unstructured data systems
  • Experience translating business needs into technical solutions
  • Ability to mentor and coach teams and guide technical decision-making
  • Excellent communication skills, with the ability to explain technical concepts to technical and non-technical stakeholders
  • A pragmatic approach to problem solving, combined with a drive for technical excellence

Why Join

  • Take a senior technical leadership role as a Principal GCP Data Engineer within a globally recognised innovation and transformation consultancy
  • Lead the delivery of complex data engineering programmes on Google Cloud Platform
  • Shape the data engineering standards, practices, and architecture across client engagements and internal teams
  • Work in a collaborative, inclusive, and learning-focused culture where technical specialists are empowered to grow and succeed

Reference: AMC/AON/PGCPDataEnginer

#aaon

Data Architect
Stackstudio Digital Ltd.
Leeds
Remote or hybrid
Mid - Senior
Private salary

Databricks Architect

Role Overview
We are seeking a Senior Databricks Architect to lead the design and delivery of scalable, high-performance data platforms built on Databricks. This is a strategic role requiring strong architectural experience, hands-on design capability, and a deep understanding of modern data ecosystems.

The ideal candidate will have several years' experience operating at architect level, with proven success designing and/or migrating data platforms into Databricks. Candidates with strong architecture backgrounds on comparable platforms (such as Snowflake) will also be considered, provided they hold relevant Databricks certifications.

Key Responsibilities

  • Design end-to-end data architectures using Databricks for analytics, data engineering, and data science use cases.
  • Lead and support data platform migrations into Databricks from legacy or alternative data platforms.
  • Define architectural standards, best practices, and reference patterns for Databricks implementations.
  • Collaborate with data engineers, platform teams, and stakeholders to translate business requirements into scalable technical solutions.
  • Ensure solutions meet performance, security, scalability, and cost-optimisation requirements.
  • Provide technical leadership and architectural governance across Databricks initiatives.
  • Review existing data architectures and recommend improvements or modernisation strategies.
  • Support teams with architectural guidance, troubleshooting, and design reviews.

Essential Skills & Experience

  • Proven experience working as a Data Architect / Platform Architect at a senior level.
  • Hands-on experience designing solutions within the Databricks ecosystem, or strong architectural experience on a competing platform (e.g. Snowflake) combined with Databricks certifications.
  • Demonstrated experience with data platform migrations, modernisation, or large-scale data transformations.
  • Strong understanding of data architecture principles, including:
  • Data lakes / lakehouse architectures
  • Data modelling and data integration patterns
  • Performance optimisation and scalability
  • Experience working with cloud-based data platforms.
  • Strong stakeholder communication and documentation skills.

Desirable Skills & Experience

  • Databricks certifications (Data Engineer, Data Architect, or equivalent).
  • Experience designing solutions prior to Databricks (traditional data warehouses, big data platforms, or cloud-native data stacks).
  • Knowledge of modern data engineering tools and frameworks.
  • Experience operating in complex enterprise environments

Data Architect
Stackstudio Digital Ltd.
Multiple locations
Remote or hybrid
Mid - Senior
Private salary

Databricks Architect

Role Overview
We are seeking a Senior Databricks Architect to lead the design and delivery of scalable, high-performance data platforms built on Databricks. This is a strategic role requiring strong architectural experience, hands-on design capability, and a deep understanding of modern data ecosystems.

The ideal candidate will have several years' experience operating at architect level, with proven success designing and/or migrating data platforms into Databricks. Candidates with strong architecture backgrounds on comparable platforms (such as Snowflake) will also be considered, provided they hold relevant Databricks certifications.

Key Responsibilities

  • Design end-to-end data architectures using Databricks for analytics, data engineering, and data science use cases.
  • Lead and support data platform migrations into Databricks from legacy or alternative data platforms.
  • Define architectural standards, best practices, and reference patterns for Databricks implementations.
  • Collaborate with data engineers, platform teams, and stakeholders to translate business requirements into scalable technical solutions.
  • Ensure solutions meet performance, security, scalability, and cost-optimisation requirements.
  • Provide technical leadership and architectural governance across Databricks initiatives.
  • Review existing data architectures and recommend improvements or modernisation strategies.
  • Support teams with architectural guidance, troubleshooting, and design reviews.

Essential Skills & Experience

  • Proven experience working as a Data Architect / Platform Architect at a senior level.
  • Hands-on experience designing solutions within the Databricks ecosystem, or strong architectural experience on a competing platform (e.g. Snowflake) combined with Databricks certifications.
  • Demonstrated experience with data platform migrations, modernisation, or large-scale data transformations.
  • Strong understanding of data architecture principles, including:
  • Data lakes / lakehouse architectures
  • Data modelling and data integration patterns
  • Performance optimisation and scalability
  • Experience working with cloud-based data platforms.
  • Strong stakeholder communication and documentation skills.

Desirable Skills & Experience

  • Databricks certifications (Data Engineer, Data Architect, or equivalent).
  • Experience designing solutions prior to Databricks (traditional data warehouses, big data platforms, or cloud-native data stacks).
  • Knowledge of modern data engineering tools and frameworks.
  • Experience operating in complex enterprise environments

Data Engineer – TV Advertising Data (FAST)
Datatech
London
Hybrid
Mid - Senior
ÂŁ75,000 - ÂŁ85,000

Data Engineer - TV Advertising Data (FAST)
Location: London - 3 days onsite
Salary: ÂŁ75,000 - ÂŁ85,000 Neg DOE
Reference: J13057
Note: Full and current UK working rights required for this role

We're currently seeking a Data Engineer to build the foundations behind the rapidly growing FAST (Free Ad Supported Streaming TV) channels. This is a pioneering opportunity to be involved with direct-to-consumer advertising for a global player in the field, for someone who is passionate about how data drives the industry and can help optimise campaigns, measure performance, and monetise content.

Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines that transform raw data into reliable, analytics-ready datasets
  • Ingest, integrate, and manage new data sources across advertising, audience, platform, and content data within Fremantle's Microsoft Fabric environment
  • Deliver robust data flows that underpin global FAST dashboards, monetisation insights, and audience viewing metrics
  • Work closely with the central Data & Analytics team to enable high-quality Power BI reporting and analysis
  • Ensure strong data governance, integrity, and security across the Azure/Fabric ecosystem
  • Optimise data pipelines for performance, scalability, and efficiency, following best-practice engineering standards including version control and code reviews
  • Monitor pipeline health, data freshness, and quality, implementing proactive alerting and issue resolution
  • Translate business and analytical needs into well-structured data models and technical solutions
  • Automate data workflows to minimise manual processes and improve operational reliability
  • Maintain clear documentation of pipelines, datasets, and data flows to support collaboration and smooth handovers
  • Stay current with data engineering best practices, particularly within the Microsoft technology stack

Skills & Experience

  • 5+ years' experience working as a Data Engineer or in a similar role
  • Proven experience with cloud-based data platforms (Azure, AWS, SQL, Snowflake, Springserv); Microsoft Fabric experience is a strong plus
  • Strong proficiency in Spark SQL and PySpark, including complex transformations
  • Experience building ETL/ELT pipelines using tools such as Azure Data Factory or equivalent
  • Ability to write efficient, reusable scripts for transformation, validation, and automation
  • Hands-on experience integrating data from APIs (REST, JSON), including automated data collection
  • Solid understanding of data modelling best practices for analytics and dashboards
  • Confidence working with large, complex datasets across multiple formats (CSV, JSON, Parquet, databases, APIs)
  • Strong problem-solving skills and the ability to diagnose and resolve data issues
  • Excellent communication skills and experience working with cross-functional teams
  • Genuine curiosity about how data drives content performance, audience behaviour, and monetisation

If this sounds like the role for you, then please apply today.

Frequently asked questions
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.