Make yourself visible and let companies apply to you.
Roles
Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Data Engineer
Sanderson Recruitment
Bristol
Hybrid
Mid - Senior
£45,000
RECENTLY POSTED

Industry: Not-for-profit (Family Services)

Location: Bristol/ South Gloucestershire (Free parking available)

Salary: £45,000 - £48,000 + benefits

Permanent role: hybrid working (3 days on site)

Data Engineer - Role Purpose

We are embarking on a significant transformation of our data and analytics capabilities and are seeking a skilled Data Engineer to help build and shape our modern Data & AI Platform.

Working alongside the Head of Data & Analytics, you will design, develop and maintain secure, high-quality data pipelines that enable trusted reporting, analytics, and future AI/ML development. This is a rare opportunity to influence architecture, engineering standards, automation and governance within a co-managed delivery model.

You will work across structured and semi-structured data from key internal systems, including HR, care delivery, finance, estates, medication and incident management, to build reusable data pipelines, semantic models and certified datasets. Your work will directly support operational teams, strategic planning and improved outcomes for the people we support.

This role is ideal for someone who enjoys solving complex data challenges, building scalable solutions and embedding best-practice engineering within a collaborative, mission-driven environment.

Data Engineer: Technical Skills & Experience

We are looking for candidates with experience in:

  • Cloud or data platform technologies (e.g., Azure, Fabric, Databricks)
  • Operating and managing modern cloud-based data platforms
  • Integrating third-party data feeds
  • Exposure to DevOps or platform engineering
  • Strong SQL for transformation, modelling and optimisation
  • At least one data engineering programming language (e.g., Python)
  • Data modelling (dimensional, star schema, analytics-optimised models)
  • Building and maintaining production-grade ETL/ELT pipelines
  • Orchestration and scheduling tools
  • Version control (e.g., Git)
  • CI/CD principles for data workloads
  • Environment separation (Dev/Test/Prod)
  • Writing maintainable, testable and well-documented code
  • Applying GDPR and data protection principles, including privacy-by-design, retention, anonymisation and pseudonymisation
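For candidates less familiar with pseudonymisation in practice, here is a minimal sketch of one common approach: replacing a direct identifier with a keyed HMAC digest so records can still be linked without exposing the raw value. The key handling and field names below are invented for illustration and are not specified by this role.

```python
# Illustrative pseudonymisation sketch using a keyed hash (HMAC).
# A keyed digest (unlike a plain hash) cannot be reversed by simply
# hashing candidate values without the secret key.
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-me-in-a-vault"  # assumption: a managed secret

def pseudonymise(identifier: str) -> str:
    """Deterministically replace a direct identifier with a keyed digest."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"person_id": "9434765919", "visits": 12}
safe_record = {**record, "person_id": pseudonymise(record["person_id"])}
# The digest is stable for a given input, so joins still work downstream.
```

Because the mapping is deterministic per key, rotating the key breaks linkage to previously published datasets, which is often the desired retention behaviour.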

Data Engineer: Pay & Benefits

We recognise the importance of investing in our people and offer a competitive employment package, including:

  • 34 days annual leave (including public holidays)
  • Access to earned pay before payday
  • Company pension scheme
  • Generous occupational maternity/paternity pay
  • Ongoing learning and development opportunities
  • Health Cash Plan after probation (covering dental, optical, therapies, maternity/paternity, prescriptions and more)
  • Opportunities for career progression

How to Apply

This role is being recruited by Sanderson Recruitment. Please apply with your CV.

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

Data Modeller
DCV Technologies Limited
London
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

Position: Data Modeller
Location: London, UK (Hybrid, 2 days from office)
6 months contract position

The Role

In this role, you will define the data blueprints and foundational models that underpin how customers in dynamic, data-intensive industries operate, scale, and innovate. You will design robust, future-ready data models that enable seamless integration, advanced analytics, and AI-driven decision making across complex digital transformation programmes. With access to modern data platforms, high-quality datasets, and leading modelling frameworks, you will guide engineering, analytics, and architecture teams in building secure, scalable, and well-governed data structures. This role empowers you to shape end-to-end data ecosystems: accelerating delivery, enhancing data clarity, strengthening operational resilience, and driving organisations toward a more insight-rich, data-enabled future.

Your responsibilities:

Define and maintain conceptual, logical, and physical data models that accurately reflect business processes and support analytics, AI/ML, and operational needs.

Translate business requirements into robust data entities, attributes, relationships, and constraints; ensure traceability from requirements to models.

Establish and enforce GDM modelling standards and naming conventions (e.g., normalization, dimensional/star/snowflake patterns, data vault where appropriate).

Design dimensional models (facts, dimensions, hierarchies, slowly changing dimensions) for BI/analytics and performance at scale.
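Slowly changing dimensions are the least obvious of the concepts above, so here is a hedged sketch of a Type 2 update in plain Python: when a tracked attribute changes, the current row is closed out and a new versioned row is appended, preserving full history. The field names (customer_id, city, valid_from, valid_to, is_current) are illustrative, not taken from the posting.

```python
# Type 2 slowly changing dimension (SCD2) sketch: never overwrite,
# close the old version and append a new current one.
dim = [
    {"customer_id": 1, "city": "London",
     "valid_from": "2023-01-01", "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, effective_date):
    """Close the current row and append a new one when an attribute changes."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # no change: keep history as-is
            row["valid_to"] = effective_date   # close out the old version
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": effective_date, "valid_to": None,
                "is_current": True})

apply_scd2(dim, 1, "Leeds", "2024-06-01")
# dim now holds two versions: the closed London row and the current Leeds row.
```

In a warehouse this same logic is usually expressed as a MERGE statement over the dimension table, but the versioning pattern is identical.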

Create and manage canonical data models and semantic layers to enable consistent metrics and self-service analytics across domains.

Ensure data quality by design: define integrity rules, reference/master data relationships, and validation checks embedded in pipelines.

Optimise models for performance and cost (partitioning, clustering, indexing, compression, surrogate keys, distribution strategies).

Drive data integration design across sources (CDC, event streaming, APIs), mapping source-to-target, resolving conflicts, and handling historical changes.

Support AI/ML readiness by modelling features, aggregations, and histories; collaborate on feature stores and model input/output schemas.

Embed privacy and security requirements into models (PII classification, minimisation, masking, role-based access, retention, and residency).

Your Profile

Essential skills/knowledge/experience:

  • Proven experience delivering conceptual, logical, and physical data models for cloud data platforms, ideally GCP
  • Strong hands-on modelling for BigQuery (analytical/columnar patterns, denormalization strategy, partitioning & clustering considerations)
  • Expertise in data modelling approaches: 3NF, dimensional (Kimball), Data Vault, and hybrid patterns for Lakehouse designs
  • Maintaining versioned model artefacts (ERDs, schema scripts, JSON/YAML specs) and change logs; managing controlled evolution of models
  • Ability to translate banking domain requirements (Customer, Accounts, Payments, Credit, Risk, Finance) into scalable canonical models
  • Strong understanding of BigQuery performance and cost optimisation impacts driven by modelling choices (query patterns, storage, scan costs)
  • Experience designing data products for analytics and reporting with trusted definitions (facts, dimensions, SCD, conformed dimensions)
  • Strong knowledge of data governance: metadata management, lineage, stewardship, data quality rules, and critical data elements
  • Proficiency with data modelling tools such as ER/Studio, PowerDesigner, erwin, SQL Developer Data Modeler, or equivalent cloud-native tools
  • Familiarity with GCP ecosystem integration (e.g., Cloud Storage, Dataflow/Dataproc, Pub/Sub) and how ingestion patterns influence modelling

Desirable skills/knowledge/experience:

  • Experience supporting regulatory, risk, and audit needs through traceability, controlled definitions, and clear lineage to source systems
  • Capability in defining and enforcing enterprise standards (naming conventions, reference data, code sets, data contracts)
  • Experience with model versioning, impact analysis, and change control across multiple downstream consumers
  • Ability to partner closely with data engineers and architects to drive physical implementation, testing, and production readiness
  • Strong communication skills to explain complex data structures to both technical and non-technical stakeholders
  • Comfortable operating in Agile delivery: contributing to user stories, acceptance criteria, design assurance, and documentation
  • Solid understanding of data modelling techniques such as normalization, denormalization, Data Vault, semantic modelling, and canonical models
  • Familiarity with master data management (MDM), reference data frameworks, and designing conformed dimensions
  • Proficiency in SQL and data profiling to validate models against source data and ensure completeness/consistency

Data Scientist
DCV Technologies Limited
London
Hybrid
Senior
Private salary
RECENTLY POSTED

Position: Data Scientist
Location: London, UK (Hybrid, 2 days a week from office)
6 months contract position

The Role

In this role, you will uncover the insights and intelligence that help customers in dynamic, data-intensive industries operate, scale, and innovate. You will develop robust, future-ready machine learning and analytical models that enable predictive insights, automation, and data-driven decision making across complex digital transformation programmes. With access to modern data platforms, high-quality datasets, and advanced statistical and AI frameworks, you will work closely with engineering, product, and analytics teams to build solutions that are accurate, explainable, and scalable. This role empowers you to shape end-to-end analytical ecosystems: accelerating delivery, enhancing decision quality, strengthening operational resilience, and guiding organisations toward a more insight-rich, AI-enabled future.

Your responsibilities:

Explore, clean, and analyse large, complex datasets to uncover patterns, trends, and opportunities that drive actionable insights.

Develop, train, and validate machine learning, statistical, and predictive models that solve real business problems and deliver measurable impact.

Design and run experiments (A/B tests, hypothesis tests, simulations) to evaluate ideas, quantify outcomes, and guide decision making.
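The A/B testing responsibility above typically reduces to a standard hypothesis test; here is a hedged sketch of a two-proportion z-test for comparing conversion rates between variants. The conversion counts are invented for illustration.

```python
# Two-proportion z-test sketch for an A/B experiment outcome.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for a difference in rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal tail.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
# A small p-value suggests variant B's uplift is unlikely to be noise.
```

In practice this sits alongside power analysis (choosing n before the test) and guardrails against peeking, which the simulation-based approaches mentioned in the posting address more flexibly.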

Collaborate with data engineers, analysts, product managers, and domain experts to translate business requirements into well-defined modelling tasks.

Build end-to-end ML pipelines, from feature engineering and preprocessing to deployment-ready model outputs.

Apply advanced techniques such as NLP, time series forecasting, anomaly detection, optimisation, or LLM/GenAI methods where relevant.

Implement model evaluation frameworks using offline metrics, cross-validation, online experiments, and human-in-the-loop feedback.

Communicate insights clearly through dashboards, visualisations, written summaries, and presentations tailored to technical and non-technical stakeholders.

Ensure models are interpretable and explainable where required, providing transparency into key drivers and assumptions.

Work with engineering teams to deploy models into production, monitor performance, and retrain or recalibrate as data and conditions change.

Your Profile

Essential skills/knowledge/experience:

Role: Data Scientist (5 to 12 years' experience)

Hands-on experience with GenAI (Gemini or open-source LLMs), developing GenAI applications for code translation, text extraction, summarisation, SDLC optimisation, etc.

Hands-on experience with AI agents, chatbots, RAG (Retrieval-Augmented Generation), and vector databases (pgvector / Chroma DB)

Hands-on experience with GenAI performance evaluation tools such as Pegasus, Ragas, and DeepEval

Create conversational interfaces with React JS or other front-end components; develop and deploy AI agents using LangGraph, ADK, A2A, and MCP

Strong programming skills in Python (experience with LangChain/LangGraph/LangSmith frameworks) and TypeScript (preferable)

Solid understanding of LLMs, prompt engineering, and graph-based workflows.

Knowledge and implementation of input and output guardrails addressing hallucination, PII filtering, HAP, and bias

Experience implementing security best practices, including handling traffic spikes, denial-of-wallet attacks, DDoS attacks, and other spike-arrest strategies

Knowledge of API gateways and Istio, with the ability to diagnose and intercept failures in end-to-end communication

Hands-on Experience with API Development and Microservices architecture

Desirable skills/knowledge/experience:

Strong experience applying machine learning, statistical modelling, and predictive analytics to real world business problems.

Ability to collaborate with cross-functional teams to resolve end-to-end connectivity and data integrations

Experience working with large, complex datasets, including data cleaning, feature engineering, and exploratory data analysis.

Familiarity with LLMs, NLP techniques, and GenAI frameworks, including embeddings, prompt engineering, or fine tuning.

Experience building end-to-end ML pipelines, including model validation, optimisation, deployment, and monitoring.

Understanding of MLOps practices, including model versioning, model registries, CI/CD for ML, and automated training/inference workflows.

Ability to translate business problems into analytical tasks and communicate insights in a clear, concise manner to technical and non-technical audiences.

Knowledge of data governance, including data quality, lineage, ethics, privacy considerations, and responsible AI principles.

Comfort working with cloud platforms (GCP preferred) for model training, deployment, and scalable compute.

A growth-oriented mindset with enthusiasm for exploring new algorithms, tools, and emerging AI/ML techniques.

Senior Data Scientist
Daniel James Resourcing Ltd
Manchester
Hybrid
Senior
£90,000
RECENTLY POSTED

Senior Data Scientist - Decisioning & Pricing Intelligence

Manchester (Hybrid, 2 days onsite)
Up to £90,000 + bonus

The Opportunity

This is not a role where you sit in a notebook building models no one uses.

Our client is a market-leading SaaS organisation operating at scale within the automotive ecosystem, building the intelligence layer that drives real-world commercial decisions across Europe.

They are investing heavily in data and AI to power a next-generation decisioning platform, helping enterprise clients make high-value pricing, trading, and operational decisions across thousands of assets, in real time.

This is about building models that directly influence revenue, pricing, and market behaviour.

The Role

You'll take ownership of production-grade models that sit at the core of a live product used daily by commercial teams across multiple markets.

This includes:

  • Pricing intelligence
  • Buyer behaviour modelling
  • Stock segmentation
  • Recommendation systems

You'll work closely with product, engineering, and domain experts to ensure your work doesn't just function: it lands, scales, and delivers measurable impact.

What You'll Be Doing

  • Building and deploying machine learning models into production environments
  • Designing decisioning systems that optimise pricing, channel selection, and asset performance
  • Developing explainable models (SHAP, LIME, feature importance) that drive user trust
  • Creating frameworks to measure real-world impact and model effectiveness
  • Working with large, complex datasets across multiple markets
  • Collaborating with product and engineering teams to embed models into live systems
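The explainability bullet above deserves a concrete illustration. Alongside SHAP and LIME, a simple model-agnostic check is permutation feature importance: scramble one feature and measure how much the model's error grows. The toy pricing model and data below are invented stand-ins, not the client's.

```python
# Permutation feature importance sketch on a toy pricing model.
import random

random.seed(0)

def model(row):                      # stand-in for a fitted pricing model
    mileage, age = row
    return 20000 - 0.05 * mileage - 800 * age

data = [(random.uniform(0, 100000), random.uniform(0, 10)) for _ in range(500)]
y = [model(row) + random.gauss(0, 200) for row in data]

def mse(preds):
    return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)

baseline = mse([model(r) for r in data])

importances = {}
for i, name in enumerate(["mileage", "age"]):
    shuffled = [r[i] for r in data]
    random.shuffle(shuffled)
    permuted = [tuple(v if j != i else s for j, v in enumerate(r))
                for r, s in zip(data, shuffled)]
    # Importance = how much error grows when this feature is scrambled.
    importances[name] = mse([model(r) for r in permuted]) - baseline
# Larger values mean the model leans more heavily on that feature.
```

SHAP goes further by attributing each individual prediction to features, but permutation importance is often the first trust-building artefact shown to commercial stakeholders.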

What We're Looking For

  • Strong experience in machine learning, statistical modelling, or pricing / propensity modelling
  • Proven track record of delivering models that go into production and stay there
  • Experience working in commercial environments where data drives decisions
  • Strong Python (or R) experience across modern data science tooling
  • Familiarity with MLOps, model monitoring, and deployment pipelines
  • Ability to communicate complex outputs clearly to non-technical stakeholders

Senior Power BI Engineer/Databricks - Immediate Start
Templeton and Partners
London
Hybrid
Senior
£600/day - £750/day
RECENTLY POSTED

Senior Power BI Engineer - Greenfield Trading Project

  • London, UK (Hybrid)
  • Trading/Financial Services
  • £600 - £750 per day (Inside IR35)
  • 12 Month Contract
  • Immediate Start

Templeton & Partners are urgently hiring a Senior Power BI Engineer for a high-impact greenfield programme within a leading Trading & Financial Services organisation. This is a rare opportunity to shape a brand-new BI environment and work directly with Front Office trading teams.

We're looking for someone highly technical, confident engaging with senior stakeholders, and able to lead a small BI team while staying fully hands-on.

Role Overview

As the Senior Power BI Engineer, you will design and deliver dashboards, semantic models, and reporting solutions from scratch. You'll act as a BI Subject Matter Expert, collaborating with traders and senior business stakeholders to turn complex requirements into scalable, high-performance BI products.

This role is ideal for applicants who enjoy ownership, autonomy, and building solutions in a fast-paced environment.

Key Responsibilities

  • Lead and mentor a small BI team while remaining hands-on
  • Design, build, and optimise Power BI dashboards, semantic models, and reporting solutions
  • Work closely with traders and senior stakeholders to gather and translate requirements
  • Act as SME across BI, modelling, and reporting best practices
  • Support design and development of data solutions and moderately complex applications
  • Drive process improvements, performance optimisation, and BI governance
  • Independently manage workload and prioritisation in a fast-moving trading environment

Key Skills:

  • Microsoft Power BI
  • SQL
  • Data Modelling

BI & Reporting

  • Power BI development
  • Advanced DAX
  • Power BI Service
  • Data visualisation
  • Tabular Editor
  • DAX Studio
  • Measure Killer

Data Modelling & Architecture

  • Semantic modelling
  • Star schema design
  • Enterprise semantic models

Databases & Platforms

  • Snowflake
  • Azure Databricks

Data Engineering & ETL

  • ETL processes
  • DBT
  • IBM Data Manager

DevOps & Governance

  • GitHub
  • Version control
  • CI/CD
  • License management
  • Capacity optimisation

Nice to Have (Not Essential)

  • Microsoft Dynamics CRM
  • Intermediate Python
  • JavaScript

What's in it for you?

  • Greenfield project with real influence from day one
  • High visibility within Front Office trading
  • Opportunity to build BI architecture from scratch
  • Fast-paced, hands-on environment
  • Work with cutting-edge data tools and platforms
  • Immediate start available

How to Apply

If you're a Senior Power BI Engineer who wants to make a measurable impact in a trading environment, please apply with your CV or contact me directly for more details.

Senior Data Engineer
Aurora Energy Research
Oxford
Hybrid
Senior
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE

Based in Oxford, you will join the growing Business Data team within Internal Technology. Working alongside the Lead Data Engineer, you will play a key role in delivering the business data platform using the agreed technology stack, Microsoft Fabric. A key aspect of this role is collaborating with senior business stakeholders to define functional specifications for business intelligence reports that align with established technology standards. These specifications will be presented through the Technical Design Authority, while you take a hands-on role in building the required data infrastructure with the team to enable high-quality reporting and insights. As a senior member of the practice, you will also help define scalable, secure, and robust processes and best practices. The successful applicant will be passionate about solving business problems through technology, combining strong technical data expertise with excellent communication skills and a solid understanding of business operations. You will work in a creative and intellectually stimulating environment, enjoying autonomy and the opportunity to make a meaningful impact on Aurora’s data strategy. Your work will help ensure the business intelligence capability effectively leverages enterprise data to support the needs of a fast-growing, data-driven organisation.

Key Responsibilities

  • Work within the Architecture and Engineering function to build a centrally managed, governed data layer that provides insight across enterprise systems, including Microsoft Dynamics 365 F&O, Salesforce, SharePoint, SAP SuccessFactors (HRMS), and future platforms introduced as the company grows
  • Support the Lead Data Engineer in guiding key decisions on system integration and data flows, ensuring a scalable and well-managed ecosystem that enables efficient business operations and cross-system insights
  • Collaborate with internal teams to understand operational challenges, particularly around reporting requirements, and identify where technology and data solutions can deliver value
  • Help establish policies, standards, and best practices to ensure the business data layer remains scalable, secure, and robust
  • Guide junior team members to support high-quality technical delivery and effective ways of working

Required attributes:

  • Minimum of 2 years’ experience delivering data solutions to organisations, including: extraction of data from traditional database systems and online systems via API; transformation of data to support tactical and broad business insight requirements; dimensional modelling; and data governance
  • Demonstrable experience with Microsoft Fabric, Python, PySpark Notebooks, SQL, and SaaS and API-based data extraction
  • Strong experience analysing and manipulating numerical and business data, with a high level of analytical capability
  • Excellent time management, organisational skills, and attention to detail
  • Strong communication and interpersonal skills, with the ability to build relationships with stakeholders at all levels
  • Ability to work independently, manage competing priorities, and deliver to deadlines
  • Flexible and proactive approach to work, with a positive, team-oriented mindset
  • Delivery-focused, with a hands-on attitude and willingness to take ownership to get things done
  • Strong problem-solving skills and the ability to translate business challenges into data and technology solutions
  • Passion for technology and its application to solving real business problems

Desirable attributes:

  • Experience with reporting from any of our existing Tier 1 systems (Dynamics 365 Finance and Operations, Salesforce CRM, SAP SuccessFactors, Entra ID)
  • Experience with delivering Direct Lake data architectures with Microsoft Fabric
  • Experience with utilising CI/CD patterns and DevOps tooling to orchestrate configuration across environments
  • Experience working with Power BI (including Power Query and DAX)
  • Experience working with Logic Apps, Power Platform

What we offer

Some of the benefits we include are:

  • Private Medical Insurance
  • Dental Insurance
  • Parental Support
  • Salary-Exchange Pension
  • Employee Assistance Programme (EAP)
  • Local Oxford Discounts
  • Cycle-to-work Scheme
  • Flu Jabs

At Aurora we will consider all requests for flexible working. For most roles, the following types of flexibility are usually possible: a hybrid model of remote and in-office working, part-time hours, and flexible start and finish times. Please talk to us at interview about the flexibility we could offer and we will explore what’s possible for the role.

The Company is committed to the principle that no employee or job applicant shall receive unfavourable treatment on grounds of age, disability, gender reassignment, race, religion or belief, sex, sexual orientation, marriage and civil partnership, and pregnancy and maternity.

The successful candidate would start as soon as possible. The team will review applications as they are received. Salary will be competitive with experience. To apply, please submit your Résumé/CV, a personal summary, and your salary expectations, and please inform us of your notice period.

About Aurora Energy Research

From its academic roots, Aurora Energy Research is a thriving, rapidly growing company, currently serving over 950 of the world’s most influential energy sector participants, including utilities, investors, and governments. While we constantly strive to reach new markets and diversify our product portfolio, we are already active across the globe in Asia-Pacific, Latin America, Europe, South Africa and North America, working with leading organisations to provide comprehensive market intelligence, bespoke analytic and advisory services, and cutting-edge software. We are a diverse team of experts with vast energy, financial, and consulting backgrounds, covering power, hydrogen, carbon, and fossil commodities. With this, we provide data-driven intelligence to fuel strategic decisions in the global energy transformation.

Senior C++ Software Engineer - Systematic Trading
Templeton and Partners
London
In office
Senior
£600/day - £850/day
RECENTLY POSTED
  • Senior C++ Software Engineer - Systematic Trading
  • 12-month contract - £600 to £850/day (Inside IR35)
  • Global Trading & Supply | High-Performance Systematic Trading Technology

As a Senior C++ Software Engineer, you’ll work closely with technologists, quants, and traders to design, optimise, and scale our in-house global derivatives and systematic trading platform. You’ll operate across the full stack of proprietary trading technology - from ultra-low-latency exchange connectivity to strategy engines - with the autonomy and impact expected in a high-performance engineering culture.

You will be instrumental in ensuring ultra-low latency, deterministic performance, reliability, and scalability while shaping the next generation of our systematic and algorithmic trading systems.

What You’ll Work On Exchange Connectivity

  • Develop software that interacts with major global futures exchanges using native APIs & protocols (FIX, WebSocket, HTTP).
  • Maintain and extend testing suites ensuring robust, consistent connectivity.
  • Optimise exchange communication through Kernel bypass, TLS tuning, and advanced networking techniques.
  • Study and align with detailed exchange behaviours and microstructure.
  • Work across C++, Rust, Python, and TypeScript as needed.

Systematic & Algorithmic Trading

  • Build and enhance systematic trading strategies from quant and trader requirements.
  • Improve performance, reliability, and stability of the systematic and algorithmic trading engines.
  • Enhance monitoring, observability, and analytics for real-time trading operations.
  • Diagnose and resolve production issues (crashes, logic inconsistencies, latency bottlenecks).
  • Support systematic strategy deployment, backtesting workflows, and release preparation.

Technical Experience

  • 5+ years post-graduation C++ in financial markets, ideally in high-performance, low-latency environments.
  • Strong background in multi-threaded, asynchronous, and distributed systems.
  • Deep knowledge of algorithms, data structures, and performance optimisation.
  • Understanding of the full exchange-traded derivatives life cycle.
  • Strong scripting & automation skills (Python, PowerShell, C#, SQL, etc.).
  • Proven experience with application deployment best practices and proactive system monitoring.

Industry Experience

  • 8+ years in Trading, Systematic Trading, Capital Markets, or Investment Banking environments.
  • Familiarity with global futures exchanges and their native protocols.
  • Exposure to systematic or quantitative trading environments highly desirable.

Data Engineer
Boss Professional Services
Not Specified
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

Inside IR35 - Hybrid - Must have an active SC Clearance

Job Description:
An experienced Data Engineer is required to support the CDW SA team in delivering DWIT Informatica Advanced Scans across Legacy data warehouses. This role is a key enabler for the strategic migration from Legacy data warehouses to the new Data Lakehouse architecture.
The successful candidate will work closely with DWIT and CDW stakeholders to establish a comprehensive and accurate as-is view of each Legacy warehouse, leveraging Informatica Advanced Scanning capabilities wherever possible.

Key Responsibilities:

  • Perform detailed analysis of Legacy data warehouses using Informatica Advanced Scanning techniques.
  • Produce a complete and accurate as-is representation of each data warehouse.

Identify and document all inbound data feeds consumed by CDW, including:

  • Source/origin systems
  • Data formats
  • Full feed-by-feed data content

Capture and document the actual metadata held within the data warehouse, including:

  • Table structures
  • Field names and definitions

Identify and document all outbound data feeds from the data warehouse, including:

  • Consuming systems or platforms
  • Full feed-by-feed data content
  • Outbound metadata (tables, fields, and structures)

  • Where possible, generate bespoke outputs from Informatica Advanced Scanning for each Legacy warehouse to support analysis and migration planning.
  • Collaborate with architecture, data, and migration teams to ensure outputs are aligned to Data Lakehouse migration objectives.
  • Ensure documentation is clear, accurate, and suitable for use in downstream migration and governance activities.

Required Skills and Experience:

  • Strong experience with Unix Shell Scripting
  • PL/SQL and Oracle SQL
  • Maestro/TWS scheduling tools
  • ETL concepts and implementations
  • Command-line tools (PuTTY/CLI)
  • File transfer tools (e.g. WinSCP)
  • Use of text and documentation tools (e.g. Notepad, Word or equivalent)

Desirable Knowledge and Experience:

  • SA Profiles extracts
  • SA MARTs
  • SA Warehouse structures and concepts
  • Caseflow and KBS
  • Authorete
  • Eureka
  • Business Objects tooling: BO InfoView, BO Developer, BO CMC, BO Import Wizard

Additional Information:
This role is hands-on and delivery-focused, supporting a critical phase of the Legacy-to-Lakehouse migration.
Strong attention to detail, documentation quality, and stakeholder collaboration are essential.

If you are looking for your next opportunity, please contact me.

Data Quality Improvement Manager
Tria Recruitment
West Midlands
Hybrid
Mid - Senior
£65,000
RECENTLY POSTED

Up to £69,000 per annum + private healthcare + excellent pension

3 days per week in the Worcester office

A fantastic opportunity has arisen to join a purpose-driven, not-for-profit organisation that is heavily investing in new technologies to modernise systems, streamline processes, and enhance data-driven decision-making. As part of an organisation committed to improving efficiency through innovation, this is an exciting time to help shape the future of how data is governed, used, and valued across the organisation.

We are looking for an enthusiastic and experienced Data Quality Improvement Manager who will lead initiatives to enhance the quality, integrity and reliability of priority data across the organisation. You will play a central role in embedding data governance principles, ensuring data quality standards are defined, implemented and continuously improved.

What you’ll be doing:

  • Embedding the data governance framework and supporting teams to adopt data quality and metadata management best practices.
  • Managing data migration activities for acquisitions, mapping and profiling data, and ensuring alignment with SAP validation standards.
  • Monitoring, reporting and improving data quality KPIs; configuring data quality workflows and rules; and coordinating timely resolution of data issues.
  • Collaborating with Risk, Compliance and IT Security to ensure data governance supports wider assurance frameworks.
  • Supporting communication and training programmes to raise awareness of data governance principles across the organisation.

We’re looking for someone with:

  • Proven experience in Data Governance and Data Quality roles within a complex organisation.
  • Strong knowledge of Data Quality, Metadata, Master Data and Data Lifecycle Management.
  • Ability to analyse, diagnose and resolve data-related issues effectively.
  • Excellent stakeholder management, communication and influencing skills.
  • Advanced Excel skills and intermediate SQL.
  • Experience with Data Governance tools (e.g., Aperture Data Studio) would be advantageous.

If you’re passionate about improving data quality and want to make a meaningful impact within an organisation that genuinely contributes to social good, we’d love to hear from you. Apply today.

Celonis Process Mining Data Engineer
Gazelle Global Consulting Ltd
London
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

Hiring: Celonis Process Mining Data Engineer

Location: London, UK (Hybrid)

Employment Type: Contract

We are seeking a Celonis Data Engineer to join a leading financial services organization, playing a critical role in transforming complex banking data into actionable process insights. This role is key to enabling a data-driven, insight-led operating model within a highly regulated environment.

Key Responsibilities:

1. Data Engineering & Event Log Construction

  • Design, build, and maintain robust event log pipelines for process mining in Celonis
  • Translate raw process data into structured event logs (Case ID, Activities, Timestamps, Attributes)
  • Develop scalable, reusable, and high-performance event log frameworks
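The canonical event-log shape described above (Case ID, Activity, Timestamp) can be sketched in plain Python. This is a minimal illustration using stdlib SQLite; the `order_events` table, its columns, and the sample rows are hypothetical, not taken from the advert:

```python
import sqlite3

# Hypothetical source table: one row per order status change.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE order_events (
        order_id TEXT, status TEXT, changed_at TEXT
    )
""")
conn.executemany(
    "INSERT INTO order_events VALUES (?, ?, ?)",
    [
        ("O-1", "Created",   "2024-01-01T09:00:00"),
        ("O-1", "Approved",  "2024-01-01T10:30:00"),
        ("O-1", "Shipped",   "2024-01-02T08:00:00"),
        ("O-2", "Created",   "2024-01-01T11:00:00"),
        ("O-2", "Cancelled", "2024-01-01T12:00:00"),
    ],
)

# Project raw rows into the canonical process-mining shape:
# Case ID, Activity, Timestamp, ordered within each case.
event_log = conn.execute("""
    SELECT order_id   AS case_id,
           status     AS activity,
           changed_at AS timestamp
    FROM order_events
    ORDER BY order_id, changed_at
""").fetchall()

for row in event_log:
    print(row)
```

In a real Celonis pipeline the same projection would run against the source system's transactional tables; the point here is only the Case ID/Activity/Timestamp contract that every event log must satisfy.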

2. Data Model & Pipeline Development

  • Build and optimize ETL/ELT pipelines from multiple source systems
  • Manage data ingestion, transformation, and refresh cycles for Celonis datasets
  • Design and optimize process mining data models (CCPM & OCPM)
  • Handle large-scale transactional datasets while ensuring process integrity and traceability

3. Performance Optimization & Data Quality

  • Optimize queries, transformations, and data models for performance and scalability
  • Perform data validation, reconciliation, and root cause analysis
  • Identify and resolve data quality issues with proactive remediation strategies

4. Collaboration & Documentation

  • Collaborate with process analysts, business stakeholders, and functional teams
  • Document data models, ETL logic, and event log definitions
  • Enable analysis-ready datasets for business users within Celonis

5. Governance & Best Practices

  • Ensure compliance with data governance, security, and audit standards
  • Apply best practices including version control, modular design, and monitoring
  • Support continuous improvement and optimization initiatives

Required Skills & Experience

  • Strong experience in data engineering for process mining using Celonis
  • Hands-on expertise in event log creation, data pipelines, and transformation frameworks (CCPM & OCPM)
  • Strong proficiency in SQL and Python
  • Solid understanding of data modeling and ETL/ELT concepts
  • Experience working with large datasets and optimizing analytical workloads

Nice to Have

  • Understanding of process mining concepts and analytical data structures
  • Experience in Banking / Financial Services (BFSI)
  • Exposure to KYC Operations
  • Strong documentation, analytical, and stakeholder management skills

Master Data Lead
Mackenzie Jones IT
London
Hybrid
Senior
£75,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

£75k + 18% Bonus + Benefits
Hybrid - West London - 3 Days Onsite
Must be Eligible to work in the UK - Cannot Provide Sponsorship

Leading organisation is seeking a Master Data Lead - focused on owning and evolving the Master Data Governance Framework.

Role:

  • Master Data Lead accountable for - Data Governance, Quality & Integrity
  • Master Data Governance across - Financial, Pricing, SKU, Customer & Key Reference Data
  • Define & own Master Data Policies, Standards, Controls & Governance Framework
  • Lead Data Governance across - Finance, IT, Commercial, Supply Chain & Retail
  • Master Data across core systems in UK&I - including SAP S/4, VisualFabriq, Retail Systems & additional platforms
  • Establish Data Ownership, Stewardship & Approval Workflows
  • Ensure Data is defined, accurate, consistent & maintained across all systems
  • Design & implement Data Quality Controls - preventative & detective
  • Monitor Data Quality Metrics & lead Root Cause Analysis where required
  • Enable high-quality Master Data - supporting reporting, analytics, operational efficiency & transformation
  • Establish Data Forums, governance cadences & quality review cycles
  • Act as Master Data Lead for SAP S/4 & Digital Transformation programmes
  • Support Reporting & Analytics - ensuring consistent hierarchies, attributes & business rules
  • Partner with Finance & Analytics teams to resolve data-related reporting issues
  • Drive Continuous Improvement - simplify, standardise & automate Master Data processes

Skills & Experience Required:

  • Master Data, Data Governance or Data Management leadership experience within complex environments
  • Data Governance Frameworks, Policies & Controls - Strong experience
  • SAP S/4 HANA or SAP ECC6 experience - Master Data
  • Experience across multiple systems & data domains
  • Ability to Design & embed Governance Frameworks
  • Strong stakeholder engagement - influencing senior stakeholders
  • Strategic mindset & pragmatic delivery approach
  • Strong analytical capability & attention to detail
  • Experience operating within transformation environments
  • Collaborative, people-focused approach

Data Analyst
Adria Solutions Ltd
Derbyshire
Remote or hybrid
Junior - Mid
£40,000 - £50,000
RECENTLY POSTED

We are looking for a Data Analyst to improve data quality, governance, and availability across the business. You will help create gold datasets, establish lightweight governance, and automate quality checks to support accurate, trusted reporting.

Key Responsibilities

  • Inventory and profile existing data sources (apps, databases, files, SaaS).
  • Design and implement data cleansing and standardisation pipelines.
  • Maintain data dictionaries, lineage diagrams, and semantic layers.
  • Establish pragmatic governance: data ownership, KPIs, thresholds, and access controls.
  • Implement automated data quality checks, dashboards, and root-cause remediation.
  • Ensure trust, consistency, and timeliness of critical business datasets.
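Automated data quality checks of the kind listed above can be sketched in stdlib Python. The datasets, field names, rules, and thresholds below are illustrative assumptions, not taken from the advert:

```python
# Hypothetical datasets: a customer table and an orders table referencing it.
customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
    {"id": 3, "email": ""},               # fails the validity rule
]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 2},
    {"order_id": 12, "customer_id": 99},  # orphan: fails referential integrity
]

def validity(rows, rule):
    """Share of rows passing a field-level validity rule."""
    return sum(1 for r in rows if rule(r)) / len(rows)

def referential_integrity(child, parent, fk, pk):
    """Share of child rows whose foreign key exists in the parent set."""
    keys = {r[pk] for r in parent}
    return sum(1 for r in child if r[fk] in keys) / len(child)

email_validity = validity(customers, lambda r: "@" in r["email"])
order_ri = referential_integrity(orders, customers, "customer_id", "id")

print(f"email validity: {email_validity:.0%}")
print(f"order referential integrity: {order_ri:.0%}")
```

In practice checks like these would run on a schedule against the warehouse, feed a dashboard, and raise alerts when a metric drops below its agreed threshold.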

Success Measures

  • Governance charter agreed with stakeholders and data owners appointed.
  • Cleansing pipelines operational with automated checks for key datasets.
  • Gold datasets live for priority business use cases.
  • Data quality metrics: 98% validity, 99% referential integrity, duplicates kept below agreed thresholds.
  • 90% of reporting sourced from curated datasets; issues resolved at source.

Key Traits

  • Quality-focused and data-driven; fixes at the source.
  • Pragmatic, enforceable governance mindset.
  • Analytical with strong problem-solving skills.
  • Clear communicator bridging business and technical teams.
  • Ownership-driven and system-aware.

Skills & Experience

  • 2+ years in data analysis/engineering, hands-on profiling, cleansing, and modelling.
  • Experience building repeatable data pipelines in cloud/data warehouse platforms (e.g., Azure/Synapse/Fabric, Snowflake, BigQuery, Redshift).
  • Proficient with data prep and QA automation; strong understanding of master/reference data practices.
  • Knowledge of access controls, PII handling, retention, and audit requirements.

Preferred: Governance framework experience, streaming ingestion/schema evolution, ML-assisted entity resolution, retail/wholesale domain knowledge, familiarity with BI tools (Power BI, Looker, Tableau).

Interested? Please Click Apply Now!

Data Engineer
IO Associates
London
Hybrid
Senior
£52,500 - £75,000
RECENTLY POSTED

Exciting Opportunity: Data Engineer

Our Client, a leading organisation in the data and technology sector, is seeking a talented Data Engineer to join their dynamic team. Known for fostering an innovative and collaborative environment, our Client is dedicated to advancing data-driven solutions within the regulatory landscape. They promote a culture of continuous learning, agility, and excellence, making it an inspiring place to develop your career.

Role Overview
This pivotal role is a response to strategic growth and the ongoing digital transformation within our Client. As a Data Engineer, you will be at the forefront of designing and implementing high-quality data solutions that significantly impact decision-making processes. Your expertise will help shape the future of data management and analytics, underpinning critical initiatives in a fast-paced, technology-enabled environment.

Key Responsibilities

  • Lead technical designs for scalable data platforms within an AWS Cloud environment, guiding the implementation of robust solutions.
  • Provide expert advice on architecture to ensure data solutions meet business needs and comply with regulatory standards.
  • Review deliverables from various teams, ensuring adherence to best practices such as CI/CD, automated testing, and high-quality coding standards.
  • Identify opportunities for product enhancements, proposing innovative solutions that deliver measurable benefits.
  • Collaborate with senior professionals, stakeholders, and third-party providers to facilitate technology change and promote a data-centric approach.
  • Foster a culture of agility, continuous improvement, and rapid delivery by applying modern engineering practices.

Essential Skills & Experience

  • Extensive experience designing and developing end-to-end ETL pipelines on AWS and Cloudera data lake environments, supporting high-volume data ingestion with secure, scalable architectures.
  • Proven track record delivering enterprise data solutions using tools such as Talend ETL, Python, and PySpark.
  • Strong understanding of big data technologies, data governance, security, and compliance frameworks.
  • Demonstrated ability to work effectively across diverse teams and manage complex integrations.
  • Passionate about innovation with a focus on enhancing performance, reliability, and operational efficiency in data engineering.

Desirable Skills & Experience

  • Prior experience with DataOps practices and automated testing approaches such as TDD and BDD.
  • Familiarity with additional data processing tools and cloud services.
  • Knowledge of regulatory requirements specific to data handling within a regulated environment.

Application Process
If you possess the skills and experience outlined above and are eager to make a meaningful impact within a forward-thinking organisation, we encourage you to apply. Please submit your CV today to be considered for this exciting opportunity to contribute to transformative data initiatives.

Machine Learning Engineer - £110k – £130k – Geospatial Tech 4 Good
Opus Recruitment Solutions
London
Fully remote
Mid - Senior
ÂŁ110,000 - ÂŁ130,000
RECENTLY POSTED

Machine Learning | Deep Learning | Time Series | Climate | Remote Sensing | PyTorch | scikit‑learn | Geospatial | AWS | MLOps | Python | Risk Modelling | FinTech

Do you want to work with a business building AI‑native data systems that bring clarity and credibility to nature‑based assets? A business tackling complex, real‑world environmental challenges, helping organisations make high‑impact decisions around risk, resilience and commercial performance? This is the chance to join a climate‑tech scale‑up as a Machine Learning Engineer, applying cutting‑edge Machine Learning to satellite data, weather models and environmental signals, reshaping how nature is valued in real‑world decision‑making. Joining their AI team, you’ll design and deploy models that forecast climate volatility, detect vegetation stress, and generate risk‑driven insights from remote sensing and time‑series data. You’ll work across AI, climate science, geospatial modelling and scalable pipelines, contributing meaningfully from day one.

What you’ll be working on:

  • Building and evaluating Machine Learning/DL models for satellite, weather and climate data
  • Forecasting environmental and risk‑related signals (volatility, vegetation stress, land‑surface change)
  • Developing geospatial and remote‑sensing models (Sentinel‑1/2, GEDI, optical, radar, LiDAR)
  • Creating time‑series and forecasting models for environmental change
  • Translating business questions into robust modelling problems
  • Turning research prototypes into scalable, reproducible AI pipelines
  • Communicating assumptions, uncertainty and results clearly

The must‑haves:

  • Strong background in Machine Learning, DL and Applied Statistics
  • Time‑series modelling + backtesting
  • Experience with geospatial and climate datasets
  • Python stack: PyTorch, scikit‑learn, scipy
  • Reproducible workflows (Git, AWS/cloud, W&B)

Nice‑to‑haves:

  • Risk modelling, financial time series, portfolio optimisation (great for FinTech/quant backgrounds)
  • Climate/weather datasets (CMIP, forecast data)
  • Geospatial tools: rasterio, xarray, geopandas, GDAL
  • Remote sensing (optical, radar, LiDAR)
  • MLOps: CI/CD, containerisation, monitoring
  • Startup or fast‑paced product environment

The role offers £110k–£130k, a global team environment, and the chance to shape the future of AI‑powered environmental and risk intelligence. If it ticks those boxes, don’t hang about, message me: (url removed)

Senior Data Analyst
HARRIS HILL
London
Hybrid
Senior
Private salary
RECENTLY POSTED

Senior Data Analyst | CRM Migration | SQL Focus | 9-Month FTC

Location: London (Hybrid, 2 days onsite)
Contract: 9-month fixed-term; immediate start

Available now and ready to make an impact on a major CRM transformation?

We’re supporting a purpose-led organisation undergoing a migration from Dynamics 365 to Salesforce. They’re looking for a Senior Data Analyst to play a key role in preparing, analysing, and validating data to ensure a smooth and effective transition.

This is a hands-on role where you’ll work across teams to resolve data issues, shape migration decisions, and influence how data is structured and used going forward.

The role

  • Use advanced SQL to extract, interrogate, and validate complex datasets
  • Identify, investigate, and help resolve data quality issues in source systems
  • Work closely with business stakeholders to provide insight and context for data fixes
  • Support decisions around data retention, cleansing, and migration scope
  • Review existing reporting to inform migration approach and future reporting needs
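One such validation step, reconciling a source system against the migration target, can be sketched with stdlib SQLite. The table and column names are hypothetical, not taken from the advert:

```python
import sqlite3

# Hypothetical pre- and post-migration tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_contacts (id INTEGER, email TEXT);
    CREATE TABLE target_contacts (id INTEGER, email TEXT);
    INSERT INTO source_contacts VALUES
        (1, 'a@x.com'), (2, 'b@x.com'), (3, 'c@x.com');
    INSERT INTO target_contacts VALUES
        (1, 'a@x.com'), (2, 'b@x.com');
""")

# Reconciliation query: which source rows never arrived in the target?
missing = conn.execute("""
    SELECT s.id, s.email
    FROM source_contacts s
    LEFT JOIN target_contacts t ON t.id = s.id
    WHERE t.id IS NULL
""").fetchall()

print(f"{len(missing)} row(s) not yet migrated: {missing}")
```

The same anti-join pattern scales to production databases; run it per entity (contacts, accounts, activities) and treat any non-empty result as a migration defect to investigate.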

What you’ll bring

  • Strong SQL skills with experience working on large, complex databases
  • Proven ability to analyse and interpret data for both technical and non-technical audiences
  • Experience working with CRM systems (Dynamics 365, Salesforce, or similar)
  • A structured, problem-solving approach to data quality and reconciliation

Nice to have:

  • Experience supporting or delivering data migration projects
  • Power BI (including DAX and Power Query)
  • Exposure to data governance or data privacy practices

Why apply?

  • Play a central role in a high-profile transformation programme
  • Work in a collaborative, data-led environment where your insights matter
  • Opportunity to shape data quality, structure, and reporting for the future state
  • Immediate start with quick impact and visibility

If you’re ready to step into a high-impact role and get stuck into meaningful data challenges from day one, let’s talk.

Local Land Charges Spatial Data Officer
LB RICHMOND UPON THAMES & LB WANDSWORTH
London
Hybrid
Junior - Mid
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE

Job Title: Local Land Charges Spatial Data Officer

Salary Range: SSA Scale SO2 - £37,602 - £45,564

Fixed Term: 12 month fixed term appointment

Full Time/Part Time/Term Time Only: Full Time

Location: Hybrid

Objective of role

Wandsworth and Richmond Councils have an exciting opportunity within their Local Land Charges Team, we are recruiting a Local Land Charges Spatial Data Officer.

This Officer will ensure the Councils continue to deliver an excellent Local Land Charges Service by the accurate creation and maintenance of a wide range of spatial data relevant for the Local Land Charges service. You will also provide training and support on the use of GIS within the Team and Service.

About the role

The Officer will be responsible for applying best GIS practice to the collation, analysis and creation of spatial and address related data, ensuring that the spatial data sets are complete by working with data owners to ensure good quality data is available for use by the Local Land Charges Officers.

You will also review and deliver improvements to the quality and format of a wide range of statutory LLC and related spatial datasets and associated metadata.

As part of the HM Land Registry project you will assist in the data transfer and data cleansing tasks to ensure the key milestones and project outcomes are met.

Essential Qualifications, Skills and Experience

  • Geographic Information Systems (GIS) Expertise - Proven working knowledge of MapInfo (v12+) or QGIS with Microsoft Excel, and ideally experience with additional GI systems.
  • High Quality Data Output and Accuracy - Attention to detail with the ability to produce high-specification, accurate outputs.
  • Time Management and Workload Prioritisation - Working effectively to meet deadlines and manage competing and changing priorities.
  • Proactive Data Quality and Process Improvement - Reviewing spatial datasets, identifying issues, and proposing improvements to data and processes.

Indicative recruitment timeline

Closing Date: Monday 6th April 2026 (23:59)

Shortlisting Date: W/C 6th April 2026

Interview Date: W/C 13th April 2026

We are proud to be a Disability Confident employer. If you require any reasonable adjustments throughout the recruitment and selection process, please let us know.

We are also committed to safeguarding and promoting the welfare of children and young people/vulnerable adults and expect all staff and volunteers to share this commitment. Some posts may be exempted under the Rehabilitation of Offenders Act 1974, and as such appointment to these posts will be conditional upon the receipt of a satisfactory response to a check of police records via the Disclosure and Barring Service (DBS).

We offer a wide range of benefits designed to attract, develop, and reward our employees such as 40 days annual leave (including Bank Holidays), flexible working and a generous pension plan.

Azure Data Engineer - Contract outside IR35 - UK remote
Infused Solutions Ltd
Not Specified
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED

Contract Azure Data Engineer

Location: UK (Remote)
Contract: Initial 3 months (Outside IR35)
Rate: Negotiable
Start Date: Immediate (Interview slots available)

Overview

We are currently seeking an experienced Azure Data Engineer to join a fast-paced project on a fully remote basis. This is an exciting opportunity to work on building and optimising scalable data platforms within a cloud-first environment.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines
  • Develop and manage ETL processes across Azure data platforms
  • Work with Azure Data Factory, Azure Data Lake, and Azure SQL
  • Collaborate with stakeholders to integrate data from multiple sources
  • Support data warehouse development and optimisation
  • Implement best practices across data engineering and pipeline performance
  • Contribute to CI/CD pipelines and deployment processes

Required Skills & Experience

  • Strong experience with Azure Data Factory, Azure Data Lake, and Azure SQL
  • Solid background in ETL development and data pipeline engineering
  • Proficiency in Python and SQL
  • Experience with Spark and Databricks
  • Strong understanding of data warehousing concepts
  • Experience building and maintaining scalable data architectures

Desirable Skills

  • Experience with Azure Synapse
  • Knowledge of CI/CD pipelines and DevOps practices
  • Experience with data integration and orchestration across multiple systems

Additional Information

  • Outside IR35
  • Fully remote (UK-based candidates only)
  • Immediate start available
  • Fast interview process

Azure Snowflake DevOps Engineer
Robert Half
London
Hybrid
Mid - Senior
£400/day - £450/day
RECENTLY POSTED

DevOps + Snowflake Data Engineer (Associate/Contract)

Robert Half are looking for an interim DevOps + Snowflake Data Engineer to join as an Associate and support a global consulting firm delivering a major enterprise data transformation programme for an insurance client. The engagement sits within a Divisional Data Team responsible for building and managing an enterprise-wide data platform, improving reporting capabilities and embedding modern data governance and engineering best practices. This role blends Snowflake platform administration, DevOps engineering and data platform optimisation to ensure the reliability, security and performance of a modern cloud data estate.

Assignment Details

  • Rate: c. £450 + 12.07% per day via PAYE (with employer’s NI & tax deducted at source, unlike umbrella companies, and no umbrella company admin fees)
  • Location: London - Hybrid (2 days per week onsite)
  • Contract Length: c. 6 months rolling
  • Start Date: c. 2-3 weeks subject to interview and onboarding

Responsibilities

  • Administer and manage the Snowflake data platform across development, test and production environments
  • Implement security controls including RBAC, SSO/MFA authentication, data encryption and masking
  • Build and maintain CI/CD pipelines for Snowflake deployments using Azure DevOps or GitHub
  • Automate operational and deployment processes across the data platform
  • Monitor platform health and performance and troubleshoot pipeline failures or operational incidents
  • Manage Snowflake warehouses, databases, schemas and access roles
  • Optimise query performance, warehouse sizing and Snowflake credit consumption
  • Support database change management, secrets management and role hierarchy structures
  • Work closely with data engineering and networking teams to resolve pipeline or infrastructure issues
  • Maintain Artifactory certificates and resolve Snyk/SonarQube security scanner issues

Key Experience

  • Strong hands-on experience administering Snowflake data warehouse platforms
  • Experience implementing DevOps practices within data engineering or cloud environments
  • Experience building CI/CD pipelines using Azure DevOps or similar tooling
  • Strong SQL and data warehouse engineering experience
  • Experience using DBT or modern data transformation frameworks
  • Experience implementing RBAC, data security controls and governance policies
  • Experience monitoring data platform performance and managing operational incidents
  • Understanding of Azure cloud environments and modern data platforms
  • Exposure to Terraform, infrastructure as code or platform automation is beneficial
  • Experience within insurance or financial services environments is highly desirable

Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data:

Product Data Lead
Rise Technical Recruitment
Poole
In office
Senior
£40,000 - £43,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

Poole, UK - 5 Days On-site
£40,000 - £43,000 + Profit Share Bonus + Benefits

This is an excellent opportunity for a data professional to take full ownership of the end-to-end product data lifecycle within a high-growth, global industry. It is a perfect fit for someone who enjoys bridging the gap between technical system management (PIM/ERP) and cross-functional leadership.

This company is a leading provider of essential products for businesses across various sectors. They specialise in delivering a comprehensive range of high-quality janitorial, catering, and packaging supplies, helping organisations maintain efficiency and hygiene in their operations.

In this varied role, you will act as the organisation’s product data authority. You will define the data strategy, govern standards, and oversee the lifecycle of products from initial creation through to change control and retirement. You will serve as the functional lead for PIM (Perfion) and ERP systems, ensuring data flows accurately to support sales, procurement, finance, and global logistics.

The ideal candidate will possess a strong background in Master Data Management (MDM) or Product Information Management (PIM), ideally within FMCG, wholesale distribution, or a related sector. You should be an analytical thinker with an eye for detail, capable of translating complex product attributes into scalable, compliant data structures that meet ESG and ISO certifications.

This is a fantastic opportunity where you will have the chance to make a significant impact on the company’s digital transformation and operational efficiency from day one.

The Role:

  • Lead PIM and ERP strategy.
  • Govern end-to-end data lifecycles.
  • Partner with procurement and category teams.
  • Ensure data accuracy and compliance.

The Person:

  • Expert in PIM and master data (MDM).
  • Advanced ERP and analytical skills.
  • FMCG or wholesale distribution background.
  • Able to commute to Poole 5 days a week.

Reference: BBBH(phone number removed)

Rise Technical Recruitment Ltd acts as an employment agency for permanent roles and an employment business for temporary roles.

The salary advertised is the bracket available for this position. The actual salary paid will be dependent on your level of experience, qualifications and skill set and will be decided by our client, the employer. Rise are not responsible or liable for any hiring decisions made by the end client.

We are an equal opportunities company and welcome applications from all suitable candidates.

Proclaim Developer
Express Solicitors
Manchester
Hybrid
Mid - Senior
£40,000 - £50,000
RECENTLY POSTED

Job Title: Proclaim Developer

Location: Sharston, Manchester, M22 4SN

Salary: £40,000 - £50,000 per annum, dependent on experience

Job type: Full time, Permanent

About Us:

Established in 2000, Express Solicitors is an award-winning law firm that deals with personal injury and clinical negligence claims. Based in Manchester, we serve clients nationwide and are currently ranked 64 out of more than 10,000 law firms. We have a 5-star rating on Trustpilot from over 8,000 reviews, which coming from our clients means a lot to us. We are proud of the work we do helping injured people, and this is the core of our business.

The Role:

Express is currently looking to appoint a Proclaim Developer, who will be responsible for the design, development, and day-to-day administration of the Proclaim case management system.

The role will be working with our Development Manager, and seven other developers and where necessary third parties, in particular Access Legal Systems to ensure that all identified improvements to the system can be integrated efficiently and with minimal impact to end users.

In addition to development, the role will encompass the day-to-day administration of the Proclaim system including template maintenance, user configuration, task server administration and report design and execution.

The role will require co-operation with the general IT department and Operations department.

Person Specification:

Essential:

  • General understanding of common database programming and query languages.
  • Excellent analytical and problem-solving skills.
  • Effective communication skills allowing reporting on a non-technical level of work in progress to senior stakeholders.
  • Ability to clearly document and evidence planned changes using project and workflow documentation and retain current configurations for recovery purposes.
  • Excellent organisation and time management skills.
  • Must understand the concept of internal customers and ensure that internal stakeholders remain confident in the ability of Proclaim to suit the business.
  • Strong negotiation skills and the capability to deal firmly with external companies to manage projects and maintenance effectively.

Desired:

  • Up to and including Proclaim Technical Level 4 training attended or ability to demonstrate equivalent practical experience including:
  • A minimum of 5 years’ experience in a similar Proclaim development role.
  • Creating new case types, database fields and correspondents.
  • Screen design and intelligence.
  • Using maths fields and tests to perform specific functions.
  • Workflow maintenance including creation of linked actions, forms, secure documents, and master documents.
  • Advanced Report training or equivalent practical experience of designing, amending and scheduling reports.
  • Understanding of design and execution of SQL queries.
  • Experience of designing and using auto routines.
  • Import/Export routines.
  • Task server configuration, scheduling, and troubleshooting.
  • Familiarity with Proclaim v3.5 and its additional features.
  • Experience of design and execution of Macros.
  • Knowledge of Proclaim Accounts system and Sage accounting software.
  • Experience of Personal Injury, Clinical Negligence, Medical Agency, and Costs Case types.
  • Knowledge of SQL or similar database languages and ASP, PHP or similar for Web interfacing.
  • HTML knowledge.
  • Understanding of web services and multi-platform system interactions.
  • Knowledge of Proclaim MI Warehouse
  • Understanding of Proclaim A2A integration and maintenance.
  • Experience of creating and maintaining Proclaim Secure Docs.
  • A practical knowledge of Windows based networks including Active Directory and Exchange.

Salary & Hours

Salary of £40,000 - £50,000, dependent on experience

Our standard working hours are 8:30am to 5:30pm Monday-Thursday and 8:30am to 5pm Friday. It may on occasion be necessary to perform maintenance outside of core hours and therefore a flexible approach is required.

Benefits:

Hybrid Working - Remote or hybrid working available

23 Days Holiday - Rising to 26 days, plus bank/public holidays.

Extra Holidays - 3 holiday buy backs and an extra day for your birthday after service length requirement.

Looking After Your Health - Private medical insurance available after 2 years’ service, annual flu jab and Employee Assistance Programme.

Looking After Your Well-being - 24/7 onsite Gym, Netball/Football team, 10km Manchester team and more.

Work Life / Balance - Active social committee with generous departmental and firm-wide social budget.

Recruitment Process:

Interviews will be conducted by MS Teams and will include scenario-based questioning.

Our employees are our most important asset, we rate skill and ability above all else and our recruitment policy encourages applications from all.

Please click APPLY to be redirected to our website to complete your application.

Candidates with the relevant experience or job titles of; Proclaim Developer, Case Management Developer, Law, Web Developer, Web Designer SQL Queries, Proclaim Accounts may also be considered for this role.

Data Engineer
Consortium Professional Recruitment Ltd
Driffield
In office
Mid - Senior
ÂŁ50,000 - ÂŁ60,000
RECENTLY POSTED

Location: Hessle
Salary: ÂŁ50,000 - ÂŁ60,000, depending on experience
Contract Type: Permanent

We're looking for a Data Engineer with experience in Business Intelligence to join a forward-thinking organisation modernising its data ecosystem. This role is ideally suited to someone operating at a strong mid-level, looking to deepen their technical capability while gaining exposure to more advanced data engineering and architectural concepts.

As a Data Engineer, you'll work with data from systems such as SAP, Salesforce and production environments, helping to ensure it is connected, structured and ready to support meaningful insight across the business. This is an opportunity to step into a role where you can build on solid foundations, while being challenged and supported to grow into a more senior position.

Key Responsibilities as a Data Engineer:

  • Design, build and maintain data pipelines using Azure Data Factory, with increasing ownership as you develop in the role
  • Work with stakeholders and technical colleagues to support the development of scalable, reliable data models
  • Contribute to the ongoing development of the data platform, including data lakes and integration of sources such as SAP, Salesforce and production systems
  • Collaborate with the BI Developer to deliver accurate and engaging Power BI dashboards and reports
  • Support data governance, data quality and best practice initiatives across the business
  • Play an active role in improving how data is used across departments, helping to turn data into clear, actionable insight

Skills and Experience:

  • Experience working with Azure Data Factory and cloud-based data environments
  • Strong SQL skills for querying, transformation and optimisation
  • Experience building or supporting data pipelines and ETL processes
  • Exposure to data modelling and data warehousing concepts
  • Working knowledge of Power BI or similar BI tools
  • Ability to engage with stakeholders and translate business requirements into data solutions
  • A proactive approach, with an interest in developing into a more senior or lead-level role over time
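As an illustration of the pipeline and ETL skills listed above, here is a minimal, hypothetical sketch in Python using only the standard library: data is extracted from a source, cleaned in a transform step, and loaded into a database for querying. The table, column names and sample records are invented for illustration; a production pipeline would use tooling such as Azure Data Factory rather than hand-rolled scripts.

```python
# Minimal illustrative ETL sketch. All names and data here are hypothetical.
import sqlite3

def extract():
    # In practice this would pull from a source system such as SAP or Salesforce.
    return [
        {"order_id": 1, "amount": "120.50", "region": "North"},
        {"order_id": 2, "amount": "80.00", "region": "south"},
        {"order_id": 3, "amount": None, "region": "North"},  # incomplete record
    ]

def transform(rows):
    # Drop records with missing amounts, cast amounts to numbers,
    # and normalise the region field's capitalisation.
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"]),
         "region": r["region"].title()}
        for r in rows
        if r["amount"] is not None
    ]

def load(rows, conn):
    # Load the cleaned records into a reporting table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders (order_id, amount, region) "
        "VALUES (:order_id, :amount, :region)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 200.5
```

The same extract/transform/load shape carries over to larger pipelines; only the sources, sinks and orchestration change.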

Desirable Experience:

  • Exposure to enterprise data sources such as SAP, Salesforce or production systems
  • Awareness of data governance, data quality or data privacy standards
  • Experience contributing to improvements in data processes or standards
  • Interest in mentoring, knowledge sharing, or taking on increased responsibility

Why This Role?
This is an opportunity to join a business investing heavily in its data and analytics capability, where data is becoming central to decision-making and performance. You'll work with modern Azure technologies, a range of complex data sources, and a team that values both technical quality and practical business impact.

For someone looking to move beyond pure execution, this role offers genuine scope to grow. You'll gain exposure to data architecture, platform development and cross-functional collaboration, with the opportunity to take on greater ownership and responsibility as your experience develops.

Consortium is delighted to represent this opportunity exclusively and welcomes applications from candidates who are ready to take the next step in their data engineering career.
Consortium Professional Recruitment Ltd are a professional-level recruitment consultancy specialising in the delivery of high-relevance recruitment services on behalf of our clients across the UK. We regularly receive a high volume of applications, which can make providing individual feedback challenging. If you have not received a response within 14 days, your application has unfortunately been unsuccessful on this occasion. We will retain your details for future opportunities unless you advise otherwise. To learn more about our services, please visit (url removed)

Frequently asked questions
What skills and qualifications does a Data Engineer need?
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.

What types of Data Engineer jobs are available on Haystack?
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.

How can I improve my chances of landing a Data Engineer role?
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.

Are there entry-level Data Engineer jobs on Haystack?
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.

Can I find remote or flexible Data Engineer jobs?
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.