Contract Data Engineer Jobs
Overview
Looking for contract Data Engineer jobs? Explore top freelance and contract opportunities for Data Engineers on Haystack. Find flexible, high-paying gigs where you can design, build, and optimize data pipelines using the latest tools and technologies. Start your contract Data Engineer career today with Haystack’s curated listings!
Data Engineer Lead (Openshift)
Infoplus Technologies UK Ltd
Sheffield
Remote or hybrid
Senior
£450/day - £480/day
RECENTLY POSTED

Key Responsibilities:

  • Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
  • Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
  • Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
  • Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
  • Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
  • Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
  • Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
  • Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
  • Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
  • Ensure security, compliance, and best practices for data pipelines and observability platforms.
  • Document data flows, schemas, dashboards, and operational runbooks.
Required Skills:

  • Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/KSQL/KStreams).
  • Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
  • Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
  • Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
  • Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
  • Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
  • Understanding of hybrid cloud and multi-cluster telemetry patterns.
  • Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
  • Good problem-solving skills and ability to work in a collaborative team environment.
  • Strong communication and documentation skills.
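The streaming responsibilities above can be sketched in plain Python. This is a hypothetical illustration of the validate-and-enrich step a Kafka consumer service might apply before forwarding telemetry to Splunk; the field names and tenant mapping are invented for the example.

```python
from datetime import datetime, timezone

# Hypothetical sketch of the validate/enrich step a Kafka consumer service
# might apply to an OpenShift telemetry event before forwarding it to Splunk.
# Field names and the tenant mapping are invented for illustration.

REQUIRED_FIELDS = {"cluster", "namespace", "metric", "value", "timestamp"}

def validate_and_enrich(event: dict, tenant_map: dict) -> dict:
    """Reject malformed events; add tenant routing and a normalised timestamp."""
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event missing required fields: {sorted(missing)}")
    enriched = dict(event)
    # Multi-tenant routing: map the namespace to an owning tenant.
    enriched["tenant"] = tenant_map.get(event["namespace"], "shared")
    # Normalise the epoch timestamp to ISO-8601 UTC for downstream indexing.
    ts = datetime.fromtimestamp(event["timestamp"], tz=timezone.utc)
    enriched["timestamp"] = ts.isoformat()
    return enriched

event = {"cluster": "ocp-prod", "namespace": "payments", "metric": "cpu_usage",
         "value": 0.72, "timestamp": 1700000000}
out = validate_and_enrich(event, {"payments": "team-payments"})
print(out["tenant"], out["timestamp"])
```

In a real service this function would sit inside the consumer poll loop, with invalid events routed to a dead-letter topic for replay.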

Geospatial Software Engineer
Bright Purple Resourcing
Bath
Fully remote
Mid - Senior
£420/day
RECENTLY POSTED

Geospatial Data Engineer - Contract
Remote UK
£420 per day
12 months, running through 2026 into 2027
Deemed Outside IR35 (pending QDOS assessment)
This Contract Geospatial Data role is a fantastic opportunity to work on cutting edge data problems at a leading environmental risk consultancy. The position has arisen due to the success and growth of Bright Purple’s impressive established client. They are a leading player in their field with a powerful and highly regarded platform built using the latest technologies.
You will be working in a team of software and data engineers supporting data workflow & orchestration, within an AWS environment and must have experience of RASTER data sets.
In this role, you will be:

  • Developing ETL pipelines - Raster, Vector, Tabular data
  • Configuring CI/CD pipelines
  • Managing data storage within an AWS environment
  • Improving data automation, workflow and efficiency
  • Developing Python based ML pipelines

Key skills for this role include:

  • Good knowledge of Python programming
  • Experience in cloud computing (ideally AWS)
  • Strong experience across industries in both Geospatial and non-Geospatial domains
  • Experience with Machine Learning (scikit-learn, TensorFlow, Metaflow, MLOps)

Preferred Experience:

  • Experience with frameworks like Metaflow, Prefect, etc.
  • Experience with geospatial libraries (e.g. raster tooling, GeoPandas) and vector databases

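For a flavour of the raster work described above, here is a minimal NumPy sketch of one common ETL step: masking a nodata value and computing per-zone statistics. In production this logic would typically sit behind rasterio/GDAL reads; the nodata sentinel and array shapes here are assumptions.

```python
import numpy as np

# Illustrative raster ETL step (not the client's codebase): mask a nodata
# sentinel and compute a per-zone mean. The NODATA value is an assumption.

NODATA = -9999.0

def zonal_mean(raster: np.ndarray, zones: np.ndarray) -> dict:
    """Mean raster value per zone id, ignoring nodata cells."""
    valid = raster != NODATA
    return {
        int(z): float(raster[(zones == z) & valid].mean())
        for z in np.unique(zones[valid])
    }

raster = np.array([[1.0, 2.0], [NODATA, 4.0]])
zones = np.array([[1, 1], [2, 2]])
stats = zonal_mean(raster, zones)
print(stats)  # {1: 1.5, 2: 4.0}
```

The same pattern scales to windowed reads over large rasters, which keeps memory flat when pipelines run in AWS batch jobs.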
Bright Purple is an equal opportunities employer: we are proud to work with clients who share our values of diversity and inclusion in our industry.

Lead Data Engineer
Stealth IT Consulting Limited
Sheffield
Hybrid
Senior
£400/day - £445/day

Location: Sheffield, hybrid (60% office / 40% remote)
Contract: 6 Month Contract (Extension possible)
Rate: £400+ per day inside IR35

Role Overview

We are seeking an experienced Lead Data Engineer to design, develop, and optimise enterprise-scale data platforms for large, regulated organisations, ideally in banking, financial services, or other regulated sectors. This role requires hands-on technical expertise, leadership, and a consulting mindset to deliver scalable, resilient data solutions while promoting best practices and operational excellence.

Key Responsibilities

  • Lead the design, development, and optimisation of enterprise data engineering platforms
  • Build and maintain robust ETL/ELT pipelines integrating large, complex datasets
  • Work with structured, semi-structured, and unstructured data across SQL and NoSQL technologies
  • Develop solutions using Hadoop, Spark, and Splunk in large-scale environments
  • Write maintainable Python code, applying object-oriented and functional programming principles
  • Implement and maintain CI/CD pipelines, automated testing, and version control
  • Collaborate with BI, Analytics, and downstream teams to support reporting and insights
  • Pair program and mentor other engineers to promote knowledge sharing and code quality
  • Define and maintain technical test plans including unit and integration tests
  • Promote SRE principles to ensure service resilience, sustainability, and recoverability

Essential Skills & Experience

  • Proven experience as a Lead or Senior Data Engineer in enterprise-scale environments
  • Hands-on expertise with Hadoop, Spark, and Splunk
  • Advanced Python development skills
  • Experience designing and optimising high-performance data pipelines
  • Strong understanding of CI/CD, source control, and automated testing
  • Analytical, problem-solving, and leadership skills
  • Experience working in regulated, enterprise environments (e.g., banking, fintech, government)
  • Agile delivery experience (Scrum/Kanban)

Consulting & Soft Skills

  • Ability to mentor and uplift team performance
  • Strong communication and stakeholder engagement skills
  • Collaborative, delivery-focused mindset with high accountability
  • Knowledge of control, compliance, and regulatory requirements
  • Up-to-date awareness of modern tools, cybersecurity, and data privacy regulations
  • Champions innovation, advanced technologies, and best practices

If this role aligns with your skills and experience, we’d love to hear from you. Apply today to be considered.

Software Simulation Engineer
Quest Global Engineering Limited
Redditch
In office
Mid - Senior
Private salary

Redditch, UK

12 months +

Work Experience

  • Bachelor's or Master's degree in Computer Science / Industrial Engineering
  • Experience: Minimum 3-7 years of relevant experience in supply chain simulation, industrial engineering, or discrete event simulation.
  • Simulation Software/Tools: Proficiency in specialized software such as FlexSim, AnyLogic, ANSYS (FEA/CFD), MATLAB/Simulink (dynamic systems), or COMSOL.
  • Programming: Strong scripting skills (Python, R) for data analysis and automating simulation tasks.
  • Programming Languages: Proficiency in Python (scientific computing), C++ (high-performance tasks), and MATLAB/R for data analysis and mathematical modeling.
  • Domain Knowledge: Solid understanding of warehousing automation technologies (AGVs, sorters, AS/RS).
  • WMS Knowledge: Familiarity with WMS systems.
  • Simulation Tools: Experience with industry-specific software such as ANSYS (FEA/CFD), MATLAB/Simulink (dynamic systems), COMSOL, or AnyLogic.
  • Experience with CAD tools (e.g., AutoCAD) for layout creation.

Job Requirements

  • The Simulation Engineer will develop, validate, and analyze discrete-event simulation models of warehouse facilities, incorporating automation, conveyors, and WMS software. The goal is to identify bottlenecks, improve efficiency, and validate operational scenarios before implementation.
  • Modeling & Simulation: Design and build 3D simulation models of distribution centers and warehouse logistics using software like FlexSim, Simio, or AnyLogic.
  • Data Analysis & Validation: Analyze operational data (e.g., order profiles, inventory levels, stock audits) to validate simulation models, ensuring they accurately represent real-world operations.
  • Process Optimization: Conduct experiments to identify bottlenecks, test “what-if” scenarios, and optimize resource requirements (staffing, automated equipment).
  • WMS Integration: Simulate interactions between physical equipment (AGVs, ASRS, conveyors) and warehouse software systems (e.g., SAP EWM, Manhattan Associates).
  • Documentation & Reporting: Create detailed technical reports and visualizations (Tableau, R) to present findings to stakeholders and support data-driven decision-making.
  • Cross-Functional Collaboration: Collaborate with engineering and operations teams to integrate simulation results into final warehouse designs.
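The discrete-event modelling described above can be illustrated with a toy single-station simulation in Python. This is not FlexSim/AnyLogic output, just a deterministic sketch (arrival and service times invented) showing how queueing delay, a simple bottleneck signal, falls out of the event timeline.

```python
# Toy deterministic sketch of a discrete-event idea: jobs arrive at a single
# FIFO packing station and wait if it is busy. Real models (FlexSim, AnyLogic)
# would draw arrival and service times from distributions.

def simulate(arrivals, service_time):
    """Return the completion time of each job at a single FIFO server."""
    done, t_free = [], 0
    for t in sorted(arrivals):
        start = max(t, t_free)        # wait if the station is still busy
        t_free = start + service_time
        done.append(t_free)
    return done

arrivals = [0, 1, 2, 10]
done = simulate(arrivals, service_time=3)
# Queueing delay per job (completion - arrival - service) flags bottlenecks.
waits = [d - a - 3 for d, a in zip(done, sorted(arrivals))]
avg_wait = sum(waits) / len(waits)
print(done, avg_wait)  # [3, 6, 9, 13] 1.5
```

"What-if" experiments then amount to re-running the model with different staffing or equipment parameters and comparing the resulting wait statistics.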

Data Platform Engineer
Morson Edge
Preston
In office
Mid - Senior
£64/hour - £74/hour

Data Platform Engineer – Warton – SC Cleared – 12 Month Contract

About Your Role:

As a Data Platform Engineer in a highly regulated environment, you will be responsible for designing, building, and maintaining secure and scalable data infrastructure that supports both cloud and on premises platforms.
You will play a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams.
A strong command of Apache NiFi is essential for this role.
You will design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation, and delivery.
You should be adept at identifying and resolving issues within NiFi flows, managing performance bottlenecks, and implementing robust error handling strategies.
You’ll work closely with cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms.

About You:

As an experienced Data Platform Engineer, your skills and experience may include:

• Strong experience of Apache NiFi
• Experience designing and optimizing data flows for batch, real-time streaming, and event-driven architectures.
• Ability to identify and resolve flow issues, optimize performance, and implement error-handling strategies.
• Optional scripting skills for creating custom NiFi processors.
• Knowledge of data modelling, replication, and query optimization.
• Hands-on experience with SQL and NoSQL databases is desirable.
• Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial.
• Data Platform Management
• Comfortable operating in hybrid environments (cloud and on-prem).
• Experience integrating diverse data sources and systems
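NiFi flows are configured visually, but the record-level logic of a custom or scripted processor often looks like the following sketch: parse a JSON-lines FlowFile payload, count malformed records (which a real flow would route to a failure relationship), and tag good records with their source system. The payload format and field names are invented for illustration.

```python
import json

# Hypothetical record-level logic for a scripted NiFi step. In a real flow,
# malformed records would be routed to a "failure" relationship rather than
# just counted.

def transform_flowfile(payload: bytes, source: str):
    records, failures = [], 0
    for line in payload.decode("utf-8").splitlines():
        if not line.strip():
            continue
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            failures += 1
            continue
        rec["source_system"] = source  # provenance tag for downstream systems
        records.append(rec)
    return records, failures

payload = b'{"id": 1}\nnot-json\n{"id": 2}'
records, failures = transform_flowfile(payload, "mes")
print(len(records), failures)  # 2 1
```

Keeping the transform a pure function of the payload makes this kind of logic easy to unit-test outside NiFi, which helps when debugging flow issues.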

Machine Learning Engineer (Python / MLE)
Sanderson Recruitment
London
Fully remote
Mid - Senior
£650/day - £750/day

Machine Learning Engineer (Python / MLE )

6 Month Contract

£650 - £750

Remote

Umbrella

Urgent Start

We are looking for a number of MLE / Machine Learning Engineers for a critical 6 month contract with a household name. This role is essential for maintaining the stability and performance of their core data and Machine Learning systems.

If your background involves diving into complex production codebases and you possess excellent problem-solving skills, this is the perfect opportunity. We need candidates ready to start by mid-to-late November.

This is a deeply technical engineering role focused less on new feature development and more on reliability and fixes.

Key Skills:

Investigating and debugging complex data flow and Machine Learning issues within a live, high impact production environment.

Extensive experience with Python, NumPy, and Pandas is required for this role.

You must demonstrate a deep commercial background in the following areas:

Extensive Python: Very strong, production-level Python coding and debugging skills.

Production Environment: Proven experience working directly with and troubleshooting issues in live production codebases (not just isolated development).

Cloud Experience: Solid experience with any major public cloud provider (GCP, AWS, or Azure).

Experience with BigQuery would be good.

Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting.

Direct experience with Google Cloud Platform, BigQuery, and associated tooling.

Experience with workflow tools like Airflow or Kubeflow.

Familiarity with dbt (Data Build Tool).
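As a taste of the debugging work described above, here is a small, purely illustrative pandas/NumPy sketch that audits each step of a transformation pipeline for newly introduced NaNs, often the fastest first move when a live ML pipeline starts emitting bad values. The step names and transforms are invented.

```python
import numpy as np
import pandas as pd

# Illustrative debugging aid (names invented): apply a pipeline's transform
# steps one at a time and report total NaN counts after each, to localise
# the stage where bad values first appear.

def audit_nans(df: pd.DataFrame, steps):
    """Run named transform steps in order; report total NaN count after each."""
    report = {}
    for name, fn in steps:
        df = fn(df)
        report[name] = int(df.isna().sum().sum())
    return df, report

df = pd.DataFrame({"x": [1.0, 2.0, 4.0]})
steps = [
    ("log", lambda d: d.assign(logx=np.log(d["x"]))),
    ("ratio", lambda d: d.assign(r=d["x"] / d["x"].shift())),  # shift -> NaN
]
out, report = audit_nans(df, steps)
print(report)  # {'log': 0, 'ratio': 1}
```

The same idea extends to row counts, dtypes, or value ranges, whatever invariant the production incident suggests has been violated.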

Please send your CV for more information on these roles.

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.


ODI Designer Developer
Stackstudio Digital Ltd.
Milton Keynes
Hybrid
Senior
£500/day - £550/day

Role Details

  • Role/Job Title: ODI Designer Developer
  • Work Location: Milton Keynes
  • Hybrid Requirement: 2-3 days in office

The Role

Santander is undergoing a major Finance transformation using Oracle ERP. This role requires experienced ODI professionals who can work independently on complex ODI tasks.

Your Responsibilities

  • Develop, test, and deploy ETL processes using Oracle Data Integrator (ODI).
  • Collaborate with data architects to design data integration solutions.
  • Analyze and translate data requirements into technical specifications.
  • Monitor and troubleshoot data integration workflows for issues.
  • Optimize ETL processes to ensure high performance and data integrity.
  • Document ETL processes and maintain accurate records of data transformations.
  • Work with stakeholders to understand data requirements and deliver solutions.
  • Participate in code reviews and provide constructive feedback to peers.
  • Unix scripting and strong PL/SQL skills.

Your Profile

Essential Skills / Knowledge / Experience

  • Good communication skills.
  • At least 7-10 years of experience working in ODI.
  • Strong knowledge of Oracle ODI, with at least two recent project experiences.
Intern - Business Intelligence & Performance Reporting - (Fixed Term) - GLA14952
Glasgow
UK
In office
Graduate
Glasgow Living Wage
TECH-AGNOSTIC ROLE
Job Description

Glasgow City Council’s Summer Internship Programme will be available from Monday 8 June 2026 – Friday 28 August 2026, inclusive.

Applicants must be available for the full duration of the placement.

The intern will work 35 hours per week, and the rate of pay will be the Glasgow Living Wage.

Interns will work for 12 weeks, during which time they will accrue 6 days' leave; payment for this leave is included in their salary, so it must be taken during the 12-week placement.

Applicants must be available between week commencing 23 March 2026 and Thursday 2 April 2026 for interview.

The intern will support the development of enhanced Business Intelligence (BI) reporting to strengthen performance monitoring, governance and audit assurance within the Directorate.

Key Responsibilities
• Review and analyse existing BI dashboards and underlying data sources across Education Services
• Work with officers to define and agree key performance indicators (KPIs) aligned to Directorate priority committee reporting and Internal Audit requirements
• Design and develop a consolidated BI dashboard or KPI-based performance report
• Test outputs with key stakeholders, incorporating feedback and ensuring data accuracy and usability
• Produce clear documentation and support handover to ensure outputs can be maintained and refreshed beyond the project period

Eligibility criteria
• Must live within the Glasgow City Council boundary
• Have the right to live and work in the UK
• Be in the year of study specified in the advert

For more information, please see the attached Recruitment Outline and Person Specification, or visit our website https://www.glasgow.gov.uk/summerinternship.

Application Packs

We want everyone to be able to apply. If you need the Application Pack in another format, like Braille, large print, or another language, please call us on 0141 287 1054.

If we need to post it to you, we’ll send it by second-class mail within three working days. Please allow enough time to complete and return your application before the closing date. If you think you might need more time because of accessibility needs, please get in touch and we’ll be happy to help.

There are also a number of Accessibility Tools compatible with the myjobscotland website which may assist you with your application. More information on these can be found at https://myjobscotland.gov.uk/accessibility-statement.

Further Information

Please note that Glasgow City Council is currently completing a Job Evaluation exercise and introducing a new pay and grading structure which may impact on current salaries quoted in job adverts; see Working for Us\Job Evaluation.

For further information about working for us please refer to our website GCC HR Policies

Data Scientist (NLP & LLM Specialist)
Randstad Technologies Recruitment
London
Fully remote
Mid - Senior
£400/day - £565/day

Remote - UK
6-month contract with potential extension
Day rate: £427.68 - £565 per day, inside IR35

Data Scientist (NLP & LLM Specialist)

Are you an expert in Natural Language Processing who thrives on building scalable, real-world AI solutions? We are seeking a hands-on Data Scientist to join a premier global credit ratings and financial information firm. You will be a key player in launching a brand-new, from-scratch analytics platform designed for elite institutional clients including corporate banks and asset managers.

The Opportunity

In this role, you will go beyond conventional boundaries to design, build, and deploy quantitative models that power advanced insights. You will collaborate with a cross-domain team of economists, political scientists, and developers to transform proprietary risk data into actionable strategic assets.

Your Impact

  • Model Innovation: Design and optimize risk models for analytics and generative AI applications using proprietary NLP data generation processes.
  • Pipeline Development: Develop and maintain robust ML and data pipelines for experimentation and deployment.
  • Insight Extraction: Prototype and test new approaches for extracting insights from structured and unstructured data.
  • Technical Translation: Explain ML/NLP model outputs and methodologies to non-technical stakeholders to drive strategic decisions.

Your Experience

  • Experience Level: 5-7+ years (more experience is welcomed).
  • Core Technicals: High expertise in Python and Machine Learning (ML).
  • NLP Expertise: 3-5 years of experience in Natural Language Processing.
  • AI Knowledge: Familiarity with LangChain and LlamaIndex. The role involves using Large Language Models (LLMs) to build data models rather than building LLMs from scratch.
  • Deployment: Must understand the deployment process and CI/CD practices to troubleshoot, though a dedicated engineering team handles the heavy lifting.
  • Industry Knowledge: Experience with Risk Modeling or financial services is preferred.

This is an urgent vacancy where the hiring manager is shortlisting for interview immediately. Please apply with a copy of your CV or send it to khushboo. pandey @ randstad. Co. uk. Randstad Technologies is acting as an Employment Business in relation to this vacancy.
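As a concrete (and deliberately simplified) illustration of the LLM-pipeline side of this role: overlap chunking of documents is a routine preprocessing step before retrieval or prompt construction in LangChain/LlamaIndex-style systems. The sketch below is plain Python with invented parameters; production systems usually chunk by tokens rather than words.

```python
# Plain-Python sketch of overlap chunking, a routine preprocessing step before
# retrieval/prompting in LangChain- or LlamaIndex-style pipelines. Chunk size
# and overlap are invented; real systems typically chunk by tokens.

def chunk_words(text: str, size: int, overlap: int) -> list:
    """Split text into word chunks of `size` sharing `overlap` words."""
    assert 0 <= overlap < size   # a non-positive step would loop forever
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

chunks = chunk_words("a b c d e f g", size=4, overlap=2)
print(chunks)  # ['a b c d', 'c d e f', 'e f g']
```

The overlap preserves context across chunk boundaries, which matters when a retrieved chunk is fed to an LLM without its neighbours.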

Data Engineer | Outside IR35 | £400 - £500 | 6 months | Hybrid Nottingham
Opus Recruitment Solutions
Nottingham
Hybrid
Mid - Senior
£400/day - £500/day

We’re looking for a highly skilled Data Engineer to join a growing data team supporting a large-scale modern data platform project. Working 3 days a week onsite on the outskirts of Nottingham, you'll be a key contributor in evolving the organisation’s data capability, focusing on best-practice engineering, clean architecture, and high-value BI delivery.

Key Responsibilities

  • Design, build and optimise ETL/ELT pipelines using Azure Data Factory and Databricks.
  • Develop scalable data models and transformations using SQL and Python.
  • Work hands-on with Databricks (Lakehouse, Delta tables, notebooks, workflows).
  • Deliver high-quality dashboards and reporting solutions using Power BI.
  • Implement best practices for data quality, governance, lineage and automation.
  • Collaborate with cross-functional teams including analysts, product owners and business stakeholders.
  • Support performance tuning, cost optimisation and reliability improvements across the data estate.
  • Document pipelines, models and processes to ensure smooth knowledge transfer.

Technical Skills Required

  • Databricks – notebooks, Delta Lake, Spark (PySpark desirable)
  • Power BI – data modelling, DAX, dashboard/report development
  • SQL – advanced querying, performance optimisation, data modelling
  • Python – scripting, transformation logic, automation
  • Azure Data Factory – pipelines, triggers, mapping data flows
  • Understanding of data warehousing / lakehouse principles
  • Experience working in cloud-based data ecosystems (Azure)
  • Strong appreciation of data quality, governance and best practices

If this is a role that suits your skillset, and you can work onsite 3 days per week and are immediately available, then please apply for the job advert directly or reach out to myself at (url removed).
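Much of the Delta-table work mentioned above centres on MERGE/upsert semantics. The following pure-Python sketch (table shape and column names invented) shows the behaviour a Databricks MERGE statement provides: rows with a matching business key are updated, new keys are inserted.

```python
# Pure-Python sketch of the upsert ("MERGE") semantics Delta tables provide
# in Databricks: update on key match, insert otherwise. Names are invented.

def merge_upsert(target: dict, updates: list, key: str) -> dict:
    """Upsert update rows into target, a dict keyed by the business key."""
    merged = dict(target)
    for row in updates:
        merged[row[key]] = row   # update on match, insert otherwise
    return merged

target = {1: {"id": 1, "status": "open"}, 2: {"id": 2, "status": "open"}}
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
result = merge_upsert(target, updates, "id")
print(sorted(result), result[2]["status"])  # [1, 2, 3] closed
```

In Spark SQL the equivalent is `MERGE INTO target USING updates ON target.id = updates.id WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`, executed transactionally by Delta Lake.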

CGEMJP00330718 Lead Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Sheffield
Hybrid
Senior
£430/day

Role Title: Lead Data Engineer
Location: Sheffield/hybrid (3 days on site)
Duration: 9 months
Rate: £430 per day inside IR35

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.

Experience required:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.

Soft Skills (Consultant):

  • Demonstrated ability and enthusiasm for enhancing team performance.
  • Strong active listening and effective communication skills.
  • Self-mastery, with a focus on positive mindsets and professional behaviours.
  • Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
  • Engages with industry groups and external vendors to represent and advance HSBC's interests and influence.
  • Takes accountability for ensuring control and compliance throughout the engineering process.
  • Champions innovation and the adoption of advanced technologies and best practices within the domain.

If you are interested in this role or wish to apply, please feel free to submit your CV.

Repairs Data Analyst
Sellick Partnership
Manchester
Hybrid
Junior - Mid
£25/hour - £28/hour

Repairs Data Analyst - Hybrid Role

Location: Manchester
Contract: Up to 3 months
Pay: £25 - £27 per hour (Umbrella)

About the Role

We’re looking for a Repairs Planning Officer to join our team on a hybrid basis. You will be responsible for providing analytical insight across data linked to a key materials project, supporting informed decision-making with a particular focus on performance monitoring, process compliance and the tracking of materials purchasing.

The Repairs Data Analyst responsibilities include:

  • Ensuring that data collected and managed by the Distribution Centre team is accurate, reliable, up to date, and sufficient to support data-driven decision making within the department and wider business.
  • Collating, organising, and analysing data to provide operational and business insight.
  • Identifying trends across datasets to inform investigations, proactive surveys, or planned programmes of work.
  • Producing analysis and reports for the department and wider business, aligned to the project scope.
  • Processing, analysing, and interpreting data related to Great Places’ performance and operations.
  • Creating visualisations and reports to communicate findings effectively to key stakeholders.
  • Providing accurate, timely, and relevant business-critical performance information.

The successful Repairs Data Analyst will have:

  • Proficiency in the full Microsoft Office suite, with advanced skills in Microsoft Excel
  • Experience working with large datasets, analysing and comparing information, and communicating results effectively
  • Experience of project management
  • Experience in SQL, Power BI, and data warehouse reporting and extraction (advantageous)

Please contact Josh at the Derby Office for more information.

Sellick Partnership is proud to be an inclusive and accessible recruitment business and we support applications from candidates of all backgrounds and circumstances. Please note, our advertisements use years’ experience, hourly rates, and salary levels purely as a guide and we assess applications based on the experience and skills evidenced on the CV. For information on how your personal details may be used by Sellick Partnership, please review our data processing notice on our website.

Snowflake Data Engineer
Square One Resources
London
Hybrid
Mid - Senior
£550/day - £600/day

Job Title: Snowflake Data Engineer
Location: London (2 days on-site per week)
Salary/Rate: £550 - £600 per day inside IR35
Start Date: March
Job Type: Initial 3-6 month contract

Company Introduction
We have an exciting opportunity now available with one of our sector-leading consultancy clients! They are currently looking for a skilled Snowflake Data Engineer to help on their cloud migration project.

Job Responsibilities/Objectives
You will be responsible for designing and building scalable data pipelines, Data Vault models/Dimension Model, and Snowflake/dbt workloads for cloud migration projects.

  • Implement Data Vault 2.0 (Hubs, Links, Satellites) / Dimensional Model on Snowflake.
  • Build ELT pipelines using Snowflake, dbt, Python/PySpark.
  • Develop ingestion from APIs, databases, streams.
  • Optimize Snowflake warehouses, cost, and performance.
  • Collaborate with architects, analysts, and DevOps.
  • Maintain documentation, lineage, governance standards.
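One small, concrete piece of the Data Vault 2.0 work above: hub hash keys are commonly derived by hashing a normalised business key, for example in dbt models targeting Snowflake. The MD5 choice and trim/upper-case rules below are illustrative; real projects fix their own convention.

```python
import hashlib

# Illustrative Data Vault 2.0 convention: a hub hash key derived from the
# normalised business key. MD5 and the trim/upper-case rules are one common
# choice; each project standardises its own.

def hub_hash_key(*business_key_parts: str) -> str:
    """Hash key from one or more business-key parts, '||'-delimited."""
    normalised = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

hk = hub_hash_key(" cust-001 ")
# Normalisation makes the key stable across source-system formatting quirks.
print(hk == hub_hash_key("CUST-001"))  # True
```

The same deterministic key, computed identically in every load, is what lets hubs, links, and satellites join without lookups at load time.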

Required Skills/Experience
The ideal candidate will have the following:

  • Strong SQL; Snowflake ELT; dbt experience.
  • Python/PySpark, ETL/ELT design.
  • Data Vault 2.0 or dimensional modeling.
  • AWS services (S3, Glue, Lambda, Redshift) or GCP equivalents.
  • Experience with CI/CD for data pipelines.

Good to have skills
Although not essential, the following skills are desired by the client:

  • Kafka/Kinesis, Airflow, CodePipeline.
  • BI tools (Power BI/Tableau).
  • Docker/OpenShift; metadata-driven pipelines.

  • 3-8+ years of Data Engineering experience.
  • Cloud data engineering and hands-on Snowflake/dbt exposure.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.

AI Developer
Experis
London
Hybrid
Mid - Senior
£585/day

Role: AI Developer

Location: London - up to 3 days per week on-site

Duration: 3 Months

Day rate: £540 - £585 Umbrella Only

Minimum of 5 years UK residency required

Role Description:

We’re looking for an experienced AI developer with the skills below. They’ll be building out our client’s AI needs and wants using a Microsoft Azure and Copilot stack.

Essential skills and experience

  • Design and develop Microsoft Azure AI solutions using services such as AI Foundry, Azure OpenAI services, and Microsoft Copilot Studio.
  • Apply governance frameworks for AI/ML models.
  • Python knowledge to automate processes
  • Large datasets experience, including data preprocessing, feature engineering, and model evaluation.
  • Agile development of AI solution from concept to deployment to continuous improvement.
  • Create and maintain technical documentation
  • Communicate complex concepts to all stakeholders.
  • At least 3 years of experience in AI solution engineering.
  • Large Language Models experience including prompt engineering and RAG implementations.
  • Expert data analytics, MLOps practices and API development.
  • Desirable knowledge in Docker and Kubernetes
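The RAG implementations mentioned above hinge on a retrieval step. The sketch below is a deliberately minimal stand-in: bag-of-words cosine similarity instead of a real embedding model, with invented passages, just to show the shape of "rank passages, pick the best one for the prompt".

```python
from collections import Counter
from math import sqrt

# Minimal stand-in for the retrieval step of a RAG pipeline: rank passages by
# bag-of-words cosine similarity. Production systems use embedding models;
# the passages here are invented.

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)       # Counter returns 0 for missing terms
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, passages: list) -> str:
    """Return the passage most similar to the query (to feed into the prompt)."""
    q = Counter(query.lower().split())
    return max(passages, key=lambda p: cosine(q, Counter(p.lower().split())))

passages = ["Azure AI Foundry hosts models", "Copilot Studio builds agents"]
best = retrieve("which service hosts models", passages)
print(best)
```

In an Azure stack, the Counter vectors would be replaced by embeddings (e.g. from an Azure OpenAI embedding deployment) and the retrieved passage stuffed into the LLM prompt.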

PLM Data Analyst
Computer Futures - London & S.E(Permanent and Contract)
Not Specified
In office
Mid - Senior
£500/day - £800/day
TECH-AGNOSTIC ROLE

PLM Data Analyst Opportunity

Are you an experienced PLM Data Analyst with a background in aerospace and defense? Join our client’s team on a contract basis to participate in advanced projects at the forefront of the industry. This exciting opportunity involves working with innovative tools and technologies, helping to shape the future by leveraging your expertise in PLM systems.

Role Overview

As a PLM Data Analyst, you will play a key role in analysing existing CATIA V5 PLM data, such as CAD, metadata, and structures. You’ll support data mapping activities from CATIA V5 to the 3DEXPERIENCE (3DX) data model and contribute to the seamless integration of PLM object models. This role is especially suited to someone with a strong understanding of parts, products, documents, and BOMs within the ENOVIA ecosystem.

Key Skills and Responsibilities

  • CATIA V5 and 3DEXPERIENCE (3DX) expertise: Proficient in analysing and working with PLM data models to enhance system performance.
  • PLM object models: In-depth knowledge of parts, products, documents, and BOMs.
  • Data mapping: Supporting integration and alignment activities between CATIA V5 and the 3DX data model.
  • ENOVIA data handling: Expertise in managing and manipulating ENOVIA-related data structures.

Join a dynamic sector and contribute to a leading client’s innovative projects. If you’re looking for a challenging and rewarding role, apply today to bring your skills to our client’s esteemed team.

Please visit our website to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.

To find out more about Computer Futures please visit our website

Computer Futures, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy | Registered office | 8 Bishopsgate, London, EC2N 4BQ, United Kingdom | Partnership Number | OC387148 England and Wales

CGEMJP Lead Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Sheffield
Hybrid
Senior
£430/day

Role Title: Lead Data Engineer

Location: Sheffield/hybrid (3 days on site)

Duration: 9 months

Rate: £430 per day inside IR35

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.

Experience required:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.
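The automated-testing expectation above (unit and integration tests around pipeline code) is easiest to meet when transforms are pure functions. A hedged sketch with hypothetical field names, in plain Python rather than Spark:

```python
def dedupe_latest(rows):
    """Keep the most recent record per key — a typical ETL transform
    that is easy to unit-test because it is a pure function."""
    latest = {}
    for row in rows:
        key = row["id"]
        if key not in latest or row["updated"] > latest[key]["updated"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["id"])

# Sample input with a duplicate key; only the newest record survives.
rows = [
    {"id": 1, "updated": "2024-01-01", "status": "open"},
    {"id": 1, "updated": "2024-02-01", "status": "closed"},
    {"id": 2, "updated": "2024-01-15", "status": "open"},
]
result = dedupe_latest(rows)
```

Because the transform takes plain rows in and returns plain rows out, the same assertion can run locally, in CI, and against an integration-test dataset.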

Soft Skills (Consultant):

  • Demonstrated ability and enthusiasm for enhancing team performance.
  • Strong active listening and effective communication skills.
  • Self-mastery, with a focus on positive mindsets and professional behaviours.
  • Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
  • Engages with industry groups and external vendors to represent and advance HSBC’s interests and influence.
  • Takes accountability for ensuring control and compliance throughout the engineering process.
  • Champions innovation and the adoption of advanced technologies and best practices within the domain.

If you are interested in this role or wish to apply, please feel free to submit your CV.

Senior Data Scientist
RCRTR
Birmingham
Remote or hybrid
Senior
£400/day - £450/day

£400 per day - inside IR35

Are you an expert in Natural Language Processing who thrives on building scalable, real-world AI solutions? We are seeking a hands-on Data Scientist to join a premier global credit ratings and financial information firm. You will be a key player in launching a brand-new, from-scratch analytics platform designed for elite institutional clients including corporate banks and asset managers.

The Opportunity

In this role, you will go beyond conventional boundaries to design, build, and deploy quantitative models that power advanced insights. You will collaborate with a cross-domain team of economists, political scientists, and developers to transform proprietary risk data into actionable strategic assets.

Your Impact

  • Model Innovation: Design and optimize risk models for analytics and generative AI applications using proprietary NLP data generation processes.
  • Pipeline Development: Develop and maintain robust ML and data pipelines for experimentation and deployment.
  • Insight Extraction: Prototype and test new approaches for extracting insights from structured and unstructured data.
  • Technical Translation: Explain ML/NLP model outputs and methodologies to non-technical stakeholders to drive strategic decisions.

Your Experience

  • Experience Level: 5-7+ years (More experience is welcomed).
  • Core Technicals: High expertise in Python and Machine Learning (ML).
  • NLP Expertise: 3-5 years of experience in Natural Language Processing.
  • AI Knowledge: Familiarity with LangChain and LlamaIndex. The role involves using Large Language Models (LLMs) to build data models rather than building LLMs from scratch.
  • Deployment: Must understand the deployment process and CI/CD practices to troubleshoot, though a dedicated engineering team handles the heavy lifting.
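Using LLMs to build data models (rather than building LLMs from scratch) often reduces to asking the model for structured output and validating it into a typed record. A sketch under stated assumptions — `call_llm`, `RiskSignal`, and the field names are hypothetical stand-ins, not the client's actual API:

```python
import json
from dataclasses import dataclass

@dataclass
class RiskSignal:
    """Target record the LLM output must conform to (illustrative schema)."""
    entity: str
    sentiment: str   # "positive" | "neutral" | "negative"
    confidence: float

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM client call.
    Returns canned JSON so the parsing path can be exercised offline."""
    return '{"entity": "Acme Corp", "sentiment": "negative", "confidence": 0.87}'

def extract_signal(text: str) -> RiskSignal:
    """Ask the model for structured JSON, then validate it into a typed record."""
    raw = call_llm(f"Extract a risk signal as JSON from: {text}")
    data = json.loads(raw)
    if data["sentiment"] not in {"positive", "neutral", "negative"}:
        raise ValueError(f"unexpected sentiment: {data['sentiment']}")
    return RiskSignal(entity=data["entity"],
                      sentiment=data["sentiment"],
                      confidence=float(data["confidence"]))

signal = extract_signal("Acme Corp faces a downgrade after missed earnings.")
```

The validation step is the point: downstream pipelines consume typed, schema-checked records, while the LLM itself stays a replaceable component behind `call_llm`.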
Data Engineer
Damia Group Ltd
Shropshire
Hybrid
Junior - Mid
£393/day

Data Engineer - Telford (2 days onsite) - £393 per day inside IR35 - 6 months

We are looking for a Data Engineer who is ideally SC Cleared, or eligible for clearance. This developer role will primarily work on Talend and Oracle RDS systems, within our existing Talend framework and patterns. Experience of ETL tooling will be needed, preferably Talend, but Pentaho/Informatica experience will be transferable. Experience working in Oracle RDS databases will also be required.

Must have:

  • Data ETL product experience - Talend preferred
  • Oracle RDS

Nice to have:

  • SQL
  • AWS
  • GenAI

*Damia Group Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept our Data Protection Policy which can be found on our website.*

*Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job.*

*The role may require the successful candidate to undergo and be eligible for UK Security Vetting. Clearance sponsorship will be provided where required. Due to the nature of the work, candidates should meet the relevant residency requirements. If applicable, Reserved Post nationality restrictions will be confirmed by the client. Damia is committed to inclusive recruitment and welcomes applicants from all backgrounds.*

*Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.*

BI Developer
Gleeson Recruitment Ltd
Wolverhampton
Hybrid
Mid
£35

BI Developer (Power BI | Azure | SQL)

Hybrid - Wolves based office 3 days per week

Are you a data-driven problem solver who loves turning complex data into clear insights? We’re looking for a skilled BI Developer to join our client’s growing team.

What you’ll do:

  • Build impactful dashboards and reports in Power BI
  • Develop and optimise data solutions using Azure
  • Write and maintain efficient SQL queries and data models
  • Work closely with stakeholders to translate business needs into actionable insights

What we’re looking for:

  • Strong experience with Power BI, Azure, and SQL
  • Solid understanding of data modelling and ETL processes
  • Ability to communicate insights clearly to non-technical audiences
  • A proactive, solutions-focused mindset

BI Developer - apply ASAP if interested. GleeIT

At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a disability confident employer.

By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.

Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Shropshire
Hybrid
Mid - Senior
£430/day

Job Title: Data Engineer (AEOI)

Rate: £430 per day inside IR35

Duration: 6 months

Location: Telford/hybrid (2 days onsite)

SC security clearance is required for this role.

We're hiring an ETL Developer to support a major government AEOI programme covering Pillar R7, ETR Exchange, NTJ Exchange and CRS Outbound Exchange. Due to growing demand, new teams are being stood up and existing teams expanded to deliver critical data exchange services.

Job Description: Project - AEOI Projects - Pillar R7/ETR Exchange/NTJ Exchange/CRS Outbound Exchange. Demand in the AEOI programme space is expected to increase, necessitating the stand-up of an additional team and the expansion of existing teams to support it. This developer role will primarily work on Talend and Oracle RDS systems, within the client's existing Talend framework and patterns.

Experience required:

  • Experience of ETL tooling, preferably Talend, but Pentaho/Informatica experience will be transferable.
  • Experience working in Oracle RDS databases.
  • Data ETL product experience - Talend preferred
  • Oracle RDS

Nice to have:

  • SQL
  • AWS
  • GenAI

If you are interested in this role, please feel free to submit your CV.

Senior Data Engineer
Pontoon
Warwick
Hybrid
Senior
£500/day - £550/day

Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone’s chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.

Join Our Team as a Senior Data Engineer!

Are you a passionate Data Engineer with a flair for innovation? Do you thrive in a dynamic environment where your skills can shape the future of data architecture? If so, we have the perfect opportunity for you! Our client, a leader in the Utilities sector, is seeking a Senior Data Engineer for a temporary role of 3 months.

Role: Senior Data Engineer

Duration: 3 Months (extension options)

Location: Warwick (Hybrid - 1 day on site)

Rate: £500 - £550 per day (umbrella)

Role Overview: As a Senior Data Engineer, you will play a pivotal role in enhancing the Interconnectors Data Platform (ICDP), a cloud-based data warehouse essential for commercial, financial modeling, and operational decision-making. With the platform evolving towards a modernized Medallion Architecture and Azure-native ingestion patterns, your expertise will drive architectural direction and technical leadership.

Key Responsibilities:

Data Architecture & Platform Engineering:

  • Lead the design and implementation of scalable data architectures using Bronze/Silver/Gold layered models.
  • Shape the platform’s architectural roadmap, ensuring alignment with cutting-edge engineering practices.
  • Develop secure and observable ingestion and transformation pipelines.
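The Bronze/Silver/Gold layering above can be sketched in plain Python — a toy in-memory version with hypothetical meter-reading records (a production version would run on Spark, Databricks, or ADF, but the layer responsibilities are the same):

```python
from collections import defaultdict

# Bronze: raw records exactly as ingested, faults and all.
bronze = [
    {"meter": "A", "reading": "100.5", "ts": "2024-01-01"},
    {"meter": "A", "reading": "bad",   "ts": "2024-01-02"},  # unparseable
    {"meter": "B", "reading": "200.0", "ts": "2024-01-01"},
    {"meter": "B", "reading": "210.0", "ts": "2024-01-02"},
]

def to_silver(rows):
    """Silver: validated and typed; invalid rows are dropped here
    (a real pipeline would quarantine them for reconciliation)."""
    silver = []
    for row in rows:
        try:
            silver.append({"meter": row["meter"],
                           "reading": float(row["reading"]),
                           "ts": row["ts"]})
        except ValueError:
            continue
    return silver

def to_gold(rows):
    """Gold: business-level aggregate — total reading per meter."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["meter"]] += row["reading"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)   # {"A": 100.5, "B": 410.0}
```

Keeping Bronze immutable is what makes replay and backfill possible: Silver and Gold can always be rebuilt from the raw layer after a validation rule changes.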

Pipeline Development & Operations:

  • Spearhead the migration from legacy ETL tools to modern Azure-based pipelines, using Azure Functions, Azure Data Factory (ADF), and event-driven frameworks.
  • Build and maintain high-performance SQL transformations, curated layers, and reusable data models.
  • Embed CI/CD, testing, version control, and observability into workflows.

Data Quality & Governance:

  • Ensure robust data validation, reconciliation, profiling, and auditability across platform layers.
  • Collaborate with business stakeholders to guarantee analytical and operational needs are met.

Leadership:

  • Mentor fellow data engineers, fostering technical growth within the ICDP team.
  • Collaborate with Product teams, IT&D, and external partners to achieve high-quality outcomes.
  • Serve as a technical authority on engineering approaches, patterns, and standards.

Required Skills & Experience:

Essential Technical Skills:

  • Python: Strong hands-on experience in building production-grade data pipelines and orchestration.
  • Advanced SQL: Expert-level skills in analytical SQL, query optimization, and data modeling.
  • Azure Cloud: Familiarity with Azure Functions, Azure Data Factory, Azure Storage, and cloud security fundamentals.
  • Data Warehousing: In-depth understanding of data architecture principles and scalable enterprise data design.
  • Version Control: Proficient in Git, CI/CD, automated testing, and modern engineering practices.
  • Pipeline Design: Experience with API ingestion, SFTP ingestion, and resilient pipeline design.
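Resilient pipeline design of the kind listed above usually starts with retry-and-backoff around flaky sources. A minimal sketch with a simulated source — no real API is involved, and the names are illustrative:

```python
import time

def with_retries(fetch, attempts=3, base_delay=0.01):
    """Call fetch(), retrying on failure with exponential backoff.
    Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fetch()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

class FlakySource:
    """Simulated API that fails twice before succeeding."""
    def __init__(self):
        self.calls = 0

    def fetch(self):
        self.calls += 1
        if self.calls < 3:
            raise ConnectionError("transient network error")
        return [{"id": 1, "value": 42}]

source = FlakySource()
rows = with_retries(source.fetch)
```

The same wrapper applies to API and SFTP ingestion alike; in an Azure Functions or ADF setting the retry policy typically moves into platform configuration, but the idempotent-fetch design stays with the engineer.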

Soft Skills:

  • Exceptional problem-solving and architectural thinking abilities.
  • Strong communication and stakeholder collaboration skills.
  • Capability to lead and provide clarity in complex technical environments.

Desirable Experience:

  • Involvement in data-platform re-architecture programs.
  • Exposure to Medallion/Lakehouse patterns or Databricks-style ecosystems.
  • Experience in regulated or high-assurance data environments.

Why Join Us? This is your chance to be part of a transformative journey in the Utilities industry! Not only will you be enhancing your skills, but you will also contribute to a vital platform that impacts decision-making at every level.

If you’re ready to take on this exciting challenge and make a significant impact, we want to hear from you! Apply now and become a key player in our client’s innovative team!

Candidates will ideally show evidence of the above in their CV to be considered.

Please be advised if you haven’t heard from us within 48 hours then unfortunately your application has not been successful on this occasion, we may however keep your details on file for any suitable future vacancies and contact you accordingly.

We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.

Frequently asked questions

What types of contract Data Engineer jobs are available on Haystack?
Haystack features a wide range of contract Data Engineer positions, including short-term, long-term, remote, and on-site roles across various industries such as finance, healthcare, and technology.

How do I apply for a contract Data Engineer job?
To apply, simply create a profile on Haystack, upload your resume, and submit applications directly through the platform to any contract Data Engineer job that matches your skills and experience.

Can I work remotely as a contract Data Engineer?
Yes, many contract Data Engineer listings on Haystack offer remote or hybrid working options. You can filter job searches by location and remote availability to find the best fit.

What skills do contract Data Engineer roles typically require?
Typically, contract Data Engineers should have experience with big data tools (like Hadoop, Spark), SQL, Python/Scala, ETL processes, and data warehousing solutions. Specific requirements may vary by job.

Does Haystack handle contract negotiations?
While Haystack facilitates job postings and applications, contract negotiations including rates and terms are generally handled between you and the hiring company or recruiter directly.