Make yourself visible and let companies apply to you.
Roles
Contract Data Engineer Jobs
Overview
Looking for contract Data Engineer jobs? Explore top freelance and contract opportunities for Data Engineers on Haystack. Find flexible, high-paying gigs where you can design, build, and optimize data pipelines using the latest tools and technologies. Start your contract Data Engineer career today with Haystack’s curated listings!
Data Analytics Engineer
Teksystems
Normanton
Hybrid
Senior
Private salary
RECENTLY POSTED

Job Ad: Analytics Engineer (6-Month Initial Contract)

Role: Analytics Engineer

Location: Hybrid London

Contract Length: 6-Month Initial Contract (likely extension)

Client: A large insurance company

Industry Preference: Financial Services or Insurance experience strongly preferred

About the Role

We are supporting a large insurance company in hiring a skilled Analytics Engineer for a 6-month initial contract. You'll join a data function operating with modern engineering practices, helping to build robust, well-tested, scalable analytics models that power key decision-making across the business.

What You'll Be Doing

  • Build, test, and maintain analytics models using dbt Core, ensuring data accuracy and reliability.
  • Create comprehensive dbt tests and documentation to support transparency and quality.
  • Collaborate with the team through Git-based workflows (branches, PRs, code reviews).
  • Partner with analysts to convert business needs into scalable, production-ready data models.
  • Develop and maintain dbt models on on-prem MSSQL environments.
  • Use Python for data transformation, automation, and workflow improvements.
  • Uphold data quality, modular design, and analytics engineering best practices.
  • Continuously optimise data pipelines and support enhancements to the wider analytics infrastructure.
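
The dbt tests mentioned above largely boil down to simple column assertions. As a rough pure-Python sketch of what dbt's built-in `unique` and `not_null` schema tests check (table and column names here are invented example data, not from the client):

```python
# Hypothetical illustration: the two most common dbt schema tests,
# `unique` and `not_null`, reduce to column checks like these.

def check_not_null(rows, column):
    """Return rows where the column is missing a value (dbt: not_null test)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values appearing more than once in the column (dbt: unique test)."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

policies = [
    {"policy_id": 1, "premium": 120.0},
    {"policy_id": 2, "premium": None},
    {"policy_id": 2, "premium": 95.0},
]

null_failures = check_not_null(policies, "premium")   # one failing row
dupe_failures = check_unique(policies, "policy_id")   # policy_id 2 duplicated
```

In dbt itself these would be declared in a model's YAML file rather than written by hand; the point is only that a "test" is a query returning failing rows.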

What We're Looking For

  • Strong hands-on experience with dbt Core (modelling, testing, documentation).
  • Skilled in building models for on-prem Microsoft SQL Server (MSSQL).
  • Excellent SQL skills with a focus on performance optimisation.
  • Solid Python experience in data or analytics engineering contexts.
  • Comfortable working in Git-based, collaborative engineering environments.
  • Strong understanding of data warehousing, dimensional modelling, ELT/ETL.
  • Clear communicator with strong documentation habits.
  • Ability to work closely with non-technical stakeholders.

Industry Preference

Because of the nature and regulatory environment of the client, we are especially interested in candidates from:

  • Insurance (highly preferred)
  • Financial Services

Such backgrounds tend to align strongly with the systems, data models, and governance structures used by the client.

Nice to Have

  • Experience with BI tools (Power BI, etc.)
  • Exposure to CI/CD for analytics engineering
  • Familiarity with data governance and security concepts

Why Join Us?

  • Work with a large, established insurance organisation undergoing meaningful data modernisation
  • Collaborate with a high-performing, engineering-led data team
  • Highly extendable contract with ongoing project pipeline
  • Opportunity to influence core data models used across the business

Job Title: Data Analytics Engineer

Location: London, UK

Job Type: Contract

Trading as TEKsystems. Allegis Group Limited, Maxis 2, Western Road, Bracknell, RG12 1RT, United Kingdom. No. 2876353. Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as “Allegis Group”). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands. If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice available at https://www.allegisgroup.com/en-gb/privacy-notices.

To access our Online Privacy Notice, which explains what information we may collect, use, share, and store about you, and describes your rights and choices about this, please go to https://www.allegisgroup.com/en-gb/privacy-notices.

We are part of a global network of companies and as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland and European Economic Area subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland and the USA. If you would like to exercise your privacy rights, please visit the “Contacting Us” section of our Online Privacy Notice at https://www.allegisgroup.com/en-gb/privacy-notices for details on how to contact us. To protect your privacy and security, we may take steps to verify your identity, such as a password and user ID if there is an account associated with your request, or identifying information such as your address or date of birth, before proceeding with your request. If you are resident in the UK, EEA or Switzerland, we will process any access request you make in accordance with our commitments under the UK Data Protection Act, EU-U.S. Privacy Shield or the Swiss-U.S. Privacy Shield.

Python Developer (GIS)
Halian Technology Limited
Bath
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

Location: London / Hybrid
Salary: Competitive

The Role

We are looking for a Python Developer with strong GIS experience to join a growing engineering team building geospatial data platforms used by global organisations.

This role sits within the Client Solutions team, responsible for developing tools and data pipelines that deliver risk analytics and intelligence in a geospatial context.

You'll work in a modern engineering environment using Python, Django, AWS and geospatial data tooling, building scalable systems that process large volumes of spatial data and power mapping solutions used by clients worldwide.

This is a strong opportunity for an engineer who enjoys building real data platforms rather than scripts, and wants to work on meaningful problems around global risk, sustainability and data intelligence.

Responsibilities

  • Design and develop Python-based geospatial data pipelines
  • Build and improve mapping solutions and GIS data workflows
  • Work within an Agile engineering team delivering continuous improvements
  • Collaborate with developers, analysts and data scientists
  • Deliver well-tested, maintainable code
  • Review code and contribute to improving engineering standards
  • Mentor junior developers where appropriate

Required Skills

  • Strong Python development experience
  • Experience working with GIS data pipelines
  • Experience with geospatial libraries such as Fiona, Shapely, Rasterio, NumPy
  • Experience working with GDAL and spatial data processing
  • Strong understanding of vector and raster data
  • Experience working with PostgreSQL / SQL databases
  • Knowledge of cloud environments (AWS preferred)
  • Experience designing data pipelines and backend systems

Nice to Have

  • Experience with Django
  • Experience with GeoServer
  • Knowledge of spatial projections
  • Experience with ETL pipelines or data lakes
  • Experience working with large geospatial datasets

What You’ll Work With

  • Python
  • Django
  • AWS
  • PostgreSQL
  • GDAL / Rasterio / Shapely
  • Geospatial datasets
  • Modern data pipelines

Why Join

  • Work with large-scale geospatial datasets
  • Build tools used by global organisations
  • Modern engineering stack and cloud-native architecture
  • Collaborative team environment
  • Opportunity to influence platform architecture and data pipelines

How to Apply

If you're a Python Developer with strong GIS experience looking for an exciting new opportunity building real geospatial data platforms, get in touch.

Please apply if interested and we'll aim to respond within 24 hours.

Data Scientist - Defence - DV Clearance
Arqtech Search Ltd
Huntingdon
In office
Mid - Senior
Private salary
RECENTLY POSTED
+1

DV Data Scientist

Location: RAF Wyton, Huntingdon (4-5 days onsite)

Clearance: Active DV Essential

Contract: 12 months Inside IR35 (extension likely up to 3 years) or Permanent role

Rate: Negotiable

Programme Overview
Arqtech Search is supporting a defence delivery partner on a secure data transformation programme within an RAF operational environment. The programme focuses on designing, sustaining and evolving secure data platforms that underpin intelligence, surveillance and operational decision-support capabilities. The Data Scientist will play a critical role in delivering resilient, maintainable, and scalable AI solutions.

Key Responsibilities

  • Maintain and continuously improve the AS AI computer vision model operating in production.
  • Support and manage the COTS model to ensure consistent performance.
  • Expand the solution's functionality through ongoing development, introducing additional features and new use cases.
  • Uphold adherence to security, ethical, and regulatory requirements.
  • Deliver AI solutions that are robust, maintainable, and scalable.

Required Skills

  • Programming: Advanced proficiency in Python.
  • ML Engineering & MLOps: Hands-on experience with widely used ML tooling (e.g., MLflow, Airflow, Docker, Kubernetes).
  • CI/CD & DevOps: Familiarity with CI/CD workflows and automation tools (e.g., GitLab, Jenkins) for streamlined deployments.
  • Data Engineering Fundamentals: Knowledge of ETL (Extract, Transform, Load) processes, data warehousing, and streaming architectures.
  • System Architecture: Experience designing and implementing scalable, cloud-native pipelines and systems.

Due to the nature of this work, active DV clearance is essential, and 5 days per week are required on-site at RAF Wyton as part of a nine-day condensed working fortnight (every other Friday off). We cannot consider candidates who do not currently hold an active DV clearance.

Principal Engineer - Databases
Teksystems
North West England
In office
Senior - Leader
£500/day - £600/day
RECENTLY POSTED
+2

Description

Role Summary

The Principal Engineer is the senior technical authority responsible for setting the engineering direction, ensuring platform reliability, and driving innovation across a critical enterprise technology domain. This role provides deep technical leadership, shapes long-term strategy, and ensures that engineering teams deliver secure, scalable, and resilient services that underpin the organisation's digital ecosystem. The Principal Engineer acts as the highest-level hands-on expert, partnering with architects, product owners, and engineering squads to define standards, modernise platforms, and embed automation and DevOps practices across all services.

Technical Leadership

  • Serve as the domain's foremost technical expert, accountable for engineering excellence and long-term platform strategy.
  • Define and maintain technical standards, patterns, and best practices across the domain.
  • Lead complex design decisions, ensuring solutions are secure, scalable, and aligned with enterprise architecture.
  • Drive adoption of automation, DevOps tooling, and platform-as-a-product principles across all engineering teams.

Platform Ownership

  • Oversee the health, performance, and lifecycle of core platforms within the domain.
  • Ensure capacity planning, resilience, backup, recovery, and disaster readiness are embedded into all services.
  • Champion observability, monitoring, and proactive incident prevention.

Innovation & Modernisation

  • Identify opportunities to modernise legacy platforms, reduce technical debt, and introduce new technologies.
  • Evaluate emerging tools, frameworks, and architectures relevant to the domain.
  • Lead proof-of-concepts and guide engineering teams through adoption.

Collaboration & Influence

  • Partner with cross-domain Principal Engineers to ensure cohesive enterprise-wide engineering standards.
  • Work closely with product, security, architecture, and operations teams to deliver integrated solutions.
  • Mentor senior engineers and uplift engineering capability across the organisation.

Governance & Risk Management

  • Ensure compliance with security, regulatory, and operational standards.
  • Provide technical oversight for major changes, upgrades, and transformation initiatives.
  • Act as an escalation point for critical incidents and complex technical challenges.

Job Description: Database Engineering (Oracle, ExaCC, PostgreSQL, MongoDB)

We're looking for a Principal Database Engineer with deep expertise across enterprise-grade and open-source database platforms, including Oracle, Exadata Cloud@Customer (ExaCC), PostgreSQL, and MongoDB. You'll design, operate, and optimise mission-critical data services that support high-volume, high-availability applications. This role blends hands-on engineering with architectural thinking, ensuring our data platforms are secure, scalable, and performant.

Key Responsibilities

Oracle & ExaCC Engineering

  • Administer and optimise Oracle databases running on Exadata and Exadata Cloud@Customer platforms.
  • Perform capacity planning, performance tuning, and storage optimisation for ExaCC environments.
  • Manage Oracle RAC, Data Guard, RMAN, and advanced Oracle features for high availability and disaster recovery.
  • Support patching, upgrades, and lifecycle management across Oracle estates.

PostgreSQL Administration

  • Deploy, configure, and maintain PostgreSQL clusters in production environments.
  • Implement replication, backup strategies, and failover mechanisms.
  • Optimise query performance, indexing strategies, and storage utilisation.
  • Develop automation for provisioning, monitoring, and patching PostgreSQL instances.

MongoDB Operations

  • Manage MongoDB clusters, including sharding, replication, and scaling strategies.
  • Monitor and tune performance for document-based workloads.
  • Implement backup, restore, and disaster recovery processes for MongoDB environments.
  • Ensure data integrity, schema design best practices, and operational resilience.

Cross-Platform Database Engineering

  • Build automation for database provisioning, configuration, and compliance using IaC and scripting.
  • Implement robust monitoring and observability for all database platforms.
  • Collaborate with application teams to design schemas, optimise queries, and troubleshoot performance issues.
  • Ensure security hardening, auditing, and adherence to data governance standards.
  • Participate in on-call rotations supporting critical database services.

Skills & Experience

Essential

  • Strong hands-on experience with Oracle Database and Exadata/ExaCC platforms.
  • Solid PostgreSQL administration skills in production environments.
  • Practical experience managing MongoDB clusters at scale.
  • Proficiency in SQL, PL/SQL, and scripting languages (Python, Bash, PowerShell).
  • Understanding of replication, clustering, high availability, and disaster recovery patterns.
  • Familiarity with database performance tuning and query optimisation.
  • Experience with automation tools (Terraform, Ansible, Liquibase, Flyway, etc.).

Skills

  • oracle
  • exadata
  • mongodb

Job Title: Principal Engineer - Databases

Location: Manchester, UK

Rate/Salary: 500.00 - 600.00 GBP Daily

Job Type: Contract

Trading as TEKsystems. Allegis Group Limited, Maxis 2, Western Road, Bracknell, RG12 1RT, United Kingdom. No. 2876353. Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as “Allegis Group”). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands. If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice available at https://www.allegisgroup.com/en-gb/privacy-notices.


Data Engineer / Analyst
Opus Recruitment Solutions
Nottingham
Hybrid
Junior - Mid
£425/day - £475/day
RECENTLY POSTED

Technical Data Engineer / Analyst | Contract | Azure | 6 months | Hybrid | Outside IR35 | £425 - £475 | Nottingham

We're supporting a major data transformation programme and looking for a Technical Data Analyst to help migrate legacy systems into a modern Azure data platform. This is a hybrid role, 3 days a week in the Nottingham office, which is non-negotiable. It is an ideal role for someone who is looking for hands-on experience with Azure analytics tools and thrives in a data-heavy environment.

What you'll be doing:

  • Analysing, understanding, and documenting legacy data structures
  • Supporting data model mapping from old systems into the new Azure platform
  • Validating migrated datasets for accuracy and completeness
  • Creating reporting assets and dashboards using Power BI
  • Working with engineers to identify data issues, test pipeline outputs, and improve overall data quality

Key Skills

  • Strong SQL skills and experience working with Azure-hosted datasets
  • Power BI reporting/dashboard creation
  • Python experience
  • Exposure to Azure Data Lake, Databricks, or Data Factory is beneficial

If this role suits your skill set, you are immediately available, and you can work 3 days a week in the office near Nottingham, please apply or send your CV to (url removed).

Technical Data Engineer / Analyst | Contract | Azure | 6 months | Hybrid | Outside IR35 | £425 - £475 | Nottingham

Software Engineer
McGregor Boyall Associates Limited
City of London
Hybrid
Mid
£500/day - £700/day
RECENTLY POSTED

Mid-Level Engineer - Technology & Innovation

London, UK | Hybrid 3 days a week in office.

We’re hiring a Mid-Level Engineer to join a growing Technology & Innovation team within one of the UK and Ireland’s leading Corporate Finance advisory firms.

Reporting to the CTO, you’ll help deliver and manage our cloud-agnostic data & AI platform, scale client digital products, and work across cutting-edge technologies including AI, cloud platforms, and data engineering. This role sits at the heart of our strategy to unlock value from data across real assets, sustainability, and essential services.

What you’ll do

Build and enhance our cloud platforms and data infrastructure

Deploy AI/ML and GenAI solutions into production

Drive automation and operational efficiency using AI and RPA

Support development of client-facing digital platforms and dashboards

Modernise systems into cloud-native, API-first architectures

Key skills

Cloud engineering (Azure, Kubernetes, serverless, DevOps)

Python and relational databases (SQL)

Data platforms, ETL/ELT, and analytics

AI tools and LLM integrations (e.g. GPT, Claude, Gemini)

Security, governance, and regulated environment experience

Why this team?

Work in a high-performing, cooperative team

Exposure to advanced AI and data technologies

Opportunity to shape technology strategy in corporate finance

A values-driven B-Corp organisation focused on Finance with a Purpose

If you’re an ambitious technologist who enjoys solving complex problems and building impactful platforms, please apply below.

*This role does not offer Visa sponsorship*

McGregor Boyall is an equal opportunity employer and do not discriminate on any grounds.


Aircraft Maintenance CAMO Data Management Support - Brize Norton & Remote - 6 months+
Octopus Computer Associates
Carterton
Fully remote
Junior - Mid
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE

One of our Blue Chip Clients is urgently looking for a Data Management Support contractor (Aircraft Maintenance CAMO Data Support). This role is fully remote, but candidates need to be within a 45-minute commute of Brize Norton, as per the Client's request, due to the on-call element of the role.

CONTRACTOR MUST BE ELIGIBLE FOR BPSS. MUST BE PAYE THROUGH UMBRELLA.

Areas of experience:

  • Aerospace manufacturing
  • Manufacturing process capture and document creation
  • Technical areas:
    • Aerospace primary structure
    • Composite (carbon fibre): composite wing manufacture, composite primary structure (curing/co-bonding), inspection methods, manufacturing methods
    • Metallic: treatments, inspection methods
    • Mechanical assembly and drilling (metallic/composite), assembly, machining
    • Structural adhesives, paints, sealants

Shift pattern:

  • Early shift: 07:00 to 15:00
  • Central shift: 09:00 to 17:00
  • Late shift: 12:30 to 20:00
  • Primary shift: 20:00 to 07:00 (only need to go to the office if an aircraft lands)
  • Weekend shift: currently from 07:00 Saturday morning to 07:00 Monday morning

Please send a CV for full details and immediate interviews. We are a preferred supplier to the client.

RPA & Data Automation Developer
Adecco
Surrey
Hybrid
Mid - Senior
£600/day - £700/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE

Job Title: RPA & Data Automation Developer
Contract Type: 6 month contract
Inside IR35 - £550-£700 per day (umbrella rate)
Location: Hybrid working - Surrey

Are you ready to take your career to the next level in the world of Robotic Process Automation (RPA) and Data Automation? We’re on the lookout for a passionate RPA & Data Automation Developer to drive efficiency, improve data quality and deliver actionable insights. If you thrive in a collaborative environment and enjoy working with cutting-edge technologies, we want to hear from you!

Key Responsibilities:

  • Data Design & Preparation: Design processes for preparing, enriching, and documenting data using semantic models, Lakehouses and data warehouses to enable insightful analysis.
  • Automation Proficiency: Utilize multiple automation technologies, including AI, ML, Power Automate and Power Apps, to streamline data access and empower developers and analysts.
  • Transform & Test Data: Transform and rigorously test data using dataflows, procedures and notebooks to design user-friendly visualizations that uncover valuable insights.
  • Data Storage Solutions: Implement robust storage and querying strategies for Lakehouses and data warehouses, ensuring a single version of the truth across the organization.
  • Stakeholder Communication: Engage with both technical and non-technical stakeholders to understand business requirements and communicate potential insights effectively.
  • Quality Assurance: Conduct careful testing of data lists and aggregations, creating UAT parameters and checklists to ensure accuracy and enable business sign-off.
  • Collaborative Governance: Work alongside other team members to design and document solutions while establishing strong governance and control processes.
  • Data Flow Analysis: Analyze and document data flows to meet corporate standards, ensuring reusability and maximizing insights for informed decision-making.

What We’re Looking For:

  • Experience with RPA tools.
  • Strong knowledge of data automation technologies, including Power Automate, Power Apps and data storage solutions.
  • Excellent analytical skills with a keen eye for detail.
  • Ability to communicate complex concepts clearly to a diverse audience.
  • A collaborative spirit with a commitment to achieving business objectives.

If you’re excited about the possibility of making a difference through RPA and Data Automation, we’d love to hear from you! Apply now and become a vital part of our mission to enhance public services through data-driven insights.

Let’s innovate together!

Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.

Contract Data Scientist - Hounslow, 3 days a week - £500 inside IR35
Exalto Consulting ltd
London
Hybrid
Mid - Senior
£500/day
RECENTLY POSTED
+2

Exalto Consulting is currently recruiting for a contract Data Scientist. The role is inside IR35, paying £500 per day, and requires being on site 3 days a week in Hounslow.

Skills/capabilities
• Strong knowledge of machine learning and optimisation techniques, including supervised learning (regression, tree methods, etc.), unsupervised learning (clustering), and operations research (linear and mixed-integer programming, heuristics)
• Fluent in Python (required) and other programming languages (preferred), with strong skills in applying DS, ML, and OR packages (scikit-learn, pandas, numpy, gurobi, etc.) to solve real-life problems and visualise the outcomes (e.g. seaborn)
• Proficient in working with cloud platforms (AWS preferred), code versioning (Git), and experiment tracking (e.g. MLflow)
• Nice to have: experience with cloud-based ML tools (e.g. SageMaker), data and model versioning (e.g. DVC), CI/CD (e.g. GitHub Actions), workflow orchestration (e.g. Airflow/Dagster), and containerised solutions (e.g. Docker, ECS)
• Experience in code testing (unit, integration, end-to-end tests)
• Strong data engineering skills in SQL and Python
• Proficient in use of Microsoft Office, including advanced Excel and PowerPoint skills
• Advanced analytical skills, including the ability to apply a range of data science and analytic techniques to quickly generate accurate business insights
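
As a rough pure-Python sketch of the unsupervised-learning side of the skill set above (in practice this would be scikit-learn's `KMeans`; the points and starting centroids are invented example data):

```python
# Lloyd's algorithm for k-means clustering: alternate between assigning
# points to their nearest centroid and moving centroids to cluster means.

def kmeans(points, centroids, iterations=10):
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            # Assign each point to its nearest centroid (squared distance).
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Move each centroid to the mean of its assigned points.
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids

points = [(1.0, 1.0), (1.2, 0.8), (8.0, 8.0), (7.8, 8.2)]
centroids = kmeans(points, centroids=[(0.0, 0.0), (10.0, 10.0)])
```

A library implementation adds the details this omits: convergence checks, multiple random initialisations, and vectorised distance computation.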

If you have the above experience and are looking for a new contract role, please send your CV for immediate consideration, as our client is looking to hire ASAP.

Contract Data Scientist - Hounslow, 3 days a week - £500 inside IR35

Senior Data Engineer
VC Talent
London
Hybrid
Senior
£59,000/year - £64,000/year
RECENTLY POSTED
+2

A successful FTSE-listed organisation with a friendly, close-knit culture is looking for a Senior Data Engineer to help shape and deliver its evolving data platform. This is a hands-on, high-impact role in a smaller organisation where you will act as both a senior engineer and strategic data partner, designing solutions that support both operational and long-term business goals.

You will play an important role in the company’s migration to Microsoft Fabric, while continuing to build and optimise its existing SQL Server-based data warehouse environment. While primarily a backend engineering role, you will also support reporting via SSRS/Microsoft Power BI, with an increasing focus on AI-driven capabilities over time. As the Senior Data Engineer, you will work closely with the Solutions Architect and collaborate with teams across the business, with occasional travel to France and Germany. This opportunity suits someone with in-depth SQL experience who is looking to work with modern tools like Microsoft Fabric.

Essential Skills

  • Strong SQL Server experience (T-SQL, SSIS, SSRS, stored procedures, functions, triggers)
  • Data warehouse architecture, build and maintenance
  • Excellent communication skills and ability to gather requirements from non-technical stakeholders at all levels
  • Degree in Computer Science or a STEM subject
  • Comfortable working 4 days per week onsite in Vauxhall

Desirable

  • Microsoft Fabric, Azure Synapse, Azure Data Factory, Snowflake, Databricks
  • Python or any other relevant language
  • Experience with tools such as Power BI, Qlik or Tableau

The role offers a package of £64k+, plus a bonus of up to 15%, along with a generous pension, 28 days' holiday, private medical insurance, permanent health insurance, study support, and a modern office with an onsite gym. If your experience aligns, apply with an up-to-date CV as soon as possible, as this is expected to be a popular opportunity.

Azure Data Engineer
Opus Recruitment Solutions
Bristol
Hybrid
Mid - Senior
£400/day - £500/day
RECENTLY POSTED

Azure Data Engineer | £400 - £500 Outside IR35 | Bristol | Hybrid | 6‑Month Initial Term |

A large‑scale data transformation programme is underway, and our client is looking for an experienced Azure Data Engineer to support the rebuild of their cloud data platform. This role is hands‑on and delivery‑focused — you’ll be designing and developing Azure‑native data pipelines, working extensively with Databricks, and shaping scalable data models across the Microsoft ecosystem. The role would require you to be on site in Bristol 4 days per week, please only apply for this position if you are local enough to do this without relocating.

What you’ll be doing

Build, enhance and maintain data pipelines using Azure Databricks, Data Factory, and Delta Lake

Develop and optimise Lakehouse components and cloud‑based data flows

Create robust data models to support analytics, MI and downstream reporting

Assist in migrating legacy warehouse assets into a modern Azure environment

Contribute to cloud architecture decisions, data standards and best‑practice engineering patterns

Develop reliable Python and PySpark code to support data ingestion, transformation, and end‑to‑end processing.

What you’ll bring

Strong hands‑on experience across Azure Data Services (ADF, ADLS, Synapse, Databricks)

Excellent SQL skills, with experience in performance tuning and optimisation

Solid understanding of data modelling (star schema, medallion, ETL frameworks)

Ability to work with complex, inconsistent or legacy data sources

Experience building scalable, production‑ready pipelines in a cloud environment
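
To illustrate the star-schema modelling mentioned above, here is a tiny pure-Python sketch of a fact table joining to a dimension by surrogate key (table contents are invented example data; in the role this would live in Databricks/SQL rather than Python dicts):

```python
# Hypothetical star-schema fragment: sales fact rows keyed to a small
# date dimension, aggregated by a dimension attribute (revenue per quarter).

dim_date = {
    1: {"date": "2025-01-01", "quarter": "Q1"},
    2: {"date": "2025-04-01", "quarter": "Q2"},
}
fact_sales = [
    {"date_key": 1, "amount": 100.0},
    {"date_key": 1, "amount": 50.0},
    {"date_key": 2, "amount": 70.0},
]

revenue = {}
for row in fact_sales:
    # Join fact to dimension on the surrogate key, then group by quarter.
    q = dim_date[row["date_key"]]["quarter"]
    revenue[q] = revenue.get(q, 0.0) + row["amount"]
```

The design choice the star schema encodes is exactly this shape: narrow fact tables of measures plus surrogate keys, with descriptive attributes pushed out into dimensions.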

Azure Data Engineer | £400 - £500 Outside IR35 | Bristol | Hybrid | 6‑Month Initial Term

Lead PySpark Engineer
Randstad Technologies Recruitment
London
Remote or hybrid
Senior
£281/day - £292/day
RECENTLY POSTED

PySpark Engineer Lead

As the Technical Lead, you will drive the high-stakes migration of legacy SAS analytics to a modern, cloud-native PySpark ecosystem on AWS. This isn’t just a lift-and-shift: you will refactor complex procedural logic into scalable, production-ready distributed pipelines for a Tier-1 financial services environment.

Core Responsibilities

  • Engineering Leadership: Design and develop complex ETL/ELT pipelines and Data Marts using PySpark, EMR, and Glue.
  • Legacy Modernisation: Architect the conversion of SAS Base/Macros into modular, testable Python code using SAS2PY and manual refactoring.
  • Performance Tuning: Optimise Spark execution (partitioning, shuffling, caching) to ensure cost-efficient processing of massive financial datasets.
  • Quality & Governance: Implement rigorous CI/CD, unit testing, and data reconciliation frameworks to ensure “penny-perfect” accuracy.

Technical Stack

  • Engine: PySpark (Expert), Python (Clean Code/SOLID principles).
  • AWS: EMR, Glue, S3, Athena, IAM, Lambda.
  • Data Modeling: SCD Type 2, Fact/Dimension tables, Data Vault/Star Schema.
  • Legacy: Proficiency in reading/debugging SAS (Base, Macros, DI Studio).
  • DevOps: Git-based workflows, Jenkins/GitLab CI, Terraform.
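
The SCD Type 2 modelling listed in the stack can be illustrated with a small pure-Python sketch (production code would run in PySpark on EMR/Glue; the column names are assumptions for the example):

```python
from datetime import date

def apply_scd2(dimension, updates, key="customer_id", today=None):
    """Apply Slowly Changing Dimension Type 2 updates in place.

    For each changed key, close out the current row (set end_date and
    is_current=False) and append a new current row, preserving history.
    """
    today = today or date.today().isoformat()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for upd in updates:
        old = current.get(upd[key])
        if old is not None and old["attributes"] == upd["attributes"]:
            continue  # nothing changed: keep the existing current row
        if old is not None:
            old["end_date"] = today     # close the superseded version
            old["is_current"] = False
        dimension.append({
            key: upd[key],
            "attributes": upd["attributes"],
            "start_date": today,
            "end_date": None,
            "is_current": True,
        })
    return dimension
```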

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Data Scientist - SC Cleared
Hays Technology
London
Hybrid
Mid - Senior
£500/day - £550/day
RECENTLY POSTED

Your new company
One of the most influential Central Government Organisations in the current economic climate
Your new role
Data Scientist - SC Cleared - SQL, Python & R
What you’ll need to succeed
My client is looking for an Analytical Data Scientist, leading/working alongside a team of data scientists to deliver key outputs for commissioned projects (use cases).
You will also support the development of GSCIP through developing tools, data visualisations, and data available for analysis.
You will have the opportunity to work on bespoke data science projects to improve understanding and interpretation of the data, and enhance use case delivery capability.
This role can only be offered to candidates with Active and Existing SC or DV Clearance.

Essential Criteria:

  • Experience of delivering high-quality coding projects that make use of at least two of: SQL, Python & R
  • Experience of engaging with stakeholders across government to scope and deliver impactful analysis
  • Experience of leading teams through complex data science projects - a track record of delivering complex data science projects on time to meet user needs, and overcoming any challenges
  • A demonstrable commitment to developing your knowledge and expertise
  • Significant technical data science knowledge

Desirable Criteria:

  • Understanding of supply chain data sources
  • Experience with Pyspark/Spark SQL
  • Experience delivering interactive visualisations in Python and SQL
  • Graph data experience, including with graph query languages and knowledge graphs
  • Experience working on technical commercial procurements
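
As a toy illustration of the graph data experience listed above, a knowledge graph can be modelled as subject-predicate-object triples and queried with wildcards (real work would use a graph query language such as SPARQL or Cypher; the supply-chain triples here are invented):

```python
def query(triples, subject=None, predicate=None, obj=None):
    """Match (subject, predicate, object) triples; None acts as a wildcard."""
    return [
        t for t in triples
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]
```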

This is a hybrid role at 40% office attendance, and Monday is a mandatory team day. Successful candidates will join ASAP, with interviews commencing from 16/3/26.

What you’ll get in return
This is an excellent role to join the GSCI Programme as an experienced Data Scientist, ensuring existing delivery and data standards are maintained and services scaled up!
What you need to do now

If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed)

Data Engineer
TEAM
Chandler's Ford
Hybrid
Mid - Senior
£500/day
RECENTLY POSTED

A Data Engineer is needed for a contract where your work will directly shape how a business trusts, structures, and uses its data. If you enjoy building reliable pipelines, improving models, and turning messy data into dependable assets, this is the kind of project where your impact is felt quickly. This role focuses on practical delivery: you’ll be strengthening the foundations of analytics and reporting by building dependable solutions that teams across the organisation rely on every day.

What’s in it for you

  • £500 per day contract with immediate impact on a growing environment
  • Hybrid working with a balanced onsite and remote setup
  • A delivery-focused project where practical engineering skills are valued
  • The opportunity to improve and shape core assets used across the business
  • A collaborative environment working closely with technical teams and stakeholders
  • Real ownership over the reliability and structure of pipelines and models

What you’ll be getting stuck into as a Data Engineer

  • Building and maintaining scalable pipelines that support analytics, reporting, and operational data use
  • Developing and refining warehouse models that align with real business requirements
  • Writing and optimising SQL for transformation, integration, and performance improvements
  • Strengthening quality through validation, governance, and structured data workflows
  • Delivering reliable, accessible datasets for reporting and decision-making
  • Supporting monitoring, testing, and continuous improvement across data processes

What you’ll bring to the table as a Data Engineer

  • Strong hands-on experience delivering practical solutions
  • Strong SQL capability for transformation, modelling, and optimisation
  • Previous experience designing and working with data warehouse models
  • Experience building and maintaining production pipelines
  • Exposure to platforms such as Databricks, Synapse, or Microsoft Fabric

If you’re a Data Engineer ready to step into a contract where you can quickly add value by building dependable pipelines and models, apply now to learn more.

Candidate Source Ltd is an advertising agency. Once you have submitted your application it will be passed to the third-party recruiter responsible for processing it. This will include holding and sharing your personal data; our legal basis for this is legitimate interest, subject to your declared interest in a job. Our privacy policy can be found on our website, and we can be contacted to confirm who your application has been forwarded to.
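
The validation and data-quality work described in this role can be sketched with a few plain-Python checks (illustrative only; in practice these would run as tests inside a Databricks, Synapse, or Fabric pipeline, and the column names are assumptions):

```python
def validate(rows, required, unique_key):
    """Run basic data-quality checks and return a dict of issue counts."""
    issues = {"missing_values": 0, "duplicate_keys": 0}
    seen = set()
    for row in rows:
        # any required column that is None or empty counts as a missing value
        if any(row.get(col) in (None, "") for col in required):
            issues["missing_values"] += 1
        key = row.get(unique_key)
        if key in seen:
            issues["duplicate_keys"] += 1
        seen.add(key)
    return issues
```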

Data Engineer
Meritus
London
Hybrid
Mid - Senior
£550/day - £600/day
RECENTLY POSTED

Contract Data Engineer - Azure / Databricks
Location: London (2 days onsite)
Rate: £550 - £600 per day (Inside IR35)
Contract: 6 months

A leading UK financial institution is seeking an experienced Data Engineer to support the development and enhancement of a modern cloud-based data platform. This role will focus on building scalable data pipelines and supporting the evolution of a cloud-first data architecture.

Key Responsibilities

  • Design and develop scalable data pipelines using modern cloud technologies.
  • Build and optimise distributed data processing solutions using Databricks, Spark and Python.
  • Develop and maintain data integration workflows using Azure Data Factory.
  • Work with large datasets stored in Azure Data Lake environments.
  • Collaborate with architects, analysts and engineering teams to deliver reliable and secure data solutions.
  • Contribute to improving data quality, performance and operational monitoring across the platform.

Key Skills & Experience

  • Strong experience with Azure Databricks, Azure Data Factory and Azure Data Lake.
  • Advanced Python, SQL and Spark (PySpark) development experience.
  • Experience building and optimising ETL / data pipelines in cloud environments.
  • Knowledge of CI/CD and version control (Azure DevOps, GitHub or similar).
  • Experience working with large-scale distributed data processing systems.

Contract Details

  • 6-month initial contract
  • £550 - £600 per day (Inside IR35)
  • Hybrid working: 2 days per week onsite in London

If you’re an experienced Data Engineer with strong Azure and Databricks expertise and are available for a new contract, please apply or get in touch to discuss further.

Data Scientist - UKIC DV Clearance Required
Matchtech
London
In office
Mid - Senior
Private salary
RECENTLY POSTED

Our client, a prominent entity in the Defence & Security sector, is seeking a meticulous Data Scientist with a strong understanding of Linux, Data Science, and AWS to join their team. This is a contract position located in London for a duration of 12 months, requiring a UKIC DV clearance to undertake sensitive and impactful work.

Key Responsibilities:

  • Develop and deploy data science solutions to support national security missions.
  • Build and optimise data pipelines for processing large, complex datasets.
  • Apply machine learning and statistical techniques to extract actionable insights.
  • Create clear visualisations to communicate findings to stakeholders.
  • Collaborate in agile teams to deliver robust, scalable solutions.
  • Support cloud-based deployments and integration into operational environments.

Job Requirements:

  • Proficiency with scripting languages like Python for data exploration, cleansing, and manipulation.
  • Knowledge of machine learning models and statistical techniques, including validation.
  • Understanding of data analytics and data visualisation techniques.
  • Ability to process large datasets via batch or stream processing using Apache Spark or similar tools.
  • Exposure to techniques used for acquiring and fusing data.
  • Experience with cloud platforms (preferably AWS) or implementing cloud-based data science solutions.
  • Knowledge of, or willingness to learn, DataOps.
  • Experience with structured or unstructured databases.
  • Experience with container technologies, including Docker and Kubernetes.
  • Familiarity with agile ways of working.
  • Understanding of software best practices including version control, CI/CD pipelines for automated testing, and deployment.
  • Proficiency in Linux.
  • BPSS & Current UKIC DV clearance.

Additional Details:

  • Location: 5 days per week onsite - London
  • Duration: 12 months

If you are a dedicated Data Scientist with the necessary clearances and skills, and are eager to contribute to mission-critical projects in the realm of national security, we want to hear from you. Apply now to take the next step in your career with our client.

TM1 Planning Developer
Square One Resources
England
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED

Job Title: TM1 Planning Developer
Location: Remote - Inside IR35.
Start Date: April
Job Type: Contract

We’re looking for a TM1/IBM Planning Analytics Developer to support the development and optimisation of enterprise planning solutions within a complex data environment.

You will be responsible for designing and maintaining TM1 models and cubes, developing business rules and processes, and supporting financial and operational planning workflows. The role involves working closely with finance and business stakeholders to deliver scalable, high-performance planning and reporting solutions.

This is a 3 month initial contract, remote and Inside IR35.

Key requirements:

  • Strong experience with IBM TM1/Planning Analytics
  • Development of cubes, dimensions, rules and TurboIntegrator (TI) processes
  • Experience supporting financial planning, forecasting and reporting
  • Performance optimisation and troubleshooting of TM1 models
  • Strong stakeholder engagement skills

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.

Senior KDB+ Developer
Korn Ferry
London
Hybrid
Senior
£1,000/day
RECENTLY POSTED

Rate: up to £1000 a day inside IR35

Location: 3 days at London Office

We are working with a leading global financial institution on a senior hire within their real-time market data engineering team. This role is focused on building and operating low-latency, high-performance KDB+ platforms that support mission-critical trading, analytics and monitoring use cases.

What You’ll Be Doing

  • Design, develop and maintain large-scale KDB+/q systems for real-time and historical market data
  • Build and operate tickerplants (TP), real-time processes (RTP), and HDBs, including recovery and log replay
  • Implement performant time-series data models, schemas, and APIs
  • Optimize q code for latency, throughput, and memory efficiency
  • Develop real-time and batch pipelines for tick data ingestion, normalization, and enrichment
  • Work closely with quants and stakeholders to productionise analytics and trading signals
  • Support and troubleshoot production KDB+ systems on Linux, including participation in on-call rotations

What We’re Looking For

  • Extensive hands-on experience with KDB+/q in a production environment

  • Proven experience designing or operating real-time tick data systems

  • Strong knowledge of:

    • Tickerplant architectures and recovery models
    • Time-series joins (e.g. as-of joins)
    • Attributes, iterators/adverbs, and performance internals
  • Experience building low-latency systems where performance matters

  • Strong Linux/Unix skills, including debugging running processes
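
The as-of join mentioned above attaches, to each trade, the most recent quote at or before its timestamp; here is a pure-Python sketch of those semantics (production systems would use q's `aj` over keyed tables; the times and prices are toy values):

```python
import bisect

def asof_join(trades, quotes):
    """For each trade, attach the latest quote at or before its time.

    trades: list of (time, trade_price); quotes: list of (time, quote_price),
    both sorted ascending by time. Mirrors the semantics of q's aj.
    """
    quote_times = [t for t, _ in quotes]
    joined = []
    for time, price in trades:
        i = bisect.bisect_right(quote_times, time) - 1  # last quote <= trade time
        quote = quotes[i][1] if i >= 0 else None  # None: no quote yet
        joined.append((time, price, quote))
    return joined
```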

Power BI Developer
Pontoon
Manchester
In office
Mid - Senior
Private salary
RECENTLY POSTED

Power BI Developer

Manchester

6 month contract

Inside IR35

Purpose of the Role

The role is responsible for designing high-quality Power BI datasets, developing scalable data models, and delivering insightful dashboards that support critical business processes. The position also leads the establishment of a resilient BI data foundation, integration with ServiceNow, and the uplift of AI-ready reporting capabilities across the organisation.

Key Objectives

  • Establish a governed, trusted Data Pillar with certified datasets, semantic models, and defined data lineage.
  • Develop innovative Power BI-to-ServiceNow integrations for surfacing dataset health, ownership, and automated workflows.
  • Assess and improve resilience of key BI assets, including data quality, refresh reliability, performance, and access controls.
  • Standardise tooling (templates, DAX libraries, CI/CD, documentation, testing) to support consistent and governed development.
  • Enable GenAI and Copilot Studio adoption by producing AI-ready datasets and identifying opportunities for automation, anomaly detection, and forecasting.
  • Optimise BI workflows by leveraging AI for documentation, requirements capture, DAX optimisation, and natural language querying.

Key Responsibilities

  • Design and optimise Power BI data models, schemas, relationships, and calculated fields.
  • Build high performance, scalable datasets in Power BI Service.

Report & Dashboard Development

  • Develop interactive dashboards and paginated reports with strong usability, accuracy, and visual clarity.

DAX & Data Transformation

  • Write efficient DAX measures for KPIs, time intelligence, and complex business logic.
  • Deliver robust data transformations using Power Query (M) and SQL-based pipelines.

Data Integration & Gen BI

  • Integrate Power BI with Copilot Studio and prepare models for AI agent consumption.
  • Build workflow automation using Power Automate.

Power BI Service Administration

  • Manage datasets, workspaces, refresh schedules, usage monitoring, and performance optimisation.

Stakeholder Engagement & Documentation

  • Gather business requirements and translate them into scalable BI solutions.
  • Collaborate with data engineers, SMEs, and product teams.
  • Document data lineage, business logic, and model architecture.

Essential Skills & Experience

  • Strong expertise in Power BI Desktop, Power Query (M), DAX, and Power BI Service.
  • Solid SQL skills (joins, CTEs, window functions) and experience with SQL Server.
  • Strong understanding of dimensional modelling and data warehouse principles.
  • Experience building and maintaining ETL pipelines.
  • Excellent analytical, problem-solving, and troubleshooting skills.
  • Strong communication skills, able to translate complex data concepts for non-technical audiences.

Desirable Skills

  • Experience with Atlassian tools (JIRA, Confluence).
  • Familiarity with ServiceNow GRC and related data structures.

Behaviours & Mindset

  • Customer-focused, delivering intuitive and value-driven reporting.
  • Detail-oriented, ensuring strong governance and accuracy.
  • Collaborative and responsive, working effectively across teams.
  • Proactive and improvement-driven, seeking opportunities to simplify and automate.

If you believe you have the experience required, please apply with your CV now for instant consideration!

TO APPLY - PLEASE APPLY WITH AN UP-TO-DATE CV

Candidates will ideally show evidence of the above in their CV in order to be considered.

Please be advised if you haven’t heard from us within 48 hours then unfortunately your application has not been successful on this occasion, we may however keep your details on file for any suitable future vacancies and contact you accordingly.

Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone’s chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.

We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.

Confluent Engineer
LA International Computer Consultants Ltd
London
Hybrid
Senior
£403/day
RECENTLY POSTED

Role Title: Confluent Engineer
Location: London
Duration: until 27/05/2026
Days on site: 2-3
Rate: £402.75 per day (Inside IR35)

MUST BE THROUGH UMBRELLA

Role Description:

* Role Title: Senior Software Engineer - Confluent Streaming Platform

Role Purpose
* We are seeking a Senior Software Engineer with strong hands-on experience in Confluent Platform, Apache Kafka, and Apache Flink to support the introduction and evolution of Intact’s enterprise streaming capabilities. This role sits within the Integration function, responsible for enabling real-time data, event-driven architecture, and high-performance integrations across the organisation.
* The ideal candidate will contribute to the design, development, and scaling of Intact’s Confluent-based streaming platform, supporting teams across the organisation in adopting event-driven approaches.
* This opportunity sits within a significant cloud-modernisation programme, leveraging Agile and DevOps practices to continuously deliver business value.

Key Accountabilities
* Deliver engineering tasks across the Confluent Streaming Platform, including the design, development, testing, and deployment of event-driven services and data pipelines.
* Develop Kafka topics, schemas, and streaming applications using Kafka, Kafka Connect, Schema Registry, and Flink.
* Collaborate with architects and platform teams to shape the event streaming roadmap.
* Provide subject-matter expertise in distributed streaming and event-driven architecture.
* Review streaming applications produced by other engineers, ensuring quality and best practices.
* Troubleshoot production streaming issues and conduct root-cause analysis.
* Promote platform standards, governance models, and reusable patterns.
* Collaborate with vendor teams and Confluent professional services.
* Actively participate in Agile ceremonies and technical discussions.

Customer Conduct Framework
* Understand how FCA Conduct Rules apply to this role and consistently demonstrate behaviours that support positive customer outcomes and safe data handling.

Functional/Technical Skills
* 6+ years of software engineering experience, including 3+ years hands-on with Confluent/Apache Kafka.
* Experience designing and building distributed streaming applications using Confluent Platform components.
* Strong understanding of event-driven architecture concepts (event streaming, event sourcing, Pub/Sub, stream processing).
* Experience with Avro/JSON/Protobuf, schema evolution, and Schema Registry.
* Integration experience with back-end systems such as SQL/NoSQL databases, APIs, and cloud data platforms.
* Familiarity with API design, data modelling, and microservice integration patterns.
* Proficient with Git, Jira, Azure DevOps, Docker, Kubernetes.
* Working knowledge of AWS and/or Azure.
* Strong understanding of clean code, reusable component design, and Agile/DevOps practices.
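
Schema evolution, listed above, hinges on compatibility rules; below is a deliberately simplified pure-Python check of Avro-style backward compatibility (real deployments would call Confluent Schema Registry's compatibility API; this sketch only covers added fields and type changes):

```python
def is_backward_compatible(old_schema, new_schema):
    """Simplified Avro-style BACKWARD compatibility check.

    A new reader schema can still read data written with the old schema
    if every field it adds has a default and no shared field changes
    type. Schemas are {field_name: {"type": ..., "default": ...}} dicts.
    """
    for name, spec in new_schema.items():
        if name not in old_schema:
            if "default" not in spec:
                return False  # a new required field breaks old records
        elif spec["type"] != old_schema[name]["type"]:
            return False      # type changes are treated as incompatible
    return True
```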

Decision-Making Authority
* Makes decisions on design and implementation of streaming components within agreed architecture and standards.
* Provides expert guidance impacting platform reliability, performance, and integration quality.
* Escalates risks, issues, and architectural concerns appropriately.

Please send latest CV

LA International is an HMG-approved ICT Recruitment and Project Solutions Consultancy, operating globally from the largest single site in the UK as an IT Consultancy or as an Employment Business & Agency, depending upon the precise nature of the work. For security-cleared jobs and non-clearance vacancies alike, LA International welcomes applications from all sections of the community and from people with diverse experience and backgrounds.

Award-winning LA International, winner of the Recruiter Awards for Excellence (Best IT Recruitment Company, Best Public Sector Recruitment Company and overall Gold Award winner), has now secured the most prestigious award any business can receive, The Queen's Award for Enterprise: International Trade, for the second consecutive period.

AI Engineer, Agentic AI, Python, Pycharm, LLM, Agentic
Experis
London
Hybrid
Mid - Senior
£400/day - £475/day
RECENTLY POSTED

Up to £475/day Outside IR35 | London | 2 days per week in office

We are seeking a highly skilled AI Engineer with deep expertise in Agentic AI, Large Language Models, NLP, GenAI pipelines, cloud ML platforms, and vector-based retrieval systems. This is an opportunity to join an advanced AI team building next-generation intelligent systems, multi-agent applications, and high-scale GenAI microservices. You will design, deploy, and optimise production-grade AI/ML systems powering millions of customer interactions. You will work across Python, cloud-native architectures, vector search, RAG frameworks, orchestration engines, and multi-agent systems, shaping AI capabilities that transform how organisations interact, automate, and understand their customers.

Key Responsibilities

AI / LLM / Agentic Engineering

  • Design, build, and optimise agentic AI systems using frameworks such as LangChain, LangGraph, Vertex AI Agent Builder, Bedrock Agents, AgentKit, CrewAI, and custom orchestration.
  • Build LLM-powered applications using models including GPT-4o/5, Llama 3, Claude, Gemini 2.5 Pro, Bard, and enterprise-grade LLM deployments.
  • Implement RAG and CAG architectures using Pinecone, OpenSearch, Google GenAI Search, and custom vector stores.
  • Engineer domain-tuned embeddings using ADA-002, Gecko, Word2Vec, BERT, Sentence Encoder, and topic modelling.

AI/ML Pipelines & MLOps

  • Develop scalable AI/ML microservices using Docker, Kubernetes (EKS/GKE), and CI/CD-driven automation.
  • Build and enhance pipelines for model evaluation, bias/drift detection, real-time inference, and monitoring.
  • Optimise inference latency for high-volume, near-real-time applications such as transcript and behavioural analysis.

NLP & Applied Machine Learning

  • Apply text clustering, N-gram analytics, sentiment modelling, intent classification, and summarisation for insight extraction.
  • Refine conversational intent taxonomies and behavioural models for more accurate AI assistant interactions.

Data Engineering & Cloud Integration

  • Use cloud services including SageMaker, Azure ML Studio, and Vertex AI for training, deployment, and monitoring.
  • Manage datasets using GCP Cloud Storage and implement secure, compliant data workflows.

AI Governance & Quality Assurance

  • Establish guardrails, safety layers, automated evaluation frameworks, and prompt governance patterns.
  • Ensure all AI systems meet stringent data governance, privacy, and financial-sector compliance requirements.

Technical Skills

  • Languages & development: Python, Java, SQL, shell scripting, Node.js, Streamlit
  • IDE experience: PyCharm, VS Code, JupyterLab, Eclipse, Notepad++, SageMaker Studio, Azure ML Studio, Vertex AI Workbench
  • Python libraries: NumPy, Pandas, Matplotlib, Scikit-learn, TensorFlow, Keras, PyTorch, PySpark, SpaCy, SciPy, NLTK, Statsmodels, Boto3, Azure SDK
  • NLP & LLMs: BERT, Word2Vec, Universal Sentence Encoder, NLTK, embeddings, fuzzy matching, topic modelling
  • LLM experience: GPT-3.5/4o/5, Llama 2/3, Claude, Gemini, Bedrock models, SQuAD fine-tuning, custom RAG agents
  • AI search & vector stores: Pinecone, OpenSearch, LangChain/LangGraph, LlamaIndex, Vertex AI Search, vector DBs, RAG pipelines

What We're Looking For

  • Proven experience developing production-grade LLM, GenAI, NLP, or agent-based AI systems.
  • Strong engineering foundation across Python, cloud platforms, APIs, and vector search.
  • Experience with complex multi-agent AI orchestration.
  • Ability to deliver high-scale, low-latency AI solutions in demanding environments.
  • Strong collaboration, architectural thinking, and a passion for cutting-edge AI innovation.
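
The RAG architectures mentioned in this ad retrieve the stored embeddings most similar to a query before prompting the model; here is a minimal cosine-similarity retriever in plain Python (production would use a vector store such as Pinecone or OpenSearch; the vectors are toy values):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, store, top_k=2):
    """Return the top_k (doc_id, score) pairs most similar to the query.

    store maps doc_id -> embedding vector. This is the retrieval step of
    a RAG pipeline; the matched documents would then be injected into
    the LLM prompt as grounding context.
    """
    scored = [(doc_id, cosine(query_vec, vec)) for doc_id, vec in store.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]
```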

Frequently asked questions
What types of contract Data Engineer jobs are available on Haystack?
Haystack features a wide range of contract Data Engineer positions, including short-term, long-term, remote, and on-site roles across various industries such as finance, healthcare, and technology.

How do I apply for a contract Data Engineer job on Haystack?
To apply, simply create a profile on Haystack, upload your resume, and submit applications directly through the platform to any contract Data Engineer job that matches your skills and experience.

Can I work remotely in a contract Data Engineer role?
Yes, many contract Data Engineer listings on Haystack offer remote or hybrid working options. You can filter job searches by location and remote availability to find the best fit.

What skills do contract Data Engineer roles typically require?
Typically, contract Data Engineers should have experience with big data tools (such as Hadoop and Spark), SQL, Python/Scala, ETL processes, and data warehousing solutions. Specific requirements may vary by job.

Does Haystack handle contract negotiations?
While Haystack facilitates job postings and applications, contract negotiations, including rates and terms, are generally handled between you and the hiring company or recruiter directly.