Make yourself visible and let companies apply to you.
Roles
Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Data Ingest Engineer - Python
Preservica
Abingdon
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

You are an accomplished Python developer: you know your way around APIs, have a strong handle on supporting or configuring SaaS solutions, and you thrive on customer interaction. Are you ready to make your mark on future-proof software?

We are Preservica, and our groundbreaking active digital preservation solutions are at the razor's edge, eliminating the challenges of file obsolescence, data ROT and more, and addressing the need for smart digital preservation technology. Our award-winning software is used by leading businesses, archives, libraries, museums and government organisations across the globe.

We are world leaders and proud of our achievements, but to stay ahead we need the brightest and most talented commercial and technical innovators to join our professional services team. Right now we are looking for a solid Technical Success / Integration Engineer.

About the Role

Working as part of the integration team with new customers and their IT teams across a range of commercial and government sectors, your key role will be the successful uptake and integration of their legacy data into Preservica's Active Digital Preservation solution.

You will be hands-on in the upload and ingest of large volumes of digitised and born-digital content, configuring roles and security, and integrating and mapping to catalogs and metadata standards.

Ideally with a background in either technical support or customer success within a records management or SaaS service environment, you will be familiar with real-life best-practice workflows, ingest routines and data/metadata mapping, using custom scripts and APIs. Equally, you will have sound customer-centric skills and a positive can-do attitude.

Responsibilities:

  • Lead portions of customer onboarding projects, including data ingest and metadata mapping.
  • Develop and maintain scripts (Python, PowerShell, or similar) to automate customer workflows.
  • Work with APIs to support integrations.
  • Collaborate with senior team members to troubleshoot and resolve customer issues.
  • Contribute to customer satisfaction by providing clear communication and timely support.
  • Document processes and contribute to knowledge base articles for both customers and internal teams.
  • Stay curious about emerging trends in digital archiving and SaaS integration.

Location:

This is a hybrid role, with monthly days in our Abingdon office.

Requirements

What We Look For:

  • 5 years of professional experience in a technical support, customer success, or IT role
  • Working knowledge of Python or another scripting language.
  • Familiarity with APIs, XML, JSON, or other structured data formats.
  • Understanding of metadata standards or digital archiving concepts (preferred but not required).
  • Strong analytical and troubleshooting skills.
  • Excellent communication skills and a customer-first mindset.
  • Ability to work independently and as part of a team.

What We Offer:

As our business continues to grow we believe in investing in our people and giving them the support and tools to keep us on track. As well as a competitive salary and benefits package, we offer tangible career development opportunities and dedicated training time to support professional growth.

Preservica is an equal opportunities employer.

Analytics Platform Engineer (Python, Kubernetes) - Secure Gov - Cheltenham
Curo Services
Cheltenham
Hybrid
Senior - Leader
Private salary
RECENTLY POSTED

Analytics Platform Engineer (Python, Kubernetes) - Secure Gov - Cheltenham - (RL8096)

Job Title - Analytics Platform Engineer (Principal & Senior)
Location - Cheltenham
Salary - Competitive
Benefits - Bonus and commission scheme, comprehensive benefits package including private medical and pension, flexible hybrid working, clear progression with funded training, and enhanced long-term incentives including additional leave and retention bonuses.

Work on analytics platforms that support highly sensitive, mission-critical programmes within a secure environment. This is an opportunity to build and scale modern data platforms while contributing to projects of national significance, alongside some of the strongest engineers in the sector.

The Client - We’re partnering with a leading organisation in the secure government sector to support the growth of a key programme delivering advanced data and analytics capabilities. This is a critical hire within an expanding team, focused on building and scaling platforms that underpin mission-critical solutions.

Operating at the forefront of data, cloud, and AI-driven innovation, they offer an environment where engineers can work on complex, high-impact challenges with real-world significance.

The Candidate - This would suit a candidate with a strong background in data or analytics platform engineering, who is comfortable working across both software development and infrastructure. You’ll enjoy solving complex technical challenges, working in dynamic environments, and collaborating closely with Data Scientists and MLOps teams. A pragmatic, adaptable mindset is key, along with a passion for building scalable, secure systems that enable data-driven outcomes. You should also be comfortable working in secure, highly regulated environments.

The Role - We are seeking Senior and Principal Analytics Platform Engineers to join a growing team delivering high-impact solutions within a secure environment. You will play a key role in designing, building, and evolving a modern analytics platform, supporting the full life cycle from development through to deployment and ongoing optimisation. This is a hands-on role offering exposure to a broad and evolving technology landscape. Due to the nature of the work, you will be operating within a highly secure environment with specific access requirements.

Key Duties:

  • Design, build and evolve scalable analytics and data platforms
  • Contribute across the full software development life cycle
  • Support cloud migration and data management initiatives
  • Develop, test and deploy new platform capabilities
  • Troubleshoot and enhance existing analytics services
  • Provide hands-on support to Data Scientists and MLOps teams
  • Tackle complex engineering challenges across a varied tech stack

Requirements:

  • Strong experience with Python
  • Experience with Kubernetes and Docker
  • Understanding of CI/CD pipelines (eg GitLab)
  • Exposure to data platforms, MLOps or machine learning environments

Desirable:

  • Spark or Scala
  • AWS services (eg S3)
  • Elasticsearch or graph databases
  • Vector databases/modern data tooling
  • OIDC/OAuth
  • Node.js or React

To apply for this Analytics Platform Engineer permanent job, please click the button below and submit your latest CV.
Curo Services endeavours to respond to all applications, however this may not always be possible during periods of high volume. Thank you for your patience.
Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.

Lead Dashboard Developer (Data Visualisation)
Bright Purple Resourcing
UK
Remote or hybrid
Senior
£95,000
RECENTLY POSTED

Principal Grafana Network Analytics Engineer Edinburgh, UK | Fully Remote (UK-based)
Engineering | Cyber Security
£95,000 & Benefits
Ever wanted your dashboards to actually defend the internet?
This role does exactly that.
I'm recruiting on behalf of a high-growth cyber security technology company that protects some of the world's most critical online services from large-scale DDoS and application attacks. Their software sits right at the heart of customer networks, and when it works (which it must), entire businesses stay online. They're now expanding their world-class engineering team and are looking for a Principal-level Grafana expert to lead the charge on network and security analytics.

The Opportunity

This is not a keep-the-lights-on role. You'll own and lead the design and development of sophisticated, high-performance dashboards used to visualise real-time network and security data. You'll set technical direction, mentor engineers, and build analytics that help customers instantly understand and stop complex cyber attacks. You'll be trusted to work from first principles, influence architecture, and make decisions that genuinely matter.

What You'll Be Doing

  • Leading a small, highly skilled team focused on network & security analytics
  • Designing and building advanced Grafana dashboards running in Kubernetes
  • Turning complex data into clear, insightful visualisations
  • Developing and reviewing complex queries (Grafana, Splunk, Python)
  • Mentoring engineers and shaping technical best practice
  • Balancing hands-on development with technical leadership and ownership

What We're Looking For

Essential

  • Strong experience building dashboards and analytics in Grafana
  • Proven background leading engineers in an agile environment
  • Solid understanding of Linux and AWS
  • Excellent communication skills: you can explain complex ideas simply
  • A strong technical degree (Computer Science, Maths, Statistics, Engineering, or similar)

Nice to Have

  • Knowledge of networking protocols and how the internet actually works
  • Experience with Splunk & SPL
  • SQL or similar data manipulation skills
  • Exposure to network security products
  • HTML, CSS, JavaScript
  • Data Science or Machine Learning experience

Location & Flexibility

  • Edinburgh-based engineers: hybrid working (typically 2 days in office)

  • Fully remote options available for the right candidate

  • Cutting-edge tech, complex data, and meaningful problems

  • A culture that values ownership, curiosity, and smart engineering

If you're a senior/principal engineer who loves data, networks, and building things that actually matter, this one's worth a conversation.
Bright Purple is an equal opportunities employer and we are proud to work with clients who share our values of diversity in our industry.

Data Platform Engineer Microsoft Fabric
interAct Consulting Limited
Milton Keynes
Hybrid
Mid - Senior
£50,000

We're seeking a Data Platform Engineer to design and build a modern enterprise data platform using Microsoft Fabric. You'll play a key role in architecting a secure, scalable and high-performance cloud environment that enables advanced analytics and data-driven decision making. This is a hybrid role with 1 day a week in the Milton Keynes office.

You will -

  • Design and deliver a cloud-native data platform in Microsoft Fabric.
  • Build and optimise data pipelines, lakehouse/warehouse solutions and semantic models.
  • Implement Platform as Code (Terraform) and CI/CD practices.
  • Ensure security, governance, monitoring and performance optimisation.
  • Lead incident resolution and continuous platform improvement.

Essential Skills

  • Proven experience building enterprise cloud data platforms.
  • Strong hands-on expertise in Microsoft Fabric and Microsoft SQL Server.
  • Experience with Agile delivery and modern data architecture principles.
  • Ability to support and integrate legacy and modern technologies.

An exciting opportunity to shape and deliver a strategic Microsoft Fabric data platform from the ground up.

Data Engineer
ABERTAY UNIVERSITY
Dundee
Hybrid
Mid - Senior
£38,784 - £46,048

Full Time, Permanent

Grade 7 (£38,784.49 - £46,048.78)

Abertay is a modern university with a global outlook, rooted in its local and national communities. We have made our mark with high-quality, well-directed teaching and research, and a stimulating and enriching experience for our students.

IT Services is a friendly, vibrant and fast-moving department with a focus on delivering excellent customer service and high-quality digital technology services to our staff and students.

Following a recent expansion, we are seeking to appoint a Data Engineer to join the team. This role will report to the Head of Enterprise Applications and requires significant experience in developing, implementing and supporting enterprise data solutions.

To be successful in this role, you will need:

  • Significant experience of developing, implementing, and supporting a data warehouse or real-time reporting platform in a complex enterprise environment.
  • Knowledge of data architecture, relational databases, and APIs.
  • Experience in creating and maintaining new views, tables, and schedules.
  • Experience of diagnosing performance bottlenecks, data inconsistencies, and integration issues.
  • Advanced SQL skills for complex queries, joins, indexing, partitioning, and performance tuning.
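
As a purely illustrative sketch of the advanced SQL skills listed above (joins, aggregation, indexing), here is a minimal example against an in-memory SQLite database; the schema and data are invented and are not from the advert.

```python
import sqlite3

# In-memory database with a made-up students/enrolments schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE enrolments (student_id INTEGER, module TEXT, mark INTEGER);
    -- An index on the join key, as one would add for performance tuning.
    CREATE INDEX idx_enrolments_student ON enrolments(student_id);
    INSERT INTO students VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO enrolments VALUES (1, 'Databases', 72), (1, 'Statistics', 65),
                                  (2, 'Databases', 81);
""")

# Join with aggregation: average mark per student, highest first.
rows = conn.execute("""
    SELECT s.name, AVG(e.mark) AS avg_mark
    FROM students s
    JOIN enrolments e ON e.student_id = s.id
    GROUP BY s.id
    ORDER BY avg_mark DESC
""").fetchall()
```

In a real warehouse the same pattern would apply over far larger tables, where indexing and partitioning choices dominate query performance.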

This role benefits from hybrid working arrangements.

If you believe you have the skills and experience for this exciting and challenging role, please submit your application through our online recruitment system.

Please note that we will only accept applications through our online recruitment system.

Committed to Equal Opportunities

Abertay University is a Scottish Registered Charity,

No: SC016040

Data Engineer Lead (Openshift)
Infoplus Technologies UK Ltd
Sheffield
Remote or hybrid
Senior
£450/day - £480/day

Key Responsibilities:

  • Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
  • Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
  • Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
  • Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
  • Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
  • Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
  • Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
  • Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
  • Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
  • Ensure security, compliance, and best practices for data pipelines and observability platforms.
  • Document data flows, schemas, dashboards, and operational runbooks.

Required Skills:

  • Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/KSQL/KStream).
  • Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
  • Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
  • Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
  • Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
  • Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
  • Understanding of hybrid cloud and multi-cluster telemetry patterns.
  • Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
  • Good problem-solving skills and ability to work in a collaborative team environment.
  • Strong communication and documentation skills.
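
The transformation/enrichment stage of such a pipeline can be sketched, purely illustratively, as the pure function a Kafka consumer might apply to each telemetry event before forwarding it to Splunk. All field names, the tenant mapping and the validation rules here are invented assumptions, not details from the advert.

```python
import json
from datetime import datetime, timezone

# Hypothetical required fields for an OpenShift telemetry event.
REQUIRED = {"cluster", "namespace", "metric", "value", "ts"}

def enrich(raw: bytes, tenant_map: dict) -> dict:
    """Validate a raw telemetry event and add multi-tenant routing metadata."""
    event = json.loads(raw)
    missing = REQUIRED - event.keys()
    if missing:
        # In a real consumer this would go to a dead-letter topic for replay.
        raise ValueError(f"event missing fields: {sorted(missing)}")
    # Route by namespace -> tenant for multi-tenant observability.
    event["tenant"] = tenant_map.get(event["namespace"], "unassigned")
    event["ingested_at"] = datetime.now(timezone.utc).isoformat()
    return event

raw = json.dumps({"cluster": "prod-1", "namespace": "payments",
                  "metric": "cpu_usage", "value": 0.72,
                  "ts": "2024-01-01T00:00:00Z"}).encode()
out = enrich(raw, {"payments": "team-a"})
```

Keeping the enrichment logic a pure function of the message bytes makes it easy to unit-test and to replay against backfilled data.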

Intern - Business Intelligence & Performance Reporting - (Fixed Term) - GLA14952
Glasgow
UK
In office
Graduate
£10,000/day (Negotiable)
TECH-AGNOSTIC ROLE
Job Description

Glasgow City Council’s Summer Internship Programme will be available from Monday 8 June 2026 – Friday 28 August 2026, inclusive.

Applicants must be available for the full duration of the placement.

The intern will work 35 hours per week and rate of pay will be the Glasgow Living Wage.

Interns will work for 12 weeks, during which time they will accrue 6 days leave, the payment of which is included in their Salary so must be taken during their 12-week placement.

Applicants must be available for interview between the week commencing 23 March 2026 and Thursday 2 April 2026.

The intern will support the development of enhanced Business Intelligence (BI) reporting to strengthen performance monitoring, governance and audit assurance within the Directorate.

Key Responsibilities

• Review and analyse existing BI dashboards and underlying data sources across Education Services
• Work with officers to define and agree key performance indicators (KPIs) aligned to Directorate priority committee reporting and Internal Audit requirements.
• Design and develop a consolidated BI dashboard or KPI-based performance report
• Test outputs with key stakeholders, incorporating feedback and ensuring data accuracy and usability
• Produce clear documentation and support handover to ensure outputs can be maintained and refreshed beyond the project period.

Eligibility criteria
• Must live within the Glasgow City Council boundary
• Have the right to live and work in the UK
• Be in the year of study specified in the advert

For more information, please see the attached Recruitment Outline and Person Specification, or visit our website: https://www.glasgow.gov.uk/summerinternship.

Application Packs

We want everyone to be able to apply. If you need the Application Pack in another format, like Braille, large print, or another language, please call us on 0141 287 1054.

If we need to post it to you, we’ll send it by second-class mail within three working days. Please allow enough time to complete and return your application before the closing date. If you think you might need more time because of accessibility needs, please get in touch and we’ll be happy to help.

There are also a number of Accessibility Tools compatible with the myjobscotland website which may assist you with your application. More information on these can be found at https://myjobscotland.gov.uk/accessibility-statement.

Further Information

Please note that Glasgow City Council is currently completing a Job Evaluation exercise and introducing a new pay and grading structure which may impact on current salaries quoted in job adverts; see Working for Us\Job Evaluation.

For further information about working for us, please refer to our website: GCC HR Policies.

CGEMJP00330718 Lead Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Sheffield
Hybrid
Senior
£430/day

Role Title: Lead Data Engineer

Location: Sheffield/hybrid (3 days on site)

Duration: 9 months

Rate: £430 per day, inside IR35

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.

Experience required:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.

Soft Skills (Consultant):

  • Demonstrated ability and enthusiasm for enhancing team performance.
  • Strong active listening and effective communication skills.
  • Self-mastery, with a focus on positive mindsets and professional behaviours.
  • Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
  • Engages with industry groups and external vendors to represent and advance HSBC's interests and influence.
  • Takes accountability for ensuring control and compliance throughout the engineering process.
  • Champions innovation and the adoption of advanced technologies and best practices within the domain.

If you are interested in this role or wish to apply, please feel free to submit your CV.

AI Developer
Experis
London
Hybrid
Mid - Senior
£585/day

Role: AI Developer

Location: London - up to 3 days per week on-site

Duration: 3 Months

Day rate: £540 - £585, Umbrella only

Minimum of 5 years' UK residency required

Role Description:

We’re looking for an experienced AI developer with the skills below. They’ll be building out our client’s AI capabilities using a Microsoft Azure and Copilot stack.

Essential skills and experience

  • Design and develop Microsoft Azure AI solutions using services such as AI Foundry, Azure OpenAI services, and Microsoft Copilot Studio.
  • Apply governance frameworks for AI/ML models.
  • Python knowledge to automate processes
  • Large datasets experience, including data preprocessing, feature engineering, and model evaluation.
  • Agile development of AI solution from concept to deployment to continuous improvement.
  • Create and maintain technical documentation
  • Communicate complex concepts to all stakeholders.
  • At least 3 years of experience in AI solution engineering.
  • Large Language Models experience including prompt engineering and RAG implementations.
  • Expert data analytics, MLOps practices and API development.
  • Desirable: knowledge of Docker and Kubernetes

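
As a rough illustration of the RAG experience mentioned above, here is a toy sketch of the retrieval step, reduced to a dependency-free bag-of-words cosine similarity. Real implementations would use an embedding model and a vector store; all document text here is invented.

```python
import math
from collections import Counter

def vectorise(text: str) -> Counter:
    # Naive bag-of-words stand-in for an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Invented document store standing in for a vector database.
docs = [
    "Azure OpenAI supports chat completions and embeddings",
    "Kubernetes schedules containers across a cluster",
    "Copilot Studio builds conversational agents",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = vectorise(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorise(d)), reverse=True)
    return ranked[:k]

context = retrieve("how do embeddings work in Azure OpenAI")
# In a full RAG pipeline, `context` would be prepended to the LLM prompt.
```

The "G" half of RAG is then just prompt construction: the retrieved context plus the user question, sent to the model.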
Data Analyst - Asset Optimisation
Noriker Power
Cheltenham
In office
Graduate - Junior
£35,000 - £55,000

Data Analyst Renewable Energy & Asset Optimisation

Noriker Power develops and optimises rapid response power systems. Through innovation and a strong motivation to protect the environment, Noriker has grown into a vertically integrated developer and service provider in this increasingly important market.

We are looking for a Data Analyst with a strong foundation in Mathematics or Physics to help us optimise the performance of our renewable energy portfolio. As we scale our BESS assets, we need a practical analyst who can distil domain data into commercial results.

Key Responsibilities

  • Use mathematical frameworks to predict supply depth and market pricing, identifying market opportunities
  • Model optimal usage of battery assets using linear programming, taking into account opportunity cost and the physical characteristics of the asset
  • Model GB electricity system to create forward view of system constraints
  • Understand and model dynamics of all markets BESS participate in, including wholesale electricity, Balancing Services, Balancing Mechanism etc, to create optimised strategies for assets
  • Implement progressively deeper levels of optimisation taking into account real time performance, utilisation and machine state parameters
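
The linear-programming approach mentioned above can be sketched as a toy battery-arbitrage problem. This is purely illustrative: all prices, capacities and efficiencies are invented, and scipy.optimize.linprog stands in for whatever optimiser is actually used in production.

```python
import numpy as np
from scipy.optimize import linprog

prices = np.array([50.0, 30.0, 20.0, 60.0, 80.0, 40.0])  # £/MWh per period (made up)
T = len(prices)
capacity = 10.0  # MWh of storage
power = 5.0      # MW max charge/discharge per period
eta = 0.9        # efficiency applied on charge

# Decision variables x = [charge_1..T, discharge_1..T], each in [0, power].
# Maximise revenue = prices·discharge - prices·charge; linprog minimises,
# so the objective is prices·charge - prices·discharge.
c_obj = np.concatenate([prices, -prices])

# State of charge after period t is cumsum(eta*charge - discharge),
# constrained to 0 <= soc_t <= capacity via a lower-triangular matrix.
lower_tri = np.tril(np.ones((T, T)))
A_soc = np.hstack([eta * lower_tri, -lower_tri])
A_ub = np.vstack([A_soc, -A_soc])
b_ub = np.concatenate([np.full(T, capacity), np.zeros(T)])

res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, bounds=[(0, power)] * (2 * T))
charge, discharge = res.x[:T], res.x[T:]
```

The solution charges in the cheap periods and discharges into the expensive ones, subject to the state-of-charge and power limits; real models would add opportunity cost across markets and machine-state constraints on top.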

Requirements

  • A degree in Physics, Mathematics, Theoretical Physics, or a related field where you have spent time modelling physical systems at a UK university
  • Experience in Statistical Significance and Time-Series Analysis
  • Ability to handle Vector Calculus or Linear Algebra
  • A “First Principles” approach to solving problems
  • Python (specifically libraries like NumPy, SciPy, and Pandas)
  • SQL or similar for managing large-scale time-series datasets
  • Familiarity with SCADA systems is a bonus
  • Understanding of the UK/EU Energy Market (e.g., Balancing Mechanism, Ancillary Services) is desirable
  • Must have permanent right to work in the UK without sponsorship.

Why Join Noriker Power?

  • Solve Real Problems: Your analysis doesn’t just increase clicks; it directly speeds up the global transition to Net Zero.
  • Intellectual Rigour: Work in an environment where your academic background is respected and utilised daily.
  • Career Progression: work across a wide variety of disciplines, with plentiful opportunities

PLM Data Analyst
Computer Futures - London & S.E(Permanent and Contract)
Not Specified
In office
Mid - Senior
£500/day - £800/day
TECH-AGNOSTIC ROLE

PLM Data Analyst Opportunity

Are you an experienced PLM Data Analyst with a background in aerospace and defence? Join our client’s team on a contract basis to participate in advanced projects at the forefront of the industry. This exciting opportunity involves working with innovative tools and technologies, helping to shape the future by leveraging your expertise in PLM systems.

Role Overview

As a PLM Data Analyst, you will play a key role in analysing existing CATIA V5 PLM data, such as CAD, metadata, and structures. You’ll support data mapping activities from CATIA V5 to the 3DEXPERIENCE (3DX) data model and contribute to the seamless integration of PLM object models. This role is especially suited to someone with a strong understanding of parts, products, documents, and BOMs within the ENOVIA ecosystem.

Key Skills and Responsibilities

  • CATIA V5 and 3DEXPERIENCE (3DX) expertise: Proficient in analysing and working with PLM data models to enhance system performance.
  • PLM object models: In-depth knowledge of parts, products, documents, and BOMs.
  • Data mapping: Supporting integration and alignment activities between CATIA V5 and the 3DX data model.
  • ENOVIA data handling: Expertise in managing and manipulating ENOVIA-related data structures.

Join a dynamic sector and contribute to a leading client’s innovative projects. If you’re looking for a challenging and rewarding role, apply today to bring your skills to our client’s esteemed team.

Please visit our website to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.

To find out more about Computer Futures please visit our website

Computer Futures, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy | Registered office | 8 Bishopsgate, London, EC2N 4BQ, United Kingdom | Partnership Number | OC387148 England and Wales

Senior Data Scientist
RCRTR
Birmingham
Remote or hybrid
Senior
£400/day - £450/day

£400 per day - inside IR35

Are you an expert in Natural Language Processing who thrives on building scalable, real-world AI solutions? We are seeking a hands-on Data Scientist to join a premier global credit ratings and financial information firm. You will be a key player in launching a brand-new, from-scratch analytics platform designed for elite institutional clients including corporate banks and asset managers.

The Opportunity

In this role, you will go beyond conventional boundaries to design, build, and deploy quantitative models that power advanced insights. You will collaborate with a cross-domain team of economists, political scientists, and developers to transform proprietary risk data into actionable strategic assets.

Your Impact

  • Model Innovation: Design and optimize risk models for analytics and generative AI applications using proprietary NLP data generation processes.
  • Pipeline Development: Develop and maintain robust ML and data pipelines for experimentation and deployment.
  • Insight Extraction: Prototype and test new approaches for extracting insights from structured and unstructured data.
  • Technical Translation: Explain ML/NLP model outputs and methodologies to non-technical stakeholders to drive strategic decisions.

Your Experience

  • Experience Level: 5-7+ years (More experience is welcomed).
  • Core Technicals: High expertise in Python and Machine Learning (ML).
  • NLP Expertise: 3-5 years of experience in Natural Language Processing.
  • AI Knowledge: Familiarity with LangChain and LlamaIndex. The role involves using Large Language Models (LLMs) to build data models rather than building LLMs from scratch.
  • Deployment: Must understand the deployment process and CI/CD practices to troubleshoot, though a dedicated engineering team handles the heavy lifting.

TM1 Developer - Senior Planning Analytics
Manpower UK Ltd
Lancashire
Hybrid
Senior
Private salary

TM1 Developer | Warton/Samlesbury (Hybrid, 1 day p/w onsite) | Competitive Salary + Bonus & Overtime

My client, a multinational Defence organisation, is looking for a TM1 Developer to join either their Warton or Samlesbury site, working on a hybrid basis of 1 day per week onsite.

What you’ll be doing:

  • Collaborate with stakeholders, analysts, and developers to understand systems and translate business requirements into technical designs for IBM Planning Analytics (PA) solutions
  • Develop, configure, and deploy PA solutions using coding standards and change management procedures, integrating tools like Cognos Analytics, PAW dashboards, and PASS reports
  • Work within an Agile Development framework, participate in sprints, daily stand-ups, and update the Air Data & Analytics change management database to reflect development lifecycle status
  • Conduct database queries and data analysis using DV and SQL databases, particularly for Finance tables, ensuring all development includes unit tests, test plans, and peer reviews
  • Partner with business stakeholders to transition solutions into business-as-usual operations, including providing standard operating procedures and responding to incidents within SLA timelines
  • Demonstrate understanding of release and deployment processes, including version control; mentor junior developers to enhance their skills in line with development standards
  • Lead development on complex projects, engaging with senior stakeholders to ensure successful delivery, and present sprint outcomes and progress updates
  • Act as PA Team Lead during absences and show strong competence in developing PA solutions across PAfE, PAW, and PASS platforms

Your skills and experiences:

  • Proven experience in developing IBM Planning Analytics / TM1 solutions (Essential)
  • Demonstrated ability to connect to and work with data from SQL data warehouses and within Data Virtualisation (DV) environments (Desirable)
  • Background in financial environments, with clear understanding of the impact of development work on core functions such as Finance and Resource Planning (Desirable)
  • Proficient in developing within an Agile methodology, adhering to a structured software development lifecycle, and using change management and version control tools
  • Deep business knowledge of supported functions, with the ability to align development work to business needs and maintain up-to-date stakeholder and business continuity plans
  • Able to independently manage work queues, maintain high-quality outputs, and mentor junior Planning Analytics developers while supporting the Team Lead and covering duties in their absence
  • Skilled in investigating and resolving support incidents within agreed service levels, escalating unresolved issues to appropriate team members or technology partners
  • Committed to rigorous testing and peer review of all development work before release, ensuring solutions are robust, maintainable, and aligned to customer needs

To apply for this role, please send your CV to Peter Bibby on the email address below

Data Engineer
Damia Group Ltd
Shropshire
Hybrid
Junior - Mid
£393/day

Data Engineer - Telford (2 days onsite) - £393 per day inside IR35 - 6 months

We are looking for a Data Engineer who is ideally SC Cleared, or eligible for clearance. This developer role will primarily work on Talend and Oracle RDS systems, within our existing Talend framework and patterns. Experience of ETL tooling will be needed, preferably Talend, but Pentaho/Informatica experience will be transferable. Experience working with Oracle RDS databases will also be required.

Must have:

  • Data ETL product experience - Talend preferred
  • Oracle RDS

Nice to have:

  • SQL
  • AWS
  • GenAI

*Damia Group Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept our Data Protection Policy, which can be found on our website.*

*Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job.*

*The role may require the successful candidate to undergo and be eligible for UK Security Vetting; clearance sponsorship will be provided where required. Due to the nature of the work, candidates should meet the relevant residency requirements. If applicable, Reserved Post nationality restrictions will be confirmed by the client. Damia is committed to inclusive recruitment and welcomes applicants from all backgrounds.*

*Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.*

Senior Data Engineer
Experis
Warwick
Remote or hybrid
Senior
Private salary

Role: Senior Data Engineer

Background: Leveraging data analytics to provide insights and recommendations that drive strategic decision-making, collaborating with cross-functional teams, including Finance, Accounting, Operations, HR, and others, to deliver accurate and timely financial reporting, dashboards, analytics, and data-driven insights.

Key Accountabilities

A Senior Data Engineer (Production Support) will be responsible for monitoring, maintaining, and supporting ETL processes, data pipelines, and data warehouse environments. The ideal candidate should have strong troubleshooting skills, hands-on experience with ETL tools, and the ability to quickly resolve production issues to ensure data availability, accuracy, and reliability.

  • Monitor and support daily ETL processes, data pipelines, and batch jobs to ensure timely and accurate data delivery.
  • Troubleshoot and resolve production issues, job failures, and performance bottlenecks across ETL and data warehouse systems.
  • Work closely with the data platform team to resolve data load issues.
  • Perform root cause analysis of recurring issues and implement permanent fixes.
  • Collaborate with development teams to transition projects smoothly into production and ensure operational readiness.
  • Implement and maintain monitoring, alerting, and logging solutions for proactive issue detection.
  • Ensure data quality, consistency, and availability through ongoing validation and health checks.
  • Apply best practices for production support, including incident management, change management, and problem management.
  • Work closely with business users, data analysts, and other stakeholders to resolve data-related queries.
  • Document runbooks, support procedures, and knowledge base articles to streamline production operations.
  • Continuously optimise processes for reliability, performance, and scalability in production environments.
  • Ensure compliance with data security, access controls, and audit requirements in production systems.

Day-to-Day Tasks - Senior Data Engineer (Production Support)

Production support:

  • Check system dashboards, logs, and alerts for failures or anomalies.
  • Verify data quality and integrity checks (row counts, duplicates, missing data, schema changes).
  • Review ETL/ELT job runs, data pipeline executions, and batch processes.
  • Validate data loads into staging, warehouse, and downstream systems for critical tables.
  • Monitor real-time and scheduled jobs to ensure SLAs are met.
  • Investigate and resolve production issues (job failures, data inconsistencies, performance delays).
  • Collaborate with business users to resolve data access or reporting issues.
  • Coordinate with development/engineering teams for fixes, hot patches, or re-runs of failed jobs.
  • Track and document incidents, resolutions, and preventive measures in ticketing systems (e.g., ServiceNow, Jira).
  • Participate in daily/weekly operations meetings to report status and highlight issues.
  • Hand over critical ongoing issues to on-call/offshore support (if applicable).

Minor works/maintenance:

  • Enhance existing models with additional fields as per requirements.
  • Help with deployments and initial loads during go-live.
  • Perform root cause analysis for recurring or high-severity incidents.

Proactive/preventive work:

  • Fine-tune ETL workflows and SQL queries to improve performance.
  • Implement monitoring scripts and automation to reduce manual intervention.
  • Restructure load plans to improve efficiency.
  • Review security and access controls to ensure compliance.
  • Update documentation (runbooks, troubleshooting guides, SOPs) for operational continuity.

Skills and capability requirements:

  • 6+ years of experience with ETL, data pipelines, and data warehouse production environments.
  • Strong expertise in troubleshooting ETL/ELT processes using tools such as Matillion, Informatica, ODI, or SSIS.
  • Experience with cloud-based data platforms such as Snowflake.
  • Proven ability to analyse job failures, perform root cause analysis, and implement permanent fixes.
  • Hands-on experience with monitoring, alerting, and logging tools.
  • Familiarity with incident, problem, and change management processes in ITIL-based environments.
  • Strong SQL programming and debugging skills with relational and cloud databases.
  • Experience with traditional and non-traditional forms of analytical data design (Kimball, Inmon, etc.)
  • Excellent communication skills to interact with business users, analysts, and cross-functional technical teams.

Nice to have:

  • Domain knowledge of finance data is preferred.
  • Experience with SAP systems and databases
  • Knowledge of data visualisation tools, such as Power BI or Tableau
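The daily quality and integrity checks described above (row counts, duplicates, missing values) can be sketched in plain Python; the column and key names here are hypothetical, not the client's schema:

```python
def quality_report(rows, key, required):
    """Run basic data-quality checks of the kind a production-support
    engineer might script: row count, duplicate keys, missing values."""
    seen, duplicates, missing = set(), [], []
    for row in rows:
        k = row.get(key)
        if k in seen:
            duplicates.append(k)
        seen.add(k)
        for col in required:
            if row.get(col) in (None, ""):
                missing.append((k, col))
    return {"row_count": len(rows), "duplicates": duplicates, "missing": missing}

batch = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": None},   # missing value
    {"id": 2, "amount": 250.0},  # duplicate key
]
print(quality_report(batch, key="id", required=["amount"]))
```

A real pipeline would run equivalent checks as SQL against staging tables or via the ETL tool's own validation steps; the sketch just shows what "row counts, duplicates, missing data" means concretely.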

BI Developer
Gleeson Recruitment Ltd
Wolverhampton
Hybrid
Mid
£35

BI Developer (Power BI | Azure | SQL)

Hybrid - Wolves based office 3 days per week

Are you a data-driven problem solver who loves turning complex data into clear insights? We're looking for a skilled BI Developer to join our client's growing team.

What you’ll do:

  • Build impactful dashboards and reports in Power BI
  • Develop and optimise data solutions using Azure
  • Write and maintain efficient SQL queries and data models
  • Work closely with stakeholders to translate business needs into actionable insights

What we’re looking for:

  • Strong experience with Power BI, Azure, and SQL
  • Solid understanding of data modelling and ETL processes
  • Ability to communicate insights clearly to non-technical audiences
  • A proactive, solutions-focused mindset

BI Developer - apply ASAP if interested. GleeIT

At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a disability confident employer.

By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.

Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Shropshire
Hybrid
Mid - Senior
£430/day

Job Title: Data Engineer (AEOI)
Rate: £430 per day inside IR35
Duration: 6 months
Location: Telford/hybrid (2 days onsite)

SC security clearance is required for this role.

We're hiring an ETL Developer to support a major government AEOI programme covering Pillar R7, ETR Exchange, NTJ Exchange and CRS Outbound Exchange. Due to growing demand, new teams are being stood up and existing teams expanded to deliver critical data exchange services.

Job Description:

Project - AEOI Projects - Pillar R7 / ETR Exchange / NTJ Exchange / CRS Outbound Exchange. Demand in the AEOI programme space is expected to increase, necessitating the stand-up of an additional team and the expansion of existing teams. This developer role will primarily work on Talend and Oracle RDS systems, within my client's existing Talend framework and patterns.

Experience required:

  • Experience of ETL tooling, preferably Talend, but Pentaho/Informatica experience will be transferable
  • Experience working with Oracle RDS databases
  • Data ETL product experience - Talend preferred
  • Oracle RDS

Nice to have:

  • SQL
  • AWS
  • GenAI

If you are interested in this role, please feel free to submit your CV.

Senior .Net Developer - Remote - UK Based Only
Morris Sinclair Recruitment
London
Fully remote
Senior
£70,000 - £100,000

Candidates must be based in UK and eligible to work in UK.

Remote-Friendly Software Development Opportunity in Agricultural Technology
Join an award-winning agricultural analytics software company at the forefront of sustainable technology. We’re seeking exceptional Back-End Developers who can transform data into impactful solutions that enhance food production, farm economics, and environmental resilience.

Position Highlights:

  • Remote Work
  • London based head office
  • Cutting-Edge Agricultural Technology Platform
  • Opportunity to Drive Sustainability through Innovative Software Solutions

Candidate Assessment Process:

  • LeetCode-Style Coding Challenge
  • Designed to evaluate algorithmic problem-solving skills
  • Focus on efficient, optimised solutions
  • Challenges will test:
  • Data Structures
  • Algorithm Design
  • Computational Thinking
  • Performance Optimisation

Ideal Candidate Profile:
Academic and Professional Requirements:

  • Ideally PhD graduate but minimum qualification: Degree/Masters in Computing, Computer / Data Science, or Equivalent
  • Proven Commercial Experience as a Back-End Microsoft Developer
  • Strong Academic Background with Demonstrable Achievements
  • Proven ability to excel in time-constrained, algorithmic problem-solving

Technical Expertise:

  • Advanced Proficiency in:
  • C# and .NET Framework
  • Microsoft Azure Cloud Platform
  • SQL and SQL Server
  • Complex Data-Driven Software Development
  • Strong Algorithm and Data Structure Knowledge

Preferred Background:

  • Experience in FinTech or Similar Structured Data Environments
  • Demonstrated Ability to Scale Applications for Large User Base
  • Previous success in technical coding assessments

Personal Attributes:

  • Exceptional Attention to Detail
  • Strong Collaborative and Communication Skills
  • Creative Problem-Solving Approach
  • Quick Analytical Thinking
  • Ability to perform under time pressure
  • Passion for Technological Innovation in Sustainability

What We Offer:

  • Work with Respected Professionals
  • Cutting-Edge Technology Solutions
  • Meaningful Impact on Agricultural Sustainability
  • Opportunity to Develop Market-Leading Software
  • On top of the very competitive salary, all employees are included in the company share scheme
Data Engineer
First Databank
Exeter
Hybrid
Mid - Senior
Private salary

Exeter, Devon (Hybrid 2 days per week in office)

About Us

At FDB (First Databank), we create and deliver the world's most trusted drug knowledge, enabling healthcare professionals to make critical decisions that improve patient safety, efficiency, and outcomes. Our solutions are embedded across hospitals, GP practices, pharmacies, and wider healthcare systems, supporting millions of patients every day.

Our values guide everything we do: Better Together, Clear Expectations, Constantly Curious, and Health at the Heart. If these resonate with you, you'll feel right at home with us.

The Opportunity

We are now looking for an experienced Data Engineer to join us on a full-time, permanent basis.

Working within Agile teams and collaborating with a range of experts, you'll have the chance to utilise your skills and build solutions that genuinely make a difference.

What's more, with hybrid working, a strong focus on wellbeing, an annual bonus scheme and a comprehensive benefits package, you'll have the flexibility, recognition and backing to do your best work while continuing to develop your expertise.

So, if you want to be part of building innovative solutions that support millions every day, read on and apply today!

The Role

As a Data Engineer, you will design and develop high-quality data solutions that support innovative software products aimed at improving health and environmental outcomes.

Working within Agile methodologies, you will collaborate closely with the Product Owner and a wide range of technical and subject matter experts to understand customer requirements and shape effective, scalable data components.

You will undertake requirements analysis, solution scoping, specification definition and data analysis, while challenging assumptions and defining appropriate acceptance criteria to mitigate risk.

Through the creation of production code and participation in code reviews, you will apply established design patterns and best practices to ensure performance, data quality, security, robust error handling, monitoring and logging.

Additionally, you will:

  • Perform critical assessments to inform solution scoping and risk mitigation
  • Support project management activities as required
  • Use AI environments to enhance productivity and efficiency

About You

To be considered as a Data Engineer, you will need experience using AI environments and good verbal and written communication skills, including presentation skills, as well as experience with the following:

  • Databricks and Power BI
  • Python and TSQL
  • Extract, Transform, Load (ETL)
  • Analysis and design
  • Test Automation and Refactoring
  • Unit Testing and mocking
  • Agile & Scrum development methodologies

You will also need some experience with Azure / AWS, PowerShell, Data lakes and Zoho Creator / Analytics.
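As a small illustration of the "Unit Testing and mocking" experience listed above — the function and query are hypothetical, not FDB's codebase — Python's standard library lets you test an ETL step without a live database:

```python
from unittest.mock import Mock

def load_customer_count(fetch_rows):
    """Hypothetical ETL step: count rows returned by a query function."""
    return len(fetch_rows("SELECT id FROM customers"))

# Unit test with a mock standing in for a live database connection.
fake_fetch = Mock(return_value=[(1,), (2,), (3,)])
assert load_customer_count(fake_fetch) == 3
fake_fetch.assert_called_once_with("SELECT id FROM customers")
print("mocked ETL unit test passed")
```

Injecting the data-access function as a parameter is what makes the mock possible; the same pattern applies whether the dependency is T-SQL, a Databricks job, or an HTTP API.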

The Benefits

You will be joining a very supportive team where you will have the opportunity to grow and develop new skills. In addition, FDB offers:

  • Competitive salary
  • 25 working days holiday per annum plus statutory holidays
  • Flexible option for employees to take additional holiday
  • Annual company bonus scheme
  • Health and Wellbeing allowance
  • HealthShield flexible health cash-back scheme
  • Electric Vehicle scheme
  • Enhanced pension scheme
  • Cycle to work scheme
  • Charity days
  • Full flexible working
  • Enhanced maternity/paternity schemes
  • and many more!

Other organisations may call this role Software Engineer, Data Module Developer, BI Engineer, Business Intelligence Engineer, Power BI Engineer, Python Developer, Python Programmer, R Developer, Python Engineer, or IT Data Engineer.

Data Manager
Context Recruitment
Birmingham
Hybrid
Senior
£70,000

Data Manager - Birmingham (hybrid) - £70,000 PA

Opportunity for a Data Manager to join a well-known organisation undergoing significant technology transformation. A reputable, complex organisation with numerous sites, providing services to hundreds of thousands across the country. You'll be joining at a particularly exciting time for the business.

Reporting directly to the Head of IT, you'll be responsible for establishing and leading an enterprise-wide data management capability within a regulated, operationally complex environment. This is a key role responsible for ensuring organisational data is accurate, trusted, secure and fit for operational, regulatory and strategic decision-making, spanning data strategy, governance, architecture, engineering, reporting and analytics.

Key Responsibilities:

  • Build and deliver an enterprise data strategy, aligned to business objectives and measurable outcomes
  • Establish robust data governance, ownership, standards, quality controls and prioritisation
  • Lead the development of target data architecture, including warehousing, modelling, integrations and pipelines
  • Oversee data integrity, security, availability and compliance (including GDPR / Data Protection)
  • Manage delivery through internal teams and external partners, including procurement and supplier management
  • Recruit and lead a small team (up to 3 data engineers / BI analysts) over time
  • Work closely with stakeholders to deliver timely, accurate reporting and actionable insights
  • Drive continuous improvement through data quality metrics, audits and process optimisation

Skills & Experience:

  • Strong experience in enterprise data management, governance and architecture
  • Excellent knowledge of Microsoft data platforms (Power Platform, Microsoft Fabric, Azure data technologies)
  • Confident communicator able to translate complex data concepts for senior/non-technical stakeholders
  • Experience in regulated, asset-intensive or safety-critical sectors

Salary up to £70,000 PA. The role offers excellent benefits, including free/heavily discounted public transport travel, 25 days holiday (+ bank holidays) and an excellent pension scheme.

Frequently asked questions
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.