Make yourself visible and let companies apply to you.
Roles
Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Starburst Developer
Stackstudio Digital Ltd.
Multiple locations
Hybrid
Mid - Senior
£450/day - £500/day
RECENTLY POSTED

Role/Job title: Starburst Developer
Work Location: London (2 days in a week MUFG Client Office)
The Role

Your responsibilities:

  • Design and maintain data pipelines using Starburst and related technologies.
  • Optimize query performance and resolve data processing bottlenecks.
  • Manage databases to ensure high availability, reliability and security.
  • Integrate Starburst with various data sources including cloud services and APIs.
  • Monitor data pipelines and troubleshoot issues proactively.
  • Collaborate with business users and stakeholders on data requirements.
  • Maintain comprehensive and up-to-date documentation for data processes.
  • Stay current with data engineering advancements and propose innovative solutions.
  • Implement best practices for data quality assurance and testing.

Your Profile

Essential skills/knowledge/experience:

  • 8+ years of strong hands-on experience with Starburst Enterprise or Denodo
  • Advanced SQL skills, including query optimization and performance tuning.
  • Experience with cloud data platforms (Snowflake, Databricks).
  • Proficiency in a scripting language (e.g. Python, Scala).
  • Proficiency in data analytics and data engineering.

Desirable skills/knowledge/experience:

  • Hands-on experience with Starburst or Denodo data virtualization.
  • Data integration background.
  • Database development experience in any database (Oracle, SQL Server).
  • Banking domain experience preferred.

Data Visualisation Software Engineer
Bright Purple Resourcing
UK
Remote or hybrid
Senior - Leader
£95,000
RECENTLY POSTED

Principal Software Engineer - Data Visualisation / Dashboard Development Edinburgh, UK | Fully Remote (UK-based)
Engineering | Cyber Security
£95,000 & Benefits
Ever wanted your dashboards to actually defend the internet?
This role does exactly that.
I'm recruiting on behalf of a high-growth cyber security technology company that protects some of the world's most critical online services from large-scale DDoS and application attacks. Their software sits right at the heart of customer networks, and when it works (which it must), entire businesses stay online. They're now expanding their world-class engineering team and are looking for a Principal-level expert to lead the charge on network and security analytics.
The Opportunity

This is not a keep-the-lights-on role. You'll own and lead the design and development of sophisticated, high-performance dashboards used to visualise real-time network and security data. You'll set technical direction, mentor engineers, and build analytics that help customers instantly understand and stop complex cyber attacks. You'll be trusted to work from first principles, influence architecture, and make decisions that genuinely matter.

What You'll Be Doing

  • Leading a small, highly skilled team focused on network & security analytics
  • Designing and building advanced Grafana dashboards running in Kubernetes
  • Turning complex data into clear, insightful visualisations
  • Developing and reviewing complex queries (Grafana, Splunk, Python)
  • Mentoring engineers and shaping technical best practice
  • Balancing hands-on development with technical leadership and ownership

What We're Looking For

Essential

  • Strong experience building dashboards and analytics in Python and ideally Grafana
  • Proven background leading engineers in an agile environment
  • Solid understanding of Linux and AWS
  • Excellent communication skills: you can explain complex ideas simply
  • A strong technical degree (Computer Science, Maths, Statistics, Engineering, or similar)

Nice to Have

  • Knowledge of networking protocols and how the internet actually works
  • Experience with Splunk & SPL
  • SQL or similar data manipulation skills
  • Exposure to network security products
  • HTML, CSS, JavaScript
  • Data Science or Machine Learning experience

Location & Flexibility

  • Edinburgh-based engineers: hybrid working (typically 2 days in office)

  • Fully remote options available for the right candidate

  • Cutting-edge tech, complex data, and meaningful problems

  • A culture that values ownership, curiosity, and smart engineering

If you're a senior/principal engineer who loves data, networks, and building things that actually matter, this one's worth a conversation.
Bright Purple is an equal opportunities employer and we are proud to work with clients who share our values of diversity in our industry.

BI and Data Engineering Lead
The Wilf Ward Family Trust
York
In office
Senior
£40,000
RECENTLY POSTED

Salary: £40,000 - £45,000 per annum
Reporting to: Digital Delivery Manager
Location: Trust sites (with travel required)
Organisation: The Wilf Ward Family Trust

At The Wilf Ward Family Trust, we're on an ambitious digital transformation journey. A key part of this is changing how we use data: moving away from manual, spreadsheet-based reporting to modern, automated, and trusted business intelligence.
We're looking for a BI and Data Engineering Lead to establish and lead our central reporting capability. This is a unique opportunity to make a genuine impact in a values-driven organisation. You'll design and deliver robust BI solutions, integrate data from multiple systems, and create intuitive dashboards that help colleagues across the Trust make better, data-informed decisions, ultimately supporting our mission of enabling extraordinary lives through outstanding support.

What you'll be doing
As our BI and Data Engineering Lead, you will:

  • Design, build, and maintain cost-effective, automated reporting solutions.
  • Develop data models integrating multiple data sources including data from APIs, Excel, CSV, and JSON.
  • Work with the organisation to identify data quality issues and improve quality, integrity, and consistency of data.
  • Establish and document data flows, reporting processes, and technical solutions.
  • Champion a data-driven culture and promote best practice in reporting and analysis.
  • Contribute to data governance, standards, and the development of a trusted single source of truth.

What we're looking for
We're looking for a specialist who combines strong technical capability with a collaborative and curious mindset.

Essential experience and skills:

  • Proven experience building BI solutions using Power BI, including data modelling, DAX, and dashboards
  • Understanding of dimensional data modelling
  • Experience integrating data from multiple sources (APIs, Excel, CSV, JSON)
  • Ability to translate business needs into effective technical solutions
  • Strong data analysis and problem-solving skills
  • Experience designing user-friendly dashboards for non-technical audiences
  • Ability to work independently and manage your own priorities
  • Excellent communication skills: able to explain technical concepts clearly
  • Experience documenting data processes and delivering user guidance or training
  • Confidence using Microsoft 365 tools including SharePoint, Teams, and OneDrive
  • Full UK driving licence and willingness to travel between Trust sites

Desirable:

  • Knowledge of data governance and data quality frameworks
  • Familiarity with modern data platforms (e.g., Microsoft Fabric, Data Factory, Databricks)
  • Experience supporting digital transformation initiatives
  • Experience with Python or R for data analysis or transformation
  • Experience building deployment pipelines in Microsoft 365 environment

We recognise the importance of recruiting skilled, compassionate, and reliable staff, whilst demonstrating in practice our commitment to inclusion, safeguarding and promoting the welfare of adults at risk. Please be aware it's a criminal offence for people who are barred from working in Regulated Activity (under the Safeguarding and Vulnerable Groups Act 2006) to apply for roles that require them to work unsupervised with that particular group, such as adults or children. Please note any successful job offer will be conditional on pre-employment checks such as DBS and referencing prior to an agreed start date. The Wilf Ward Family Trust has a clear commitment to safeguarding within all practices; for more information on criminal disclosures, see the government guidance 'Check if you need to tell someone about your criminal record'.

Data Platform Engineer
HAYS
Milton Keynes
Hybrid
Junior - Mid
£50,000
RECENTLY POSTED

Your new company
You will be working for a large, well-known organisation that is a powerhouse within its industry.
Your new role
Our client is looking for a Data Platform Engineer to support the design, operation and improvement of their cloud-based data platforms. You will work within a team of technical specialists, contributing to platform reliability, capacity, configuration management and continuous improvement.
Key responsibilities:

  • Support the development and maintenance of the organisation’s data platform.
  • Manage platform stability, availability and incident resolution.
  • Perform root cause analysis and contribute to ongoing service improvements.
  • Participate in out-of-hours support where required.
  • Assist with proactive monitoring, capacity management and escalations.
  • Collaborate with technical teams and third-party providers.
  • Contribute to project delivery and platform enhancements.

What you’ll need to succeed

  • Experience in similar technical engineering or platform roles.
  • Strong background in incident, request, change and problem management.
  • Experience with cloud technologies, including Azure.
  • Knowledge of Microsoft Fabric is essential.
  • Experience managing SQL Server.
  • Comfortable working with both legacy and modern technologies.
  • Strong attention to detail and analytical thinking.
  • Experience with GitHub, GitHub Actions and Terraform is desirable.
  • Ability to influence stakeholders at multiple levels.
  • Experience with cloud cost monitoring and reporting.

What you’ll get in return
A permanent role paying up to £50,000 per annum + benefits, with hybrid working on offer; the role is based in Milton Keynes.
What you need to do now

If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays EA is a trading division of Hays Specialist Recruitment Limited and acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk

Software Simulation Engineer
Quest Global Engineering Limited
Redditch
In office
Mid - Senior
Private salary
RECENTLY POSTED

Redditch, UK

12 months +

Work Experience

  • Bachelor's or Master's degree in Computer Science or Industrial Engineering
  • Experience: Minimum 3–7 years of relevant experience in supply chain simulation, industrial engineering, or discrete event simulation.
  • Simulation Software/Tools: Proficiency in specialized software such as FlexSim, AnyLogic, ANSYS (FEA/CFD), MATLAB/Simulink (dynamic systems), or COMSOL.
  • Programming: Strong scripting skills (Python, R) for data analysis and automating simulation tasks.
  • Programming Languages: Proficiency in Python (scientific computing), C++ (high-performance tasks), and MATLAB/R for data analysis and mathematical modeling.
  • Domain Knowledge: Solid understanding of warehousing automation technologies (AGVs, sorters, AS/RS).
  • WMS Knowledge: Familiarity with WMS systems.
  • Simulation Tools: Experience with industry-specific software such as ANSYS (FEA/CFD), MATLAB/Simulink (dynamic systems), COMSOL, or AnyLogic.
  • Experience with CAD tools (e.g., AutoCAD) for layout creation.

Job Requirements

  • The Simulation Engineer will develop, validate, and analyze discrete-event simulation models of warehouse facilities, incorporating automation, conveyors, and WMS software. The goal is to identify bottlenecks, improve efficiency, and validate operational scenarios before implementation.
  • Modeling & Simulation: Design and build 3D simulation models of distribution centers and warehouse logistics using software like FlexSim, Simio, or AnyLogic.
  • Data Analysis & Validation: Analyze operational data (e.g., order profiles, inventory levels, stock audits) to validate simulation models, ensuring they accurately represent real-world operations.
  • Process Optimization: Conduct experiments to identify bottlenecks, test “what-if” scenarios, and optimize resource requirements (staffing, automated equipment).
  • WMS Integration: Simulate interactions between physical equipment (AGVs, ASRS, conveyors) and warehouse software systems (e.g., SAP EWM, Manhattan Associates).
  • Documentation & Reporting: Create detailed technical reports and visualizations (Tableau, R) to present findings to stakeholders and support data-driven decision-making.
  • Cross-Functional Collaboration: Collaborate with engineering and operations teams to integrate simulation results into final warehouse designs.

Data Platform Engineer
Morson Edge
Preston
In office
Mid - Senior
£64/hour - £74/hour
RECENTLY POSTED

Data Platform Engineer – Warton – SC Cleared – 12 Month Contract

About Your Role:

As a Data Platform Engineer in a highly regulated environment, you will be responsible for designing, building, and maintaining secure and scalable data infrastructure that supports both cloud and on-premises platforms.
You will play a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams.
A strong command of Apache NiFi is essential for this role.
You will design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation, and delivery.
You should be adept at identifying and resolving issues within NiFi flows, managing performance bottlenecks, and implementing robust error handling strategies.
You’ll work closely with cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms.

About You:

As an experienced Data Platform Engineer, your skills and experience may include:

• Strong experience of Apache NiFi
• Experience designing and optimizing data flows for batch, real-time streaming, and event-driven architectures.
• Ability to identify and resolve flow issues, optimize performance, and implement error-handling strategies.
• Optional scripting skills for creating custom NiFi processors.
• Knowledge of data modelling, replication, and query optimization.
• Hands-on experience with SQL and NoSQL databases is desirable.
• Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial.
• Data Platform Management
• Comfortable operating in hybrid environments (cloud and on-prem).
• Experience integrating diverse data sources and systems.

Senior AI Engineer
Copa Resourcing Group LTD
City of London
Remote or hybrid
Senior
£100,000
RECENTLY POSTED

Senior AI Engineer / Developer

Seeking an experienced AI Engineer / Developer to design and deliver production-grade AI systems using Large Language Models (LLMs).

This is a hands-on senior role focused on building scalable, enterprise-ready AI solutions.

Key Responsibilities

Design and implement LLM-powered applications

Build and optimise RAG (Retrieval Augmented Generation) pipelines

Develop AI agents with tool integration and multi-step workflows

Engineer scalable Python-based services and APIs

Implement semantic search using vector databases

Deploy and manage AI solutions within cloud environments (Azure preferred)

Contribute to best practices across evaluation, monitoring and reliability

Provide technical guidance within a growing AI function

Required Experience

Strong commercial experience building LLM-driven systems

Hands-on RAG architecture implementation

Solid Python software engineering background

Experience with frameworks such as LangChain, LlamaIndex or similar

Exposure to vector databases and embedding workflows

Cloud deployment experience (Azure advantageous)

Experience delivering production systems at scale

Role Type

Permanent and contract options available.

Location

Remote (with occasional travel to office).

Salary

Permanent roles: £100k+ (potentially more for specialist experience).

Contract: Day rates


Data Engineer
Anson McCade
Stevenage
In office
Mid - Senior
£55,000

We are seeking a highly skilled Senior Data Engineer to join our systems engineering team. In this role, you will be at the forefront of innovation, designing and maintaining the robust data architectures that power mission-critical AI and NLP initiatives.

Since we operate in a highly regulated and secure environment, you will focus heavily on on-premise infrastructure, ensuring our Generative AI capabilities are powerful, private, and resilient.

Key Responsibilities

  • Architect & Build: Design, develop, and optimize scalable data pipelines (ETL/ELT) within air-gapped or on-premise data centers.
  • AI Integration: Engineer data structures specifically for Natural Language Processing (NLP) and Large Language Models (LLMs), including local vector databases and private model hosting.
  • Infrastructure Management: Manage and scale on-premise big data clusters, ensuring high availability without reliance on public cloud providers.
  • Data Governance: Maintain rigorous data quality and security standards, crucial for sensitive engineering, while managing complex datasets from disparate sources.
  • Collaboration: Work alongside Data Scientists to transition GenAI prototypes into production-ready, locally-hosted solutions.

Technical Requirements

Languages

Expert-level Python and advanced SQL.

Data Engineering

Experience with ETL/ELT and orchestration tools like Apache Airflow or NiFi.

On-Prem Tech

Proficiency with Hadoop/HDFS, Spark, and containerization via Docker/Kubernetes (K3s/OpenShift).

AI/ML

Practical experience with NLP (HuggingFace) and GenAI frameworks (LangChain) tailored for local execution.

Databases

Experience with PostgreSQL and on-prem Vector DBs (e.g., Milvus, Qdrant, or pgvector).

Security

Experience working within Linux-based secure environments and air-gapped networking.

Essential Qualifications

  • Education: A degree in Computer Science, Data Engineering, Mathematics, or a related technical field.
  • Security Clearance: Must be eligible for high-level security clearance (SC or DV level).
  • Technical Rigor: A "security-first" mindset with the ability to troubleshoot complex hardware/software interactions on-site.

GCP FinOps Engineer
Stackstudio Digital Ltd.
Newport
In office
Mid - Senior
£400/day - £450/day

Job Details

Job Title: GCP FinOps Engineer
Location: Newport, UK

Key Responsibilities (individual contributor role)

  • Optimise large scale data analytics workloads through partitioning, clustering, query rewrites, storage format improvements, and lifecycle policies.
  • Tune containerised microservices by recalibrating CPU/memory requests, improving autoscaling efficiency, and restructuring workload placement on cost efficient compute.
  • Redesign workflow orchestration pipelines for parallel execution, increased concurrency, and offloading heavy tasks to lower cost execution environments.
  • Analyse distributed data processing pipelines to right size worker types, adjust scaling thresholds, and adopt low cost compute for batch workloads.
  • Reduce log processing and storage overhead through log level standardisation, routing rules, exclusion filters, and retention optimisation.
  • Implement storage tiering strategies based on access patterns and enforce lifecycle rules to minimise cold data retention costs.
  • Improve relational database performance through index tuning, connection optimisation, and instance right sizing.
  • Enhance horizontally scalable database performance via autoscaling policies, index improvements, and mitigation of read/write hotspots.
  • Build dashboards, budgets, alerts, and guardrails to drive ongoing cost governance and financial accountability.
  • Collaborate with engineering teams to embed cost efficient architecture patterns and operational best practices.

Key Skills / Knowledge

  • 5+ years of hands-on experience in Google Cloud
  • Strong understanding of GCP Data services (indexing, slots, pruning, partitioning, clustering)
  • Expert-level Kubernetes & GKE resource tuning
  • Hands-on experience with Dataflow job pipelines and worker optimisation
  • Strong Airflow/Composer knowledge (DAG design, scheduling, PodOperator)
  • Strong Dataflow processing pipelines development & schedulers knowledge
  • Deep understanding of Cloud Logging routing, sinks, exclusion filters
  • Experience with Cloud Spanner autoscaling, indexing, schema optimisation
  • Cloud SQL performance tuning and indexing
  • Ability to analyse billing data & resource consumption
  • Experience using GCP Cost Explorer, Recommender API, Billing Export
  • Ability to quantify cost savings and present ROI to leadership
  • Build dashboards, alerts, and budget guardrails
  • Strong communication and stakeholder management
  • Ability to collaborate across engineering, data, and product teams
  • Structured problem-solving mindset
  • Ownership-driven, proactive and independent

Experience Required

  • 5+ years' experience in Google Cloud

Data Engineer | Outside IR35 | £400 - £500 | 6 months | Hybrid Nottingham
Opus Recruitment Solutions
Nottingham
Hybrid
Mid - Senior
£400/day - £500/day

We're looking for a highly skilled Data Engineer to join a growing data team supporting a large-scale modern data platform project. Working 3 days a week onsite on the outskirts of Nottingham, you'll be a key contributor in evolving the organisation's data capability, focusing on best-practice engineering, clean architecture, and high-value BI delivery.

Key Responsibilities

  • Design, build and optimise ETL/ELT pipelines using Azure Data Factory and Databricks.
  • Develop scalable data models and transformations using SQL and Python.
  • Work hands-on with Databricks (Lakehouse, Delta tables, notebooks, workflows).
  • Deliver high-quality dashboards and reporting solutions using Power BI.
  • Implement best practices for data quality, governance, lineage and automation.
  • Collaborate with cross-functional teams including analysts, product owners and business stakeholders.
  • Support performance tuning, cost optimisation and reliability improvements across the data estate.
  • Document pipelines, models and processes to ensure smooth knowledge transfer.

Technical Skills Required

  • Databricks – notebooks, Delta Lake, Spark (PySpark desirable)
  • Power BI – data modelling, DAX, dashboard/report development
  • SQL – advanced querying, performance optimisation, data modelling
  • Python – scripting, transformation logic, automation
  • Azure Data Factory – pipelines, triggers, mapping data flows
  • Understanding of data warehousing / lakehouse principles
  • Experience working in cloud-based data ecosystems (Azure)
  • Strong appreciation of data quality, governance and best practices

If this role suits your skillset, you can work onsite 3 days per week, and you are immediately available, please apply for the job advert directly or reach out to me at (url removed).

Senior Data Engineer - Azure & Snowflake
EMBS Engineering
London
Hybrid
Senior
Private salary

Location: Central London - 3–4 days onsite each week

Salary: Negotiable + benefits

We are supporting an enterprise-level client who is investing heavily in a modern cloud data platform that will sit at the centre of its data strategy. This programme will enable more advanced analytics, reporting and insight across multiple business functions.

We are looking to appoint three experienced Senior Data Engineers with strong Azure and Snowflake expertise.

The Role

This is a senior, hands-on engineering position within a high-performing data team. You will play a key role in shaping, developing and enhancing a large-scale Azure-based data platform, ensuring it is scalable, reliable and built to enterprise standards.

The position requires regular collaboration with stakeholders and an onsite presence in Central London 3–4 days per week, so this is not a fully remote role.

What You Will Be Doing

* Building and enhancing scalable data pipelines using Azure and Snowflake

* Developing and improving ETL / ELT processes across batch and micro-batch workloads

* Working extensively with Azure Data Factory, Azure SQL, Azure Storage and Azure Functions

* Designing and maintaining data warehouse structures including star and snowflake schemas

* Applying recognised data warehousing approaches such as Kimball and Inmon

* Writing and optimising complex SQL queries to support analytics and reporting

* Ensuring strong data governance, quality, validation and reconciliation processes

* Partnering with BI teams to enable effective reporting solutions

* Contributing to architectural decisions around performance, scalability and infrastructure

* Identifying and resolving issues to improve platform reliability and efficiency

What We Are Looking For

* 7+ years in software engineering or development

* 5+ years working within data-focused environments

* At least 2 years hands-on experience with Azure cloud data platforms

* Strong expertise across the Azure Data Platform including Data Factory, SQL, Storage and Functions

* Proven experience in SQL development and data modelling

* Experience building both periodic batch and micro-batch data pipelines

* Solid understanding of enterprise data warehouse design and loading strategies

* A minimum of 1 year hands-on experience with Snowflake

* Experience working with large-scale enterprise datasets

* Strong analytical mindset with a clear focus on data integrity and performance

Desirable Experience

* Advanced Snowflake performance tuning and optimisation

* Python and or Databricks exposure

* Experience designing full end-to-end data platform architectures

* Background supporting enterprise BI ecosystems

* Familiarity with CI/CD pipelines and infrastructure-as-code practices

Additional Details

* Visa candidates will be considered

* Salary is open and negotiable depending on experience

* Immediate requirement

If you are an experienced Senior Data Engineer with strong Azure and Snowflake expertise and are comfortable with a London-based hybrid working model, we'd love to hear from you.

Data Engineer
Bowerford Associates Limited
Exeter
Hybrid
Mid - Senior
£45,000 - £52,500

I am searching for a Data Engineer for an exciting and growing technology-focused business based in Exeter. The role requires you in the office 2 days per week, so you will need to live within commutable distance of Exeter, or be in a position to relocate to the area.

In this position you will follow agile methodologies for the design, development and acceptance of the data components for complex software solutions. Working closely with the Product Owner, you will gain a good understanding of customer requirements and knowledge of implementation processes to help solution scoping. You will be responsible for requirements analysis, specification definition, data analysis and project management, as required, to meet the needs of each solution. You will create production code and perform code reviews with the team, and you will be equally comfortable working alone or in pairs (pair programming). I am looking to speak with candidates who use design patterns and adopt best practices, and who take responsibility for ensuring high-quality coding and development in their work.

To be a success in this role you will need to be skilled in a mixture of the following:

  • Databricks
  • Power BI
  • Python
  • TSQL
  • Extract Transform Load (ETL)
  • Analysis and design
  • Test Automation
  • Refactoring
  • Unit Testing (Mocking)
  • Agile
  • Scrum

Any experience working with PowerShell, Azure, AWS, Data Lakes or Zoho is highly desirable but is NOT essential. Experience of using AI environments to enhance productivity and efficiency through intelligent task management (e.g. Copilot, ChatGPT) is also desirable. I am looking to speak with good communicators who like to work collaboratively within a diverse range of technical experts: this is a highly effective technology team.
The role comes with a competitive salary and an outstanding benefits package which includes an enhanced pension, medical and healthcare, a bonus, a good holiday allowance and much, much more! Please note, to be considered for this role you must have the Right to Work in the UK long-term without company sponsorship. Our customer is not able to sponsor candidates for this opportunity.

KEYWORDS: Data Engineer, Databricks, Power BI, Python, TSQL, Extract Transform Load (ETL), Analysis and design, Test Automation, Refactoring, Unit Testing (Mocking), Agile, Scrum, PowerShell, Azure, AWS, Data Lakes, Zoho

Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. Bowerford Associates Ltd is acting as an Employment Agency in relation to this vacancy.

Power BI Data Analyst
Hays Technology
Sheffield
Hybrid
Mid - Senior
£45,000 - £50,000

Sheffield City Centre & home working (2 days per week)
Up to £48,000 + Bonus + Free Parking + Other Benefits

Your new role
As a Power BI / Data Analyst you will help deliver the strategic vision with its subsidiaries. The role is to provide insight and data to our internal companies, whether this be high-level interactive performance indicators and dashboards, or detailed data extracts using reports and Power BI. Your expertise will be invaluable in delivering new ways of accessing our data, understanding trends and visual reporting to better inform all levels of our people.

Responsibilities

  • Be proactive in identifying issues and actioning change.
  • Lead the development of new Power BI reports and the improvement of existing reports, applying the latest methods and best practices.
  • Manage workspaces and settings within the Power BI Service.
  • Share specialist knowledge on Power BI and related topics with members of the IT team.
  • Ensure the correct security permissions are assigned to authorised users and are regularly reviewed.
  • Work with users at all levels within the organisation to gather, understand and document reporting requirements.
  • Provide training, documentation and support to relevant departments when delivering solutions.
  • Assist with the development and maintenance of simple apps and workflows within the Power Platform, and show a willingness to advance this knowledge over time.
  • Articulate the capabilities and limitations of Power BI clearly to personnel at all levels of the organisation.
  • Work with data owners to investigate data accuracy and validity in various data-related projects.
  • Work with teams across the organisation to assist in data collection and analysis, in line with relevant legislation such as GDPR.
  • Evaluate user needs and system functionality for reporting purposes.

Experience needed

  • Proven track record of consolidating data from multiple sources into a single report or a group of reports.
  • Ability to 'tell a story' with one or multiple sets of data, presenting this appropriately for the intended audience.
  • Ability to develop Power BI reports/dashboards and publish these in Power BI Service.
  • Good understanding of Power Apps (model-driven and canvas), Power Fx and Power Automate development.
  • Ability to provide support and documentation to end users.
  • An understanding of creating reports from Dataverse, with particular emphasis on D365 data.
  • Experience administering reports and workspaces in Power BI Service.
  • Experience using SQL, Power Query and Data Analysis Expressions (DAX).
  • Proven track record in delivering application-based reporting solutions (dashboards).
  • Experience in using Azure DevOps, JIRA or similar tools.
  • A passion for Business Intelligence and Data and how this can add value.
  • Demonstrable expertise in data handling and how this can add value.
  • Understanding of the data warehouse lifecycle, such as ETL.
  • Demonstrable experience in managing and reporting from large and small data sets.
  • Good interpersonal skills and the ability to work effectively in a team.
  • The ability to design data structures to support reporting needs.
  • Able to explain complex information to lay audiences.

Desirable

  • Microsoft Certified: Power BI Data Analyst Associate (PL-300)
  • Experience working with commercial data.
  • Experience in running successful data visualisation projects.
  • Knowledge of Microsoft Fabric or data warehousing.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed)

Group Data Manager
JMG Group
Yorkshire
Remote or hybrid
Senior - Leader
Private salary

Location: Home based / Remote
Department: JMG Group
Job Type: Full time
Contract Type: Permanent

JMG Group is a Private Equity backed insurance broking group based in Guiseley, West Yorkshire, with strong regional office representation around the UK. We are a top 30 broker, growing rapidly having completed numerous acquisitions since our MBO in 2020. We place over £350m of Gross Written Premium into the market annually and our teams are very well respected in the market. Customer excellence is the backbone of our business, which means that our people, systems, and processes are central to our success.

The Role

We’re accelerating our data strategy to support sustained organic and acquisitive growth. As Data Manager, you will own the group’s data platform, governance, and analytics enablement, ensuring reliable, secure, and timely management information (MI) and unlocking value with SQL, Power BI, and applied AI. The role blends hands-on technical leadership with stakeholder engagement across the Group.

Our environment

You’ll work closely with colleagues across the IT team and wider Group functions; the role is hands-on and outcomes-focused, supporting leadership with consistent, actionable insights as we scale.

Key Responsibilities

  • Data platform ownership: Lead the design, operation, and evolution of our SQL Server data warehouse. Establish best practice standards for schema design and performance tuning in SQL Server/SSMS.
  • ETL/ELT & integration: Define and manage pipelines (e.g., SSIS & Power Query) that consolidate sources including core broking systems (e.g., Acturis) and newly acquired businesses; deliver prompt, accurate, and comprehensive integration into group reporting.
  • Power BI enablement: Own datasets, semantic models, DAX development, row-level security, gateway/configuration, and performance. Partner with BI Developers/Analysts to deliver automated, actionable dashboards for leadership and frontline teams.
  • AI for analytics: Introduce and steward responsible AI use to enrich insights and automate MI.
  • Data governance: Implement policies and controls for data quality, develop and maintain step-by-step documentation of key weekly and monthly processes, embed validation and exception reporting so errors are identified and resolved promptly.
  • Stakeholder management: Build strong relationships across the Group, including Finance, Compliance, Operations, Insurer Relations and business units; translate requirements into robust data solutions and ensure MI is understood and used.
  • Team leadership: Mentor, develop and lead a small team (BI Developers/Analysts) to deliver the data strategy.
  • Acquisition reviews: Identify potential synergies.
  • Roadmap & delivery: Maintain a roadmap covering platform improvements, AI use-cases, new acquisitions integration, and reporting simplification; track benefits and adoption.

Key Skills Required

  • Advanced SQL: stored procedures, views, window functions, temp tables, CTEs, query optimisation; confident with advanced SSMS query writing.
  • Significant experience delivering Power BI models and dashboards (DAX, Power Query, RLS, performance).
  • Advanced Excel: complex formulas, pivot tables, Power Pivot.
  • ETL/ELT tooling (SSIS/Power Query) and dimensional modelling (Star Schema).
  • Practical AI/ML exposure (e.g., Python notebooks, Azure ML, or equivalent) and the ability to operationalise models responsibly within reporting workflows.
  • Proven track record integrating disparate systems post-acquisition and delivering reliable and consistent group MI.
  • Excellent communication at all levels across the business from entry level to director.
  • Stakeholder engagement, and the ability to translate complex data into clear business actions.
  • Team management experience: Confident in leading, managing and developing a small team of analysts.
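For candidates weighing up the SQL requirements above, here is a minimal, illustrative sketch of a CTE combined with a window function, run through Python's built-in sqlite3 (the premiums table and its figures are invented for the example, not taken from the role):

```python
import sqlite3

# In-memory database with a small, made-up premiums table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE premiums (office TEXT, month TEXT, gwp REAL);
    INSERT INTO premiums VALUES
        ('Leeds', '2024-01', 120.0),
        ('Leeds', '2024-02', 150.0),
        ('York',  '2024-01',  90.0),
        ('York',  '2024-02', 110.0);
""")

# A CTE aggregates per office, then a window function ranks the totals
query = """
    WITH totals AS (
        SELECT office, SUM(gwp) AS total_gwp
        FROM premiums
        GROUP BY office
    )
    SELECT office,
           total_gwp,
           RANK() OVER (ORDER BY total_gwp DESC) AS gwp_rank
    FROM totals
    ORDER BY gwp_rank
"""
for office, total, rank in conn.execute(query):
    print(office, total, rank)
```

The same pattern (aggregate in a CTE, rank with a window function) carries over directly to SQL Server, which the role names as its platform.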

Nice-to-have skills:

  • Experience with Acturis data extraction and automation of broking MI.
  • Microsoft Fabric/Azure experience.
  • Implementation, management and usage of API pipelines for reporting.
  • Experience of working in the insurance industry.
  • SSIS administration; Power BI governance at tenant level.
  • Project management experience.

Qualifications

  • Degree in a quantitative field (or equivalent experience).
  • Microsoft certifications (DP-203, PL-300) advantageous.
  • Evidence of continuous learning in data engineering, BI, and AI.

What we offer

  • Competitive salary commensurate with level of experience
  • Hybrid working options considered following training & probation completion
  • Free parking
  • Company Pension scheme
  • Generous holiday entitlement & Birthday off
  • Death in service scheme


Senior Data Engineer
VIQU IT
Bolton
Hybrid
Senior
£65,000 - £68,500

Lancashire Permanent Hybrid
Competitive Salary

VIQU have partnered with a leading organisation seeking a Senior Data Engineer to join their Data and Platform Engineering team during an exciting period of cloud and data platform transformation. In this hands-on role, you will design, build, and deliver modern data platforms within a cloud-first, Data Mesh environment, work closely with product managers, architects, and engineers, take ownership of your projects, and mentor junior colleagues, making a real impact on both the technology and the team.

Key Responsibilities:

• Lead the design, development, and delivery of cloud-based data platforms and data products as a Senior Data Engineer.
• Own the full data product lifecycle, from initial design through to decommissioning.
• Build and maintain robust ETL / ELT pipelines using SQL, Python, and modern tooling.
• Collaborate closely with product managers, architects, and engineers to solve complex technical and business challenges.
• Act as the go-to technical expert for junior engineers, providing mentorship, code reviews, and quality assurance.
• Produce clear, well-documented solutions for both technical and non-technical audiences.
• Support CI/CD, environment control (dev/test/prod), and effective change management practices.
• Contribute to cloud platform development, with a strong preference for GCP (BigQuery), within a Data Mesh architecture.

Key Requirements:

• 5+ years' experience as a Data Engineer with a strong focus on ETL / ELT.
• Advanced SQL and Python development skills.
• Hands-on experience with dbt, Git, Terraform, Docker, IAM, and Airflow (Composer).
• Proven experience working on cloud platforms, ideally GCP (BigQuery); Azure or AWS also considered.
• Strong understanding of Data Mesh, Test Driven Design, and Agile delivery.
• Experience with documentation, CI/CD pipelines, and multi-environment controls.
• Excellent communication skills and the ability to lead by example within engineering teams.
• Experience supporting mergers, integrations, or large-scale organisational change is highly desirable.

Apply today to speak with VIQU in confidence or contact Belle Hegarty at (url removed).
Know someone exceptional for this position? Refer them and receive up to £1,000 if successful (terms apply).
Follow us on LinkedIn IT Recruitment for more exciting opportunities.

BI and Data Engineering Lead
Wilf Ward Family Trust
Yorkshire
In office
Senior
£40,000 - £45,000

Salary: £40,000 - £45,000 per annum
Reporting to: Digital Delivery Manager
Location: Trust sites (with travel required)
Organisation: The Wilf Ward Family Trust

At The Wilf Ward Family Trust, we’re on an ambitious digital transformation journey. A key part of this is changing how we use data: moving away from manual, spreadsheet-based reporting to modern, automated, and trusted business intelligence.
We’re looking for a BI and Data Engineering Lead to establish and lead our central reporting capability. This is a unique opportunity to make a genuine impact in a values-driven organisation. You’ll design and deliver robust BI solutions, integrate data from multiple systems, and create intuitive dashboards that help colleagues across the Trust make better, data-informed decisions, ultimately supporting our mission of enabling extraordinary lives through outstanding support.

What you’ll be doing
As our BI and Data Engineering Lead, you will:

  • Design, build, and maintain cost-effective, automated reporting solutions.
  • Develop data models integrating multiple data sources including data from APIs, Excel, CSV, and JSON.
  • Work with the organisation to identify data quality issues and improve quality, integrity, and consistency of data.
  • Establish and document data flows, reporting processes, and technical solutions.
  • Champion a data-driven culture and promote best practice in reporting and analysis.
  • Contribute to data governance, standards, and the development of a trusted ‘single source of truth’.

What we’re looking for
We’re looking for a specialist who combines strong technical capability with a collaborative and curious mindset.

Essential experience and skills:

  • Proven experience building BI solutions using Power BI, including data modelling, DAX, and dashboards
  • Understanding of dimensional data modelling
  • Experience integrating data from multiple sources (APIs, Excel, CSV, JSON)
  • Ability to translate business needs into effective technical solutions
  • Strong data analysis and problem-solving skills
  • Experience designing user-friendly dashboards for non-technical audiences
  • Ability to work independently and manage your own priorities
  • Excellent communication skills, with the ability to explain technical concepts clearly
  • Experience documenting data processes and delivering user guidance or training
  • Confidence using Microsoft 365 tools including SharePoint, Teams, and OneDrive
  • Full UK driving licence and willingness to travel between Trust sites

Desirable:

  • Knowledge of data governance and data quality frameworks
  • Familiarity with modern data platforms (e.g., Microsoft Fabric, Data Factory, Databricks)
  • Experience supporting digital transformation initiatives
  • Experience with Python or R for data analysis or transformation
  • Experience building deployment pipelines in Microsoft 365 environment

We recognise the importance of recruiting skilled, compassionate, and reliable staff, whilst demonstrating in practice our commitment to inclusion, safeguarding and promoting the welfare of adults at risk. Please be aware it’s a criminal offence for people who are barred from working in Regulated Activity (under the Safeguarding and Vulnerable Groups Act 2006) to apply for roles that require them to work unsupervised with that particular group, such as adults or children. Please note any successful job offer will be conditional on pre-employment checks such as DBS and referencing prior to an agreed start date. The Wilf Ward Family Trust has a clear commitment to safeguarding within all practices; for more information on criminal disclosures you can visit the government guidance ‘Check if you need to tell someone about your criminal record’.

Founded in 1986, The Wilf Ward Family Trust is dedicated to enhancing the lives of individuals with disabilities. We empower them to lead fulfilling lives and embrace innovation to stay at the forefront of social care. Join us on our journey towards equality and a brighter future. Welcome to the Wilf Ward Family Trust.

Data Engineer (BD&A - DAPM Live Service Support) - Hybrid
CBSbutler Holdings Limited trading as CBSbutler
Shropshire
Hybrid
Mid - Senior
£400/day - £430/day

Job Title: Data Engineer (BD&A - DAPM Live Service Support)

Max Rate: £430 per day, inside IR35

Duration: 6 months

Location: Telford / hybrid (2 days per week onsite)

Active SC security clearance is required for this role.

Job Description:

We are seeking an SC Cleared Live Support & Monitoring Engineer to provide operational support across a suite of data integration and analytics platforms. This role focuses on maintaining stability, enhancing monitoring capability, and improving service visibility through consolidated dashboards and intelligent alerting.

Responsibilities

Live Service Support

  • Provide ongoing live support across platforms including:
  • Denodo
  • Talend
  • Pentaho Data Integration (PDI)
  • Git
  • MySQL
  • Amazon Redshift
  • Investigate, diagnose and resolve incidents across data and integration services
  • Work closely with technical teams to maintain service availability and performance

Grafana Monitoring & Alerting

  • Design, create and consolidate Grafana dashboards
  • Transform multiple independent dashboards into a unified Live Service view with drill-down capability by service
  • Gather monitoring requirements from stakeholders
  • Configure and implement alerting for legacy services that currently lack monitoring
  • Deliver fit-for-purpose alert thresholds and notifications aligned to operational needs
  • Improve visibility, observability and proactive incident management

Experience & Skills

Essential

  • Active SC Clearance
  • Experience supporting live production environments
  • Exposure to data platforms such as Denodo, Talend, PDI, MySQL or Redshift
  • Experience creating or maintaining Grafana dashboards
  • Understanding of monitoring, alerting and service observability principles
  • Strong troubleshooting and analytical skills
  • Ability to gather requirements and translate them into monitoring solutions

Desirable

  • Experience configuring Grafana alerting
  • Experience working in a client-side environment
  • Knowledge of legacy system monitoring uplift
  • Familiarity with Git version control

Senior Data Engineer (Airflow)
Harnham - Data & Analytics Recruitment
Manchester
Hybrid
Senior
£70,000 - £75,000

SENIOR DATA ENGINEER

£75,000 + BENEFITS

MANCHESTER (Hybrid)

This is an opportunity to take real ownership of a modern data platform within a growing, technology-driven bank. You will join a collaborative data team, influence architectural direction, and shape the foundations that will support analytics, governance and future AI capabilities.

THE COMPANY:

I’m partnering with a fast-growing, UK-based financial services organisation that operates at the intersection of banking, technology and fintech enablement. Recently established as a fully licensed UK bank, they provide core banking infrastructure to hundreds of fintech and digital asset companies, alongside a growing SME lending operation.

THE ROLE:

  • Design and maintain scalable ELT and ETL pipelines
  • Own data warehousing architecture
  • Introduce and enhance governance tooling
  • Mentor a junior data engineer
  • Collaborate closely with data, product, and operational stakeholders to deliver high-impact solutions.

YOUR SKILLS AND EXPERIENCE:

  • Strong commercial experience as a Data Engineer working across AWS, Python, SQL, and Airflow.
  • Hands-on expertise designing and delivering end-to-end pipelines
  • Experience with data modelling, data warehousing, and architectural design.
  • Knowledge of governance frameworks, lineage, quality, and best-practice engineering standards.
  • Experience with infrastructure-as-code and version control tooling such as Terraform and GitHub.

THE BENEFITS:

You will receive a salary of up to £75,000, dependent on experience. On top of the salary there are some fantastic extra benefits.

HOW TO APPLY

Please register your interest by sending your CV to Molly Bird via the apply link on this page.

Repairs Data Analyst
Sellick Partnership
Manchester
Hybrid
Junior - Mid
£25/hour - £28/hour

Repairs Data Analyst - Hybrid Role

Location: Manchester
Contract: Up to 3 months
Pay: £25 - £27 per hour (Umbrella)

About the Role

We’re looking for a Repairs Data Analyst to join our team on a hybrid basis. You will be responsible for providing analytical insight across data linked to a key materials project, supporting informed decision-making with particular focus on performance monitoring, process compliance and the tracking of materials purchasing.

The Repairs Data Analyst responsibilities include:

  • Ensuring that data collected and managed by the Distribution Centre team is accurate, reliable, up to date, and sufficient to support data-driven decision making within the department and wider business.
  • Collating, organising, and analysing data to provide operational and business insight.
  • Identifying trends across datasets to inform investigations, proactive surveys, or planned programmes of work.
  • Producing analysis and reports for the department and wider business, aligned to the project scope.
  • Processing, analysing, and interpreting data related to Great Places’ performance and operations.
  • Creating visualisations and reports to communicate findings effectively to key stakeholders.
  • Providing accurate, timely, and relevant business-critical performance information.

The successful Repairs Data Analyst will have:

  • Proficiency in the full Microsoft Office suite, with advanced skills in Microsoft Excel
  • Experience working with large datasets, analysing and comparing information, and communicating results effectively
  • Experience of project management
  • Experience in SQL, Power BI and data warehouse reporting and extraction (advantageous)

Please contact Josh at the Derby Office for more information.

Sellick Partnership is proud to be an inclusive and accessible recruitment business and we support applications from candidates of all backgrounds and circumstances. Please note, our advertisements use years’ experience, hourly rates, and salary levels purely as a guide and we assess applications based on the experience and skills evidenced on the CV. For information on how your personal details may be used by Sellick Partnership, please review our data processing notice on our website.

Senior Data Engineer - Azure & Snowflake
EMBS Engineering
London
Hybrid
Senior
Private salary

Location: Central London - 3-4 days onsite each week
Salary: Negotiable + benefits

We are supporting an enterprise-level client who is investing heavily in a modern cloud data platform that will sit at the centre of its data strategy. This programme will enable more advanced analytics, reporting and insight across multiple business functions.

We are looking to appoint three experienced Senior Data Engineers with strong Azure and Snowflake expertise.

The Role

This is a senior, hands-on engineering position within a high-performing data team. You will play a key role in shaping, developing and enhancing a large-scale Azure-based data platform, ensuring it is scalable, reliable and built to enterprise standards.

The position requires regular collaboration with stakeholders and an onsite presence in Central London 3-4 days per week, so this is not a fully remote role.

What You Will Be Doing

  • Building and enhancing scalable data pipelines using Azure and Snowflake
  • Developing and improving ETL / ELT processes across batch and micro-batch workloads
  • Working extensively with Azure Data Factory, Azure SQL, Azure Storage and Azure Functions
  • Designing and maintaining data warehouse structures including star and snowflake schemas
  • Applying recognised data warehousing approaches such as Kimball and Inmon
  • Writing and optimising complex SQL queries to support analytics and reporting
  • Ensuring strong data governance, quality, validation and reconciliation processes
  • Partnering with BI teams to enable effective reporting solutions
  • Contributing to architectural decisions around performance, scalability and infrastructure
  • Identifying and resolving issues to improve platform reliability and efficiency
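As a rough illustration of the star-schema design named above (a central fact table joined to dimension tables, in the Kimball style), here is a toy sketch using Python's built-in sqlite3; the table names and figures are invented for the example and are not from the client's platform:

```python
import sqlite3

# Tiny Kimball-style star schema: one fact table plus two dimensions
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (1, '2024-01'), (2, '2024-02');
    INSERT INTO dim_product VALUES (10, 'Widget'), (20, 'Gadget');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# The typical star-schema query shape: aggregate the fact table,
# sliced by attributes pulled from the joined dimensions
rows = conn.execute("""
    SELECT d.month, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.name
    ORDER BY d.month, p.name
""").fetchall()
print(rows)
```

The same join-and-aggregate shape is what Snowflake and Azure SQL warehouses execute against much larger dimensional models; a "snowflake schema" simply normalises the dimensions further.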

What We Are Looking For

  • 7+ years in software engineering or development
  • 5+ years working within data-focused environments
  • At least 2 years hands-on experience with Azure cloud data platforms
  • Strong expertise across the Azure Data Platform including Data Factory, SQL, Storage and Functions
  • Proven experience in SQL development and data modelling
  • Experience building both periodic batch and micro-batch data pipelines
  • Solid understanding of enterprise data warehouse design and loading strategies
  • A minimum of 1 year hands-on experience with Snowflake
  • Experience working with large-scale enterprise datasets
  • Strong analytical mindset with a clear focus on data integrity and performance

Desirable Experience

  • Advanced Snowflake performance tuning and optimisation
  • Python and or Databricks exposure
  • Experience designing full end-to-end data platform architectures
  • Background supporting enterprise BI ecosystems
  • Familiarity with CI/CD pipelines and infrastructure-as-code practices

Additional Details

  • Visa candidates will be considered
  • Salary is open and negotiable depending on experience
  • Immediate requirement

If you are an experienced Senior Data Engineer with strong Azure and Snowflake expertise and are comfortable with a London-based hybrid working model, we’d love to hear from you.

Data Analyst
Kinetic PLC
Newcastle upon Tyne
Hybrid
Junior - Mid
£18/hour - £22/hour

Kinetic are currently recruiting for a Data Analyst to work alongside one of our valued clients in a dynamic and fast-paced environment.

What’s on offer:

Monday - Friday working hours (37 hours per week)
Long-term opportunity with a view to becoming permanent
Hybrid working
Location - Newcastle/Leeds

The Role:

Responsible for transforming raw operational, commercial, and technical data into clear, actionable insights. This role will leverage Snowflake, Power BI, and other enterprise systems to develop reliable data models, automated reporting, and intuitive dashboards that support evidence-based decision-making across the business.

Qualifications:

Degree in Data Science, Computer Science, Engineering, Mathematics, or a related discipline, or equivalent experience.
Professional certifications (e.g., Snowflake SnowPro, Microsoft Power BI Data Analyst) beneficial.

Key Responsibilities:

Data Management & Processing:

Extract, transform, and load (ETL) data from multiple sources, primarily using Snowflake, SQL, and associated pipelines.
Ensure high data quality, integrity, consistency, and availability.
Develop repeatable processes for data cleansing and validation.
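The "repeatable processes for data cleansing and validation" above could be sketched, under assumed field names chosen purely for illustration, as a small pure function applied to every incoming record:

```python
# A minimal, repeatable cleansing-and-validation pass, assuming rows arrive
# as dicts from an upstream extract. Field names here are illustrative only.
def clean_row(row):
    """Normalise one raw record; return None if it fails validation."""
    cleaned = {
        "job_id": str(row.get("job_id", "")).strip(),
        "cost": row.get("cost"),
    }
    if not cleaned["job_id"]:           # mandatory-field check
        return None
    try:
        cleaned["cost"] = float(cleaned["cost"])
    except (TypeError, ValueError):     # non-numeric cost -> reject
        return None
    if cleaned["cost"] < 0:             # business rule: no negative costs
        return None
    return cleaned

raw = [
    {"job_id": " 101 ", "cost": "49.99"},
    {"job_id": "", "cost": "10"},       # rejected: missing id
    {"job_id": "102", "cost": "n/a"},   # rejected: bad cost
]
valid = [r for r in map(clean_row, raw) if r is not None]
print(valid)
```

Because the function is deterministic and side-effect free, the same pass can be re-run on every load, which is what makes the process "repeatable" rather than a one-off manual fix.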

Reporting & Dashboard Development:

Design, build, and maintain Power BI dashboards and reports for operational, commercial, and strategic use.
Optimise dashboard performance, parameterisation, and data refresh logic.
Work with stakeholders to define KPIs, metrics, and data visualisation standards.

Analytics & Insights:

Analyse large datasets to identify trends, patterns, and opportunities for improvement.
Provide insights that support Continuous Improvement, operational performance, root cause analysis, and forecasting.
Produce clear written and verbal summaries tailored to technical and non-technical audiences.

Collaboration & Stakeholder Engagement:

Work closely with cross-functional teams (Operations, Engineering, Service, Finance, Supply Chain, etc.) to understand their data needs.
Translate business questions into structured analytical problems.
Provide training and knowledge sharing on dashboards, reports, and data tools.

Governance & Best Practice:

Support data governance, cataloguing, and security frameworks.
Maintain documentation for data sources, models, definitions, and dashboard usage.
Ensure compliance with internal data policies and procedures.

Skills & Experience

Technical Skills:

Strong experience with SQL (Snowflake preferred).
Proficiency building Power BI dashboards, DAX formulas, and data models.
Experience working with cloud-based data warehousing platforms.
Understanding of ETL / ELT concepts, data modelling, and data architecture.
Proficient in Excel and general data manipulation tools.

Analytical Skills:

Ability to interpret large datasets into actionable insights.
Strong problem-solving and structured analysis capabilities.
Ability to create meaningful visuals and simplify complex information.

Professional Skills:

Excellent communication and stakeholder management.
Ability to work independently and prioritise multiple requests.
Strong documentation and reporting discipline.

Kinetic plc is a Recruitment Consultancy with over 40 years’ experience delivering staffing solutions to the engineering, manufacturing and technical industries.
Kinetic plc treats all applications confidentially and we review all submissions. Those that do not meet the specification may not be contacted but their CV retained to be considered against future opportunities.


Frequently asked questions
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.