Data Engineer Jobs in Sheffield
Overview
Looking for top Data Engineer jobs in Sheffield? Haystack connects you with the latest data engineering opportunities in Sheffield’s thriving tech scene. Find your next role working with big data, ETL pipelines, and cloud platforms at leading Sheffield companies. Start your Data Engineer job search today and advance your career with Haystack!
AI Engineer - Python , LLMs
INFUSED SOLUTIONS LIMITED
Sheffield
Hybrid
Mid - Senior
£400/day - £500/day
RECENTLY POSTED

Job Title: AI Engineer - Python, LLMs

Location: Sheffield (Hybrid)

Type: Contract, Full-Time

We are looking for a skilled and experienced Analytics Engineer to join a rapidly growing organisation.

You will partner with business stakeholders to design and deliver impactful dashboards, analytical products, and metric frameworks.

You will build high-quality analytics datasets, semantic layers, and reusable data models that power reporting and advanced insights.

You will be responsible for:

Develop and maintain CI/CD pipelines for analytics workflows, ensuring reliable, automated testing and deployment.

Use Python or similar scripting languages to support analytics, automation, and data manipulation tasks.

Leverage LLMs and AI-assisted tools to accelerate insight generation, documentation, and development workflows.

Key Skills and Experience you must have:

Strong SQL and data modelling skills.

Hands-on experience with enterprise analytics platforms such as Power BI and Tableau.

Experience in implementing CI/CD pipelines for Analytics.

Excellent experience with Python.

Excellent experience working with large Enterprise Data Platforms.

Experience using LLMs or AI-assisted tools to accelerate analytics and insight generation.

You must be a team player with the ability to work in a collaborative environment.

If the role is of interest, please send across your CV.


Data Migration Specialist
etiCloud
Sheffield
In office
Mid - Senior
£60,000
RECENTLY POSTED

Location: Sheffield
Contract Type: Permanent
Hours: Full time
Salary: £45,000 - £60,000 p/a depending on experience

The Data Migration Specialist is responsible for managing and delivering high-quality data migrations from legacy Case/Practice Management Systems (CMS/PMS) into our platform. This role owns the end-to-end migration lifecycle: extraction, transformation, validation, reconciliation, and loading of data using API-led and ETL approaches.

In addition, this role will advise on and help shape an internal data warehouse that will act as the foundation for migration preparation, cleansing, auditability, and as a single source of truth for database migrations.

Key Responsibilities

Lead the data migration workstream, including planning, scoping, mapping, cleansing, testing, and loading.
Extract data from legacy CMS/PMS and related systems and transform datasets to meet target data models.
Use API-based tools and integration frameworks to perform data imports, updates, and reconciliations.
Deliver migrations across core data domains including: Accounts / financial data, Matter / case data, Contacts, Documents and related metadata
Identify data quality issues, perform remediation, and work closely with clients to validate migrated data.
Execute trial migrations, delta migrations, and final production cutover loads.
Advise on and contribute to the design and development of an internal data warehouse to: Support migration data preparation and cleansing, Provide auditability and reconciliation reporting, Act as a single source of truth for migration datasets
Collaborate closely with the Implementation Lead and project team to align migration activities with overall delivery timelines.
Produce and maintain clear documentation for: Data mapping and transformation rules, Migration processes and runbooks, Data validation and reconciliation procedures
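The posting is tool-agnostic about how validation and reconciliation would be done; purely as an illustration of the idea, here is a minimal Python sketch that compares extracted source records against loaded target records by key (the `reconcile` helper and its report fields are hypothetical, not part of any named product):

```python
def reconcile(source_rows, target_rows, key="id"):
    """Compare source and target record sets for a migration load.

    Returns a small report: record counts, keys missing from the target,
    keys present only in the target, and keys whose fields differ.
    """
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "missing_in_target": sorted(set(src) - set(tgt)),
        "unexpected_in_target": sorted(set(tgt) - set(src)),
        "field_mismatches": sorted(
            k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
        ),
    }
```

In practice the same shape of check would run after trial migrations, delta migrations, and the final cutover, with the report feeding the auditability and reconciliation reporting described above.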

Skills & Experience Required

Proven experience delivering data migrations in legal, financial, accounts, or SaaS environments.
Strong understanding of ETL processes, data mapping, data quality, and validation techniques.
Hands-on experience using APIs for data import/export and system integrations.
Experience with scripting or development languages such as Python and/or C#.
Strong SQL skills, including querying, data analysis, and validation.
Experience working with relational databases and structured data models.
Ability to analyse, troubleshoot, and resolve data discrepancies efficiently and methodically.
Excellent organisational, documentation, and stakeholder communication skills.
Understanding of data governance, auditability, and reconciliation best practices.

Desirable

Experience working in professional services, legal tech, or enterprise SaaS implementations.

Joining etiCloud isn't just about the job, it's so much more. We want you to forge a successful and rewarding career in the IT industry. You'll be supported every step of the way in a friendly, professional environment where you and your future matter.

Here's a quick overview of what you can expect when you become part of our team:

Competitive salary with regular reviews to reward your progress
Annual company bonus recognising your hard work
Career development through ongoing training, support, and progression opportunities
28 days annual leave
Company pension scheme to support your future
Supportive, friendly team with a down-to-earth culture
Health & wellbeing benefits, including private medical insurance, health cash plan, and mental health support
Modern, secure Sheffield office with kitchen facilities and a coffee machine
Weekly fresh fruit as part of our wellbeing initiatives
Free onsite parking

Apply now and take your next step in the world of tech with etiCloud!

You may also have experience in the following: Data Migration Specialist, Data Migration Consultant, ETL Developer, Data Engineer, SQL Developer, Database Migration Consultant, Technical Implementation Consultant, Data Integration Specialist, Systems Integration Engineer, Data Warehouse Developer, Migration Engineer, Data Conversion Specialist, Application Data Consultant, Legal Tech Data Specialist, SaaS Implementation Data Consultant

REF-224949

ML & AI Engineer
Vallum Associates Limited
Sheffield
Hybrid
Mid - Senior
£500/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE

We are currently looking for an experienced ML & AI Engineer to join a major technology program delivering advanced AI-driven solutions within the banking sector. The role involves working on innovative AI initiatives, building scalable infrastructure, and developing intelligent systems that power agent-based workflows and conversational AI platforms.

You will collaborate with cross-functional teams to design and implement next-generation AI capabilities and help drive the evolution of AI-powered products.

Program Scope

  • Develop and provision infrastructure that supports agentic AI workflows across both Azure and Google Cloud Platform (GCP) environments.
  • Provide data science expertise to support the design of agent-based solutions, including Coach AI and future AI Assistant capabilities.
  • Create integration patterns for AI agents to interact with banking systems and perform actions on behalf of customers.
  • Contribute to the development of new AI products within the Conversational Banking Lab.

Key Initiatives Include

Agent Summarisation
Develop advanced capabilities to summarise complex and nuanced customer conversations.

App Search Evolution
Transform existing vector search functionality into a fully generative AI-driven search experience, creating a single unified interface for users.

Evaluation Methods
Build automated evaluation frameworks to test and validate both deterministic and generative AI conversations at scale.
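The posting doesn't specify how such a framework would be built; as a rough sketch of the concept, here is a minimal Python harness that checks deterministic conversation flows against exact expected replies and generative flows against required keywords (all names here are illustrative, and keyword matching is a crude stand-in for richer LLM-based grading):

```python
def evaluate_conversations(cases, respond):
    """Run scripted test cases against a chat system's `respond` callable.

    Deterministic cases carry an exact `expected` reply; generative
    cases carry `must_contain` keywords that the reply should include.
    """
    results = []
    for case in cases:
        reply = respond(case["prompt"])
        if "expected" in case:  # deterministic flow: exact match
            passed = reply == case["expected"]
        else:  # generative flow: keyword containment
            passed = all(k.lower() in reply.lower() for k in case["must_contain"])
        results.append({"prompt": case["prompt"], "passed": passed})
    return results
```

Scaling this up to validate conversations "at scale" would mainly mean running such cases in batches in CI and aggregating pass rates per flow.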

Required Skills & Experience

Must Have

  • Strong Python development skills, with 2+ years of experience building production-grade applications using Large Language Models (LLMs).

  • Solid understanding of software engineering principles, including:

    • Microservices architecture
    • CI/CD pipelines
    • Event-driven architecture
  • Hands-on experience with AI engineering practices, including:

    • RAG (Retrieval-Augmented Generation) pipelines
    • Prompt engineering
    • LLMOps
    • Runtime monitoring and evaluation of AI systems
    • Experience with Vertex AI
  • Experience in data engineering, including building scalable data pipelines using Python and Spark.

  • Strong knowledge of GCP-native services, including:

    • BigQuery (BQ)
    • Spanner
    • Dataflow
    • Firestore

Nice to Have

  • Experience with Agentic AI frameworks, such as:

    • LangGraph
    • ADK
    • CrewAI
    • Multi-agent architectures
  • Experience building deployable AI solutions (production environments rather than notebook-only solutions).

  • Knowledge of data ontologies and graph-based data models.

  • Exposure to Agile or Scrum development methodologies.

Lead Data Engineer
Stealth IT Consulting Limited
Sheffield
Hybrid
Senior
£400/day - £445/day
RECENTLY POSTED

Location: Sheffield, Hybrid (60% office / 40% remote)
Contract: 6 Month Contract (Extension possible)
Rate: £400+ per day inside IR35

Role Overview

We are seeking an experienced Lead Data Engineer to design, develop, and optimise enterprise-scale data platforms for large, regulated organisations, ideally in banking, financial services, or other regulated sectors. This role requires hands-on technical expertise, leadership, and a consulting mindset to deliver scalable, resilient data solutions while promoting best practices and operational excellence.

Key Responsibilities

  • Lead the design, development, and optimisation of enterprise data engineering platforms
  • Build and maintain robust ETL/ELT pipelines integrating large, complex datasets
  • Work with structured, semi-structured, and unstructured data across SQL and NoSQL technologies
  • Develop solutions using Hadoop, Spark, and Splunk in large-scale environments
  • Write maintainable Python code, applying object-oriented and functional programming principles
  • Implement and maintain CI/CD pipelines, automated testing, and version control
  • Collaborate with BI, Analytics, and downstream teams to support reporting and insights
  • Pair program and mentor other engineers to promote knowledge sharing and code quality
  • Define and maintain technical test plans including unit and integration tests
  • Promote SRE principles to ensure service resilience, sustainability, and recoverability

Essential Skills & Experience

  • Proven experience as a Lead or Senior Data Engineer in enterprise-scale environments
  • Hands-on expertise with Hadoop, Spark, and Splunk
  • Advanced Python development skills
  • Experience designing and optimising high-performance data pipelines
  • Strong understanding of CI/CD, source control, and automated testing
  • Analytical, problem-solving, and leadership skills
  • Experience working in regulated, enterprise environments (e.g., banking, fintech, government)
  • Agile delivery experience (Scrum/Kanban)

Consulting & Soft Skills

  • Ability to mentor and uplift team performance
  • Strong communication and stakeholder engagement skills
  • Collaborative, delivery-focused mindset with high accountability
  • Knowledge of control, compliance, and regulatory requirements
  • Up-to-date awareness of modern tools, cybersecurity, and data privacy regulations
  • Champions innovation, advanced technologies, and best practices

If this role aligns with your skills and experience, we’d love to hear from you. Apply today to be considered.

Lead Data Engineer
Vallum Associates Limited
Sheffield
Remote or hybrid
Senior
Private salary
RECENTLY POSTED

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms. The ideal candidate will possess hands-on expertise in the following areas:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.

Data Analyst Training Course (Excel, SQL & Power BI)
Netcom Training
Sheffield
Fully remote
Graduate - Junior
Private salary

About the opportunity

Are you ready to launch a career in Data Analytics and Business Intelligence?

Netcom Training's fully-funded Data course (NCFE Certificate in Data, Level 3) equips you with the technical skills employers are actively seeking. From data sourcing, cleansing, and analysis to visualization and reporting, you'll gain hands-on experience that prepares you for today's fast-growing data-driven roles.

Our learners have successfully moved into roles such as Junior Data Analyst, Operations Analyst, Business Intelligence Assistant, and Database Administrator, working across tech, finance, healthcare, and the public sector.

Complete the course and gain a guaranteed interview with a leading employer, helping you kickstart your career.

Course Details

  • Start Date: 23/02/2026, 16/03/2026
  • Duration: 11 weeks
  • Days: Monday - Thursday
  • Times: 6:00 PM - 9:00 PM
  • Format: Online, practical workshops

What you'll learn

  • Data Management: Understand how to source, gather, and store data securely.
  • Data Cleansing: Learn to collate and format raw data for accurate processing.
  • Analysis & Insight: Analyse datasets to support key business decisions and outcomes.
  • Visualization: Present and communicate insights clearly to stakeholders.
  • Tools & Tech: Gain exposure to professional tools commonly used in the industry (e.g., Excel concepts, Reporting tools).
  • Compliance: Understand secure data handling and GDPR principles.
  • Collaboration: Practice continuous professional development in a team setting.

Career Pathway

Successful participants are guaranteed an interview with our network of UK-wide partners working with leading brands.

Potential Roles:

  • Junior Data Analyst
  • Reporting Assistant
  • Data Administrator
  • Business Analyst

Eligibility

This is a government-funded opportunity. To apply, you must:

  • Live in South Yorkshire.
  • Be aged 19 or over.
  • Earn below the gross annual wage cap of £34,194.
  • Not currently be undertaking other government-funded training.
  • Right to Work: You must have lived in the UK/EU for the last 3 years and have the right to work in the UK (Student/Graduate visas are not eligible).

Cost

This is a fully-funded course with no fees: complete the training, gain essential data skills, and secure your guaranteed interview, provided you meet the learner obligations outlined in our employability terms and conditions, which can be found on our website.

Trainee Data Analyst - Training Course
Netcom Training
Sheffield
Fully remote
Graduate - Junior
Private salary

Are you ready to launch a career in Data Analytics and Business Intelligence? Netcom Training's fully-funded Data course (NCFE Certificate in Data, Level 3) equips you with the technical skills employers are actively seeking.

From data sourcing, cleansing, and analysis to visualisation and reporting, you'll gain hands-on experience that prepares you for today's fast-growing data-driven roles.

Our learners have successfully moved into roles such as Junior Data Analyst, Operations Analyst, Business Intelligence Assistant, Database Administrator, and Pricing Analyst, working across tech, finance, healthcare, and the public sector. Complete the course and gain a guaranteed interview with a leading employer, helping you kickstart your career.

Course Details

  • Start Date: 16/03
  • Duration: 10 weeks
  • Days: Mon-Thu
  • Times: 6-9pm
  • Format: Online, practical workshops

What you'll learn

  • Data Management: Understand how to source, gather, and store data securely.
  • Data Cleansing: Learn to collate and format raw data for accurate processing.
  • Analysis & Insight: Analyse datasets to support key business decisions and outcomes.
  • Visualisation: Present and communicate insights clearly to stakeholders.
  • Tools & Tech: Gain exposure to professional tools commonly used in the industry (e.g., Excel concepts, Reporting tools).
  • Compliance: Understand secure data handling and GDPR principles.
  • Collaboration: Practice continuous professional development in a team setting.

Career Pathway

Successful participants are guaranteed an interview with our network of UK-wide partners working with leading brands.

  • Potential Roles: Junior Data Analyst, Reporting Assistant, Data Administrator, Business Analyst.

Starting Salaries: Typically £22,000 - £28,000 (role dependent)

Eligibility

To apply, you must:

  • Live in South Yorkshire
  • Be aged 19 or over
  • Earn below the gross annual wage cap of £34,194
  • Not currently be undertaking other government-funded training
  • Not be in the UK on a student, graduate, postgraduate, or sponsored visa, or as a dependent

Data Engineer Lead (OpenShift)
Infoplus Technologies UK Ltd
Sheffield
Remote or hybrid
Senior
£450/day - £480/day

Key Responsibilities:

  • Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
  • Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
  • Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
  • Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
  • Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
  • Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
  • Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
  • Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
  • Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
  • Ensure security, compliance, and best practices for data pipelines and observability platforms.
  • Document data flows, schemas, dashboards, and operational runbooks.
Required Skills:

  • Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/KSQL/KStream).
  • Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
  • Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
  • Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
  • Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
  • Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
  • Understanding of hybrid cloud and multi-cluster telemetry patterns.
  • Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
  • Good problem-solving skills and ability to work in a collaborative team environment.
  • Strong communication and documentation skills.
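The posting names Kafka and Splunk HEC but no concrete code; as a hedged sketch of the enrich-and-forward step, here is a small Python helper that wraps an enriched telemetry record in the JSON envelope Splunk's HTTP Event Collector accepts (the `sourcetype` and `index` values are illustrative assumptions, not anything specified by the role):

```python
import time

def build_hec_event(record, sourcetype="openshift:telemetry", index="observability"):
    """Wrap an enriched telemetry record in a Splunk HEC envelope.

    HEC accepts JSON objects with an `event` payload plus optional
    metadata fields such as `time`, `sourcetype`, and `index`.
    """
    return {
        "time": record.get("timestamp", time.time()),
        "sourcetype": sourcetype,
        "index": index,
        # Everything except the timestamp becomes the event body.
        "event": {k: v for k, v in record.items() if k != "timestamp"},
    }
```

A consumer service would pull records from a Kafka topic, enrich them, and POST batches of these envelopes to HEC's `/services/collector/event` endpoint with an `Authorization: Splunk <token>` header.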

Data Architect
Derbyshire County Council
Matlock
Hybrid
Senior
£40,000
TECH-AGNOSTIC ROLE

We are accepting CVs for this position, so it's easier than ever to apply.

Exciting things are happening within the Digital Services Team at Derbyshire County Council, and we're looking for a Data Architect to join our Strategy and Architecture team.

A major review of our organisation and strategies has been completed and we have invested in new tools to support a comprehensive and innovative Digital Services transformation programme, supporting the delivery of key services to the citizens and communities of Derbyshire.

This role will be pivotal in helping us deliver our transformation programme, as a key post within the Digital Services, Strategy and Architecture function. This is an excellent opportunity for someone who has significant experience in Data Architecture and delivering change and who really wants to make a difference.

Key Responsibilities of this role are:

Based within a dynamic and challenging environment, you will be working in several key areas including:

  • design data models at different levels: conceptual, logical and physical
  • define and maintain the data technology architecture, including metadata and integration or data warehouse architecture
  • provide guidance for the upgrade, management, decommission and archive of data in compliance with related policies
  • support the Principal Data Architect to ensure work packages are managed appropriately and progressed through the prescribed enterprise architecture framework and governance processes
  • manage any issues arising from the Data Management Strategy implementation, managing changes in a pragmatic manner that allow organisational objectives to be achieved, while adhering to its core principles
  • take an active role in the Architecture Governance process to ensure that the organisation's systems are designed in accordance with the enterprise data architecture
  • implement identified opportunities for consolidation and use of data management tools and storage platforms in line with organisation and business requirements and manage the plan for migration onto these
  • championing data architecture both internally and through collaborating and communicating at the different levels across the Council

What skills and experience do you need to have?

  • experience of being a subject matter expert on data architecture and the implementation of data management initiatives
  • proven success of providing deep Data architecture knowledge to large complex IT/Digital projects
  • you can undertake data profiling and source system analysis. You can present clear insights to colleagues to support the end use of the data
  • experience in evaluating, selecting and onboarding architecture tools (LeanIX is preferred)
  • you can develop data standards and analyse where data standards have been applied or breached and undertake an impact analysis of that breach
  • you can show an awareness that data needs to be aligned to the needs of the end user.
  • you can take responsibility for the assurance of data solutions and make recommendations to ensure compliance
  • proven experience of building effective relationships with technical and non-technical stakeholders
  • you can show an awareness of opportunities for innovation with new tools and uses of data

What we offer you:

  • 27 days of annual leave plus another 5 once you reach 5 years of continuous service
  • Flexi time, allowing up to an additional 2 days of leave per month
  • Generous local government pension scheme
  • Guaranteed incremental annual pay increases
  • Cycle to work scheme
  • A supportive working environment
  • Commitment and investment in your continuing professional development.
  • Discounts at selected county leisure centres
  • and much more!

Our Digital Services Team is contractually based in Matlock, Derbyshire, but we work on a hybrid basis.

Derbyshire County Council is going through a period of positive development and transformation, so it's a great time to join and drive change.

If you’re passionate about Data and are ready to revolutionise the way Derbyshire County Council operates in the digital era then we would love to hear from you.

If you would like to discuss the role further, please see our website for contact details

Don't miss out on this exciting opportunity to contribute to the digital revolution at Derbyshire County Council. Apply now and be at the forefront of shaping the future.

Derbyshire County Council is an equal opportunities employer and we welcome applications from individuals of all backgrounds, experiences, and abilities.

All candidates must be able to provide proof of right to work in the UK.

Provisional Interview Date: Week Commencing 23 March 2026

This role currently offers hybrid working options, which will be subject to service needs; there'll be an opportunity to discuss working arrangements for this position at interview.

Important: Derbyshire County Council holds a sponsorship licence, but this role does not offer visa sponsorship. Please apply only if you already have the right to work in the UK without sponsorship, as applications that do not meet this requirement will be rejected.

We welcome applications from individuals who share our values of being Collaborative, Innovative, Empowered and Accountable. These values describe who we are and what we stand for as a council. They help shape our culture, encourage consistent behaviour and guide how we work together to make a positive difference for both our employees and the residents we serve.

Lead Data Engineer
Rebel Recruitment
Sheffield
Hybrid
Senior
£85,000

Role: Lead Data Engineer - Azure/Databricks

Location: Sheffield

Working Model: Hybrid - 2 days per week in person

Salary: Up to £85k depending on experience

Own the data platform. Shape the future architecture.

This role is for a hands-on data engineering leader who wants to build something properly, not babysit legacy pipelines.

You'll take full ownership of the data engineering domain, leading the design and build of a modern, highly scalable data platform that handles complex, high-frequency industrial data. You'll work closely with data scientists, software engineers, and senior leadership, with the trust and mandate to make meaningful architectural decisions.

If you enjoy balancing strong engineering principles with real-world business needs, and you like mentoring others while staying deeply technical, this role is built for you.

What you'll be working on

  • Architecting and rebuilding the data transformation layer in Databricks
  • Designing robust data flows that support both real-time operational views and deep historical analysis
  • Moving pipelines from ad-hoc scripts to software-engineering standards (CI/CD, testing, modular design)
  • Defining clear data models, schemas, and standards across a complex data estate
  • Establishing a stable, high-performance serving layer for analytics, visualisation, and data science workloads
  • Working closely with data scientists to remove bottlenecks and enable better modelling and experimentation
  • Pair-programming, mentoring, and raising the engineering bar for a small, capable team

You won't just be delivering features; you'll be setting direction.

The tech you'll work with

  • Core: Azure, Databricks, Python, SQL, dbt, MQTT
  • Storage & Serving: Delta Lake, Postgres, TimescaleDB
  • Modelling & ML: MLflow
  • Visualisation: Grafana

What makes you a great fit

You're someone who can comfortably zoom out to architecture and zoom in to code.

  • Proven experience building and owning data platforms on Azure
  • Deep, hands-on knowledge of Databricks (lakehouse architecture, cluster management, performance and cost optimisation)
  • Strong opinions on data modelling, schema design, and standardisation
  • You treat data pipelines as software: version control, CI/CD, automated testing
  • Comfortable challenging architectural decisions, and explaining why
  • Able to translate technical trade-offs into business impact for non-technical stakeholders
  • A mentor by nature: you raise the team through pairing, guidance, and example

Industry experience

  • Experience with industrial, sensor-driven, or time-series data is highly desirable
  • Alternatively, background in high-volume or highly variable data environments (missing data, duplicates, schema drift, spiky load) will transfer well

Why you'll want this role

  • Real autonomy: You're trusted to make architectural calls; that's why you're being hired
  • Visible impact: Your work directly unlocks better analytics and machine learning outcomes
  • Modern problems: You'll work on advanced data patterns and architectures, not cosmetic refactors
  • Still technical: This is a leadership role without stepping away from code

What you'll get

  • 5 weeks paid holiday plus bank holidays
  • Tax-efficient stock options
  • Company pension scheme
  • Salary sacrifice EV scheme
  • Training and professional development support
  • Regular all-hands sessions with real transparency
  • Hybrid working (2 days in person)
  • Quarterly employee recognition awards
  • Access to discounts via BrightHR

We welcome diverse applicants and are dedicated to treating all applicants with dignity and respect, regardless of background.

CGEMJP00330718 Lead Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Sheffield
Hybrid
Senior
£430/day

Role Title: Lead Data Engineer

Location: Sheffield/hybrid (3 days on site)

Duration: 9 months

Rate: £430 per day inside IR35

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.

Experience required:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.

Soft Skills (Consultant):

  • Demonstrated ability and enthusiasm for enhancing team performance.
  • Strong active listening and effective communication skills.
  • Self-mastery, with a focus on positive mindsets and professional behaviours.
  • Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
  • Engages with industry groups and external vendors to represent and advance HSBC's interests and influence.
  • Takes accountability for ensuring control and compliance throughout the engineering process.
  • Champions innovation and the adoption of advanced technologies and best practices within the domain.

If you are interested in this role or wish to apply, please feel free to submit your CV.

Power BI Data Analyst
Hays Technology
Sheffield
Hybrid
Mid - Senior
£45,000 - £50,000

Sheffield City Centre & home working (2 days per week)

Up to £48,000 + Bonus + Free Parking + Other Benefits

Your new role

As a Power BI / Data Analyst you will help deliver the strategic vision with its subsidiaries. The role is to provide insight and data to our internal companies, whether this be high-level interactive performance indicators and dashboards, or detailed data extracts using reports and Power BI. Your expertise will be invaluable in delivering new ways of accessing our data, understanding trends and visual reporting to better inform all levels of our people.

Responsibilities

  • Be proactive in identifying issues and actioning change.
  • Lead the development of new Power BI reports and the improvement of existing reports, applying the latest methods and best practices.
  • Manage workspaces and settings within the Power BI Service.
  • Share specialist knowledge on Power BI and related topics with members of the IT team.
  • Ensure the correct security permissions are assigned to authorised users and are regularly reviewed.
  • Work with users at all levels within the organisation to gather, understand and document reporting requirements.
  • Provide training, documentation and support to relevant departments when delivering solutions.
  • Assist with the development and maintenance of simple apps and workflows within the Power Platform, and show a willingness to advance this knowledge over time.
  • Articulate the capabilities and limitations of Power BI clearly to personnel at all levels of the organisation.
  • Work with data owners to investigate data accuracy and validity in various data-related projects.
  • Work with teams across the organisation to assist in data collection and analysis, in line with relevant legislation such as GDPR.
  • Evaluate user needs and system functionality for reporting purposes.

Experience needed

  • Proven track record of consolidating data from multiple sources into a single report or a group of reports.
  • Ability to 'tell a story' with one or multiple sets of data, presenting it appropriately for the intended audience.
  • Ability to develop Power BI reports/dashboards and publish them to the Power BI Service.
  • Good understanding of Power Apps (model-driven and canvas), Power Fx and Power Automate development.
  • Ability to provide support and documentation to end users.
  • An understanding of creating reports from Dataverse, with particular emphasis on D365 data.
  • Experience administering reports and workspaces in the Power BI Service.
  • Experience using SQL, Power Query and Data Analysis Expressions (DAX).
  • Proven track record in delivering application-based reporting solutions (dashboards).
  • Experience using Azure DevOps, JIRA or similar tools.
  • A passion for Business Intelligence and data and how they can add value.
  • Demonstrable expertise in data handling and how it can add value.
  • An understanding of the data warehouse lifecycle, such as ETL.
  • Demonstrated experience in managing and reporting from large and small data sets.
  • Good interpersonal skills and the ability to work effectively in a team.
  • The ability to design data structures to support reporting needs.
  • The ability to explain complex information to lay audiences.

Desirable

  • Microsoft Certified: Power BI Data Analyst Associate (PL-300).
  • Experience working with commercial data.
  • Experience running successful data visualisation projects.
  • Knowledge of Microsoft Fabric or data warehousing.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and as an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed)
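For flavour, the "consolidating data from multiple sources into a single report" requirement can be sketched in plain Python with the standard-library `sqlite3` module (the tables, columns and figures below are invented for illustration; the ad itself names Power BI, Power Query and DAX as the actual tooling):

```python
import sqlite3

# Two hypothetical source tables, stood up in an in-memory database.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales  (region TEXT, amount REAL);
    CREATE TABLE budget (region TEXT, target REAL);
    INSERT INTO sales  VALUES ('North', 120.0), ('South', 80.0);
    INSERT INTO budget VALUES ('North', 100.0), ('South', 100.0);
""")

# Consolidate the two sources into one result set a report could bind to.
rows = con.execute("""
    SELECT s.region, s.amount, b.target, s.amount - b.target AS variance
    FROM sales AS s
    JOIN budget AS b ON b.region = s.region
    ORDER BY s.region
""").fetchall()

for region, amount, target, variance in rows:
    print(region, amount, target, variance)
```

The same join-and-derive pattern is what Power Query or a DAX measure would express against the real sources; the SQL here just makes the consolidation step explicit.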

CGEMJP Lead Data Engineer
CBSbutler Holdings Limited trading as CBSbutler
Sheffield
Hybrid
Senior
£430/day

Role Title: Lead Data Engineer

Location: Sheffield/hybrid (3 days on site)

Duration: 9 months

Rate: £430 per day inside IR35

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.

Experience required:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.

Soft Skills (Consultant):

  • Demonstrated ability and enthusiasm for enhancing team performance.
  • Strong active listening and effective communication skills.
  • Self-mastery, with a focus on positive mindsets and professional behaviours.
  • Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
  • Engages with industry groups and external vendors to represent and advance HSBC’s interests and influence.
  • Takes accountability for ensuring control and compliance throughout the engineering process.
  • Champions innovation and the adoption of advanced technologies and best practices within the domain.

If you are interested in this role or wish to apply, please feel free to submit your CV.

Lead Data Engineer - Hadoop - Spark - Python
Square One Resources
Sheffield
Hybrid
Senior
£600/day - £617/day

Job Title: Lead Data Engineer - Hadoop, Spark, Python
Location: Sheffield - 3 days per week in the office
Salary/Rate: Up to £617 per day inside IR35
Start Date: 02/03/2026
Job Type: Contract until November

We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.

The ideal candidate will possess hands-on expertise in the following areas:

  • Extensive enterprise experience with Hadoop, Spark, and Splunk.
  • Proficiency in object-oriented and functional scripting, particularly in Python.
  • Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
  • Experience integrating large, disparate datasets using modern tools and frameworks.
  • Strong background in building and optimizing ETL/ELT data pipelines.
  • Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
  • Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
  • Ability to pair program and work effectively with other engineers.
  • Excellent analytical and problem-solving abilities.
  • Knowledge of agile methodologies such as Scrum or Kanban is a plus.
  • Comfortable representing the team in standups and problem-solving sessions.
  • Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
  • Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
  • Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.
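To illustrate the pipeline skills listed above (semi-structured data, deduplication, ETL), here is a small, hypothetical pure-Python sketch; real work at this level would run on Spark, but the shape of an extract-transform-load step is the same. All data and names are invented for illustration:

```python
import json

# Hypothetical semi-structured input: fields vary between records (schema
# drift) and one record is a duplicate. Data is invented for illustration.
raw_lines = [
    '{"id": 1, "amount": "10.5"}',
    '{"id": 2, "amount": 7, "currency": "GBP"}',
    '{"id": 1, "amount": "10.5"}',
]


def extract(lines):
    """Extract: parse each JSON line into a dict."""
    return [json.loads(line) for line in lines]


def transform(records):
    """Transform: normalise types, default missing fields, drop duplicate ids."""
    seen, out = set(), []
    for rec in records:
        if rec["id"] in seen:
            continue
        seen.add(rec["id"])
        out.append({
            "id": rec["id"],
            "amount": float(rec["amount"]),          # unify str/int to float
            "currency": rec.get("currency", "GBP"),  # default for drifted schema
        })
    return out


def load(records, sink):
    """Load: append clean records to a sink (a list stands in for a table)."""
    sink.extend(records)
    return sink


warehouse = load(transform(extract(raw_lines)), sink=[])
print(warehouse)  # two clean, deduplicated records
```

In a Spark pipeline the same three stages would be a read, a set of DataFrame transformations, and a write; keeping them as separate functions is also what makes the unit and integration testing the ad asks for practical.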

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given as to the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.

Frequently asked questions
What types of Data Engineer roles are available in Sheffield?
Sheffield offers a variety of Data Engineer roles, ranging from junior positions to senior and lead roles across industries such as healthcare, finance, and manufacturing.

Do I need certifications to get a Data Engineer job in Sheffield?
While certifications like AWS Certified Data Analytics or Google Cloud Professional Data Engineer can enhance your profile, most employers prioritize hands-on experience with tools like SQL, Python, and big data technologies.

What is the average salary for a Data Engineer in Sheffield?
The average salary for Data Engineers in Sheffield typically ranges from £40,000 to £60,000 annually, depending on experience and the complexity of the role.

Are remote or hybrid Data Engineer roles available?
Yes, many companies in Sheffield offer flexible working arrangements, including fully remote or hybrid roles, especially for experienced Data Engineers.

How do I apply for Data Engineer jobs through Haystack?
Simply browse the available Data Engineer listings on our Sheffield job page, create a profile, and submit your CV directly through our platform to apply for roles.