Job Title: AI Engineer - Python, LLMs
Location: Sheffield (Hybrid)
Type: Contract, Full-Time
We are looking for a skilled and experienced Analytics Engineer to join a rapidly growing organisation.
You will partner with business stakeholders to design and deliver impactful dashboards, analytical products, and metric frameworks.
You will build high-quality analytics datasets, semantic layers, and reusable data models that power reporting and advanced insights.
You will be responsible for:
Develop and maintain CI/CD pipelines for analytics workflows, ensuring reliable, automated testing and deployment.
Use Python or similar scripting languages to support analytics, automation, and data manipulation tasks.
Leverage LLMs and AI-assisted tools to accelerate insight generation, documentation, and development workflows.
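To illustrate the kind of automated testing a CI/CD pipeline for analytics workflows might run, here is a minimal, hypothetical Python data-quality check. The dataset, field names, and rules are invented for illustration and are not taken from this role:

```python
# Minimal sketch of an automated data-quality check that a CI/CD
# pipeline could run against an analytics dataset before deployment.
# The dataset, column names, and thresholds here are hypothetical.

def validate_dataset(rows, key_field, required_fields):
    """Return a list of human-readable failures (empty list = pass)."""
    failures = []
    if not rows:
        failures.append("dataset is empty")
        return failures
    seen_keys = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing required field '{field}'")
        key = row.get(key_field)
        if key in seen_keys:
            failures.append(f"row {i}: duplicate key {key!r}")
        seen_keys.add(key)
    return failures

orders = [
    {"order_id": 1, "amount": 9.99, "region": "North"},
    {"order_id": 2, "amount": None, "region": "South"},
    {"order_id": 2, "amount": 4.50, "region": "South"},
]
issues = validate_dataset(orders, "order_id", ["order_id", "amount", "region"])
for issue in issues:
    print(issue)
```

A check like this would run in the pipeline's test stage, failing the deployment if any issues are reported.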
Key Skills and Experience you must have:
Strong SQL and data modelling skills.
Hands-on experience with enterprise analytics platforms such as Power BI and Tableau.
Experience in implementing CI/CD pipelines for Analytics.
Excellent experience with Python.
Excellent experience working with large Enterprise Data Platforms.
Experience using LLMs or AI-assisted tools to accelerate analytics and insight generation.
You must be a team player with the ability to work in a collaborative environment.
If the role is of interest, please send across your CV.
Location: Sheffield
Contract Type: Permanent
Hours: Full time
Salary: £45,000 - £60,000 p/a depending on experience
The Data Migration Specialist is responsible for managing and delivering high-quality data migrations from legacy Case/Practice Management Systems (CMS/PMS) into our platform. This role owns the end-to-end migration lifecycle: extraction, transformation, validation, reconciliation, and loading of data using API-led and ETL approaches.
In addition, this role will advise on and help shape an internal data warehouse that will act as the foundation for migration preparation, cleansing, auditability, and as a single source of truth for database migrations.
Key Responsibilities
Lead the data migration workstream, including planning, scoping, mapping, cleansing, testing, and loading.
Extract data from legacy CMS/PMS and related systems and transform datasets to meet target data models.
Use API-based tools and integration frameworks to perform data imports, updates, and reconciliations.
Deliver migrations across core data domains including: Accounts / financial data, Matter / case data, Contacts, Documents and related metadata
Identify data quality issues, perform remediation, and work closely with clients to validate migrated data.
Execute trial migrations, delta migrations, and final production cutover loads.
Advise on and contribute to the design and development of an internal data warehouse to: Support migration data preparation and cleansing, Provide auditability and reconciliation reporting, Act as a single source of truth for migration datasets
Collaborate closely with the Implementation Lead and project team to align migration activities with overall delivery timelines.
Produce and maintain clear documentation for: Data mapping and transformation rules, Migration processes and runbooks, Data validation and reconciliation procedures
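The validation and reconciliation responsibilities above can be sketched in miniature. The following Python example is an illustrative sketch, not the employer's actual tooling; all record and field names are invented. It compares source and target datasets by primary key and field checksum, the basic shape of a post-migration reconciliation report:

```python
# Illustrative sketch of a simple post-migration reconciliation:
# compare source and target datasets by primary key and flag missing,
# unexpected, and mismatched records. All names here are hypothetical.
import hashlib

def fingerprint(record, fields):
    """Stable hash of the selected fields, for cheap row comparison."""
    joined = "|".join(str(record.get(f, "")) for f in fields)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def reconcile(source, target, key, fields):
    src = {r[key]: fingerprint(r, fields) for r in source}
    tgt = {r[key]: fingerprint(r, fields) for r in target}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

source = [{"matter_id": "M1", "client": "Acme", "balance": 100},
          {"matter_id": "M2", "client": "Bray", "balance": 250}]
target = [{"matter_id": "M1", "client": "Acme", "balance": 100},
          {"matter_id": "M2", "client": "Bray", "balance": 999}]
report = reconcile(source, target, "matter_id", ["client", "balance"])
print(report)
```

In practice the same pattern runs against extracts from the legacy CMS/PMS and the target platform, with the report feeding the audit trail.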
Skills & Experience Required
Proven experience delivering data migrations in legal, financial, accounts, or SaaS environments.
Strong understanding of ETL processes, data mapping, data quality, and validation techniques.
Hands-on experience using APIs for data import/export and system integrations.
Experience with scripting or development languages such as Python and/or C#.
Strong SQL skills, including querying, data analysis, and validation.
Experience working with relational databases and structured data models.
Ability to analyse, troubleshoot, and resolve data discrepancies efficiently and methodically.
Excellent organisational, documentation, and stakeholder communication skills.
Understanding of data governance, auditability, and reconciliation best practices.
Desirable
Experience working in professional services, legal tech, or enterprise SaaS implementations.
Joining etiCloud isn't just about the job, it's so much more. We want you to forge a successful and rewarding career in the IT industry. You'll be supported every step of the way in a friendly, professional environment where you and your future matter.
Here's a quick overview of what you can expect when you become part of our team:
Competitive salary with regular reviews to reward your progress
Annual company bonus recognising your hard work
Career development through ongoing training, support, and progression opportunities
28 days annual leave
Company pension scheme to support your future
Supportive, friendly team with a down-to-earth culture
Health & wellbeing benefits, including private medical insurance, health cash plan, and mental health support
Modern, secure Sheffield office with kitchen facilities and a coffee machine
Weekly fresh fruit as part of our wellbeing initiatives
Free onsite parking
Apply now and take your next step in the world of tech with etiCloud!
You may also have experience in the following: Data Migration Specialist, Data Migration Consultant, ETL Developer, Data Engineer, SQL Developer, Database Migration Consultant, Technical Implementation Consultant, Data Integration Specialist, Systems Integration Engineer, Data Warehouse Developer, Migration Engineer, Data Conversion Specialist, Application Data Consultant, Legal Tech Data Specialist, SaaS Implementation Data Consultant
REF-224 949
We are currently looking for an experienced ML & AI Engineer to join a major technology program delivering advanced AI-driven solutions within the banking sector. The role involves working on innovative AI initiatives, building scalable infrastructure, and developing intelligent systems that power agent-based workflows and conversational AI platforms.
You will collaborate with cross-functional teams to design and implement next-generation AI capabilities and help drive the evolution of AI-powered products.
Program Scope
Develop and provision infrastructure that supports agentic AI workflows across both Azure and Google Cloud Platform (GCP) environments.
Provide data science expertise to support the design of agent-based solutions, including Coach AI and future AI Assistant capabilities.
Create integration patterns for AI agents to interact with banking systems and perform actions on behalf of customers.
Contribute to the development of new AI products within the Conversational Banking Lab.
Key Initiatives Include
Agent Summarisation
Develop advanced capabilities to summarise complex and nuanced customer conversations.
App Search Evolution
Transform existing vector search functionality into a fully generative AI-driven search experience, creating a single unified interface for users.
Evaluation Methods
Build automated evaluation frameworks to test and validate both deterministic and generative AI conversations at scale.
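A minimal sketch of such an evaluation framework for the deterministic case might look like the following Python harness. The bot and test cases are hypothetical stand-ins; a real framework would add fuzzier scoring for generative responses:

```python
# Hedged sketch of an automated evaluation harness for deterministic
# conversation flows: replay scripted turns through a bot callable and
# score exact-match responses. The bot and cases are hypothetical.

def evaluate(bot, cases):
    """Run each (user_turn, expected_reply) case and return a score report."""
    results = []
    for user_turn, expected in cases:
        actual = bot(user_turn)
        results.append({"input": user_turn, "expected": expected,
                        "actual": actual, "passed": actual == expected})
    passed = sum(r["passed"] for r in results)
    return {"pass_rate": passed / len(results), "results": results}

def toy_bot(turn):
    # Stand-in for a real conversational system.
    replies = {"balance": "Your balance is available in the app.",
               "hours": "We are open 9am-5pm."}
    return replies.get(turn, "Sorry, I did not understand.")

report = evaluate(toy_bot, [
    ("balance", "Your balance is available in the app."),
    ("hours", "We are open 9am-5pm."),
    ("loans", "Our loan team will call you."),
])
print(f"pass rate: {report['pass_rate']:.0%}")
```

Generative conversations would swap the exact-match comparison for an LLM-as-judge or semantic-similarity scorer, but the replay-and-score loop stays the same.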
Required Skills & Experience
Must Have
Strong Python development skills, with 2+ years of experience building production-grade applications using Large Language Models (LLMs).
Solid understanding of software engineering principles, including:
Microservices architecture
CI/CD pipelines
Event-driven architecture
Hands-on experience with AI engineering practices, including:
RAG (Retrieval-Augmented Generation) pipelines
Prompt engineering
LLMOps
Runtime monitoring and evaluation of AI systems
Experience with Vertex AI.
Experience in data engineering, including building scalable data pipelines using Python and Spark.
Strong knowledge of GCP-native services, including:
BigQuery (BQ)
Spanner
Dataflow
Firestore
Nice to Have
Experience with Agentic AI frameworks, such as:
LangGraph
ADK
CrewAI
Multi-agent architectures
Experience building deployable AI solutions (production environments rather than notebook-only solutions).
Knowledge of data ontologies and graph-based data models.
Exposure to Agile or Scrum development methodologies.
Location: Hybrid (60% Office / 40% Remote) Sheffield
Contract: 6 Month Contract (Extension possible)
Rate: £400+ per day inside IR35
Role Overview
We are seeking an experienced Lead Data Engineer to design, develop, and optimise enterprise-scale data platforms for large, regulated organisations, ideally in banking, financial services, or other regulated sectors. This role requires hands-on technical expertise, leadership, and a consulting mindset to deliver scalable, resilient data solutions while promoting best practices and operational excellence.
Key Responsibilities
Essential Skills & Experience
Consulting & Soft Skills
If this role aligns with your skills and experience, we’d love to hear from you. Apply today to be considered.
We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms. The ideal candidate will possess hands-on expertise in the following areas:
Extensive enterprise experience with Hadoop, Spark, and Splunk.
Proficiency in object-oriented and functional scripting, particularly in Python.
Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
Experience integrating large, disparate datasets using modern tools and frameworks.
Strong background in building and optimizing ETL/ELT data pipelines.
Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
Ability to pair program and work effectively with other engineers.
Excellent analytical and problem-solving abilities.
Knowledge of agile methodologies such as Scrum or Kanban is a plus.
Comfortable representing the team in standups and problem-solving sessions.
Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.
About the opportunity
Are you ready to launch a career in Data Analytics and Business Intelligence?
Netcom Training's fully-funded Data course (NCFE Certificate in Data, Level 3) equips you with the technical skills employers are actively seeking. From data sourcing, cleansing, and analysis to visualisation and reporting, you'll gain hands-on experience that prepares you for today's fast-growing data-driven roles.
Our learners have successfully moved into roles such as Junior Data Analyst, Operations Analyst, Business Intelligence Assistant, Database Administrator, and Pricing Analyst, working across tech, finance, healthcare, and the public sector.
Complete the course and gain a guaranteed interview with a leading employer, helping you kickstart your career.
Course Details
What you'll learn
Career Pathway
Successful participants are guaranteed an interview with our network of UK-wide partners working with leading brands.
Potential Roles:
Eligibility
This is a government-funded opportunity. To apply, you must:
Cost
This is a fully-funded course with no fees. Complete the training, gain essential data skills, and secure your guaranteed interview, provided you meet the learner obligations outlined in our employability terms and conditions, which can be found on our website.
Starting Salaries: Typically £22,000 - £28,000 (role dependent)
Key Responsibilities:
Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
Ensure security, compliance, and best practices for data pipelines and observability platforms.
Document data flows, schemas, dashboards, and operational runbooks.
Required Skills:
Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/KSQL/KStream).
Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
Understanding of hybrid cloud and multi-cluster telemetry patterns.
Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
Good problem-solving skills and ability to work in a collaborative team environment.
Strong communication and documentation skills.
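The transform-and-enrich consumer step described in this role can be sketched as follows. This is a hedged illustration using only the Python standard library: a real service would consume from Kafka and forward to the Splunk HTTP Event Collector, and all field names here are assumptions:

```python
# Minimal sketch of the validate-and-enrich step a Kafka consumer
# service might apply to OpenShift telemetry events before forwarding
# them to Splunk. Field names are hypothetical; a real service would
# read from a Kafka consumer and write via the Splunk HEC API.
import json

REQUIRED = ("timestamp", "cluster", "namespace", "metric", "value")

def enrich(event_bytes, environment):
    """Validate a raw JSON event and tag it with routing metadata.

    Returns the enriched event dict, or None if the event is malformed
    (a real pipeline would route these to a dead-letter topic).
    """
    try:
        event = json.loads(event_bytes)
    except json.JSONDecodeError:
        return None
    if any(field not in event for field in REQUIRED):
        return None
    event["environment"] = environment
    event["index"] = f"telemetry_{event['cluster']}"  # Splunk routing hint
    return event

raw = (b'{"timestamp": 1700000000, "cluster": "prod-1", '
       b'"namespace": "payments", "metric": "cpu", "value": 0.71}')
print(enrich(raw, "production"))
print(enrich(b"not json", "production"))
```

Schema-registry validation (Avro/Protobuf) would replace the hand-rolled field check in production, but the validate, enrich, route shape is the same.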
We are accepting CVs for this position, so it's easier than ever to apply.
Exciting things are happening within the Digital Services Team at Derbyshire County Council and we're looking for a Data Architect to join our Strategy and Architecture team.
A major review of our organisation and strategies has been completed and we have invested in new tools to support a comprehensive and innovative Digital Services transformation programme, to support the delivery of key services to the citizens and communities of Derbyshire.
This role will be pivotal in helping us deliver our transformation programme, as a key post within the Digital Services, Strategy and Architecture function. This is an excellent opportunity for someone who has significant experience in Data Architecture and delivering change, and who really wants to make a difference.
Key Responsibilities of this role are:
Based within a dynamic and challenging environment, you will be working in several key areas including:
What skills and experience do you need to have?
What we offer you:
Our Digital Services Team is contractually based in Matlock, Derbyshire, but we work on a hybrid basis.
Derbyshire County Council is going through a period of positive development and transformation, so it's a great time to join and drive change.
If you’re passionate about Data and are ready to revolutionise the way Derbyshire County Council operates in the digital era then we would love to hear from you.
If you would like to discuss the role further, please see our website for contact details
Don’t miss out on this exciting opportunity to contribute to the digital revolution at Derbyshire County Council. Apply now and be at the forefront of shaping the future.
Derbyshire County Council is an equal opportunities employer and we welcome applications from individuals of all backgrounds, experiences, and abilities.
All candidates must be able to provide proof of right to work in the UK.
Provisional Interview Date: Week Commencing 23 March 2026
This role currently offers hybrid working options, subject to service needs; there'll be an opportunity to discuss working arrangements for this position at interview.
Important: Derbyshire County Council holds a sponsorship licence, but this role does not offer visa sponsorship. Please apply only if you already have the right to work in the UK without sponsorship, as applications that do not meet this requirement will be rejected.
We welcome applications from individuals who share our values being Collaborative, Innovative, Empowered and Accountable. These values describe who we are and what we stand for as a council. They help shape our culture, encourage consistent behaviour and guide how we work together to make a positive difference for both our employees and the residents we serve.
Role: Lead Data Engineer - Azure/Databricks
Location: Sheffield
Working Model: Hybrid - 2 days per week in person
Salary: Up to £85k depending on experience
Own the data platform. Shape the future architecture.
This role is for a hands-on data engineering leader who wants to build something properly, not babysit legacy pipelines.
You'll take full ownership of the data engineering domain, leading the design and building of a modern, highly scalable data platform that handles complex, high-frequency industrial data. You'll work closely with data scientists, software engineers, and senior leadership, with the trust and mandate to make meaningful architectural decisions.
If you enjoy balancing strong engineering principles with real-world business needs, and you like mentoring others while staying deeply technical, this role is built for you.
What you'll be working on
You won't just be delivering features; you'll be setting direction.
The tech you'll work with
What makes you a great fit
You're someone who can zoom out to architecture and zoom in to code, comfortably.
Industry experience
Why you'll want this role
What you'll get
We welcome diverse applicants and are dedicated to treating all applicants with dignity and respect, regardless of background.
Sheffield City Centre & Home working (2 days per week)
Up to £48,000 + Bonus + Free Parking + Other Benefits
Your new role
As a Power BI / Data Analyst you will help deliver the strategic vision with its subsidiaries. The role is to provide insight and data to our internal companies, whether this be high-level interactive performance indicators and dashboards, or detailed data extracts using reports and Power BI. Your expertise will be invaluable in delivering new ways of accessing our data, understanding trends and visual reporting to better inform all levels of our people.
Responsibilities
Be proactive in identifying issues and actioning change.
Lead the development of new Power BI reports and the improvement of existing reports, applying the latest methods and best practices.
Manage workspaces and settings within the Power BI Service.
Share specialist knowledge on Power BI and related topics with members of the IT team.
Ensure the correct security permissions are assigned to authorised users and are regularly reviewed.
Work with users at all levels within the organisation to gather, understand and document reporting requirements.
Provide training, documentation and support to relevant departments when delivering solutions.
Assist with the development and maintenance of simple apps and workflows within the Power Platform and show a willingness to advance this knowledge over time.
Articulate the capabilities and limitations of Power BI clearly to personnel at all levels of the organisation.
Work with data owners to investigate data accuracy and validity in various data-related projects.
Work with teams across the organisation to assist in data collection and analysis, in line with relevant legislation such as GDPR.
Evaluate user needs and system functionality for reporting purposes.
Experience needed
Proven track record of consolidating data from multiple sources into a single report or a group of reports.
Ability to 'tell a story' with one or multiple sets of data, presenting this appropriately for the intended audience.
Ability to develop Power BI reports/dashboards and publish these in the Power BI Service.
Good understanding of Power Apps (model-driven and canvas), Power Fx and Power Automate development.
Ability to provide support and documentation to end users.
An understanding of creating reports from Dataverse, with particular emphasis on D365 data.
Experience administering reports and workspaces in the Power BI Service.
Experience using SQL, Power Query and Data Analysis Expressions (DAX).
Proven track record in delivering application-based reporting solutions (dashboards).
Experience in using Azure DevOps, JIRA or similar tools.
A passion for Business Intelligence and Data and how they can add value.
Demonstrable expertise in data handling and how it can add value.
Understanding of the data warehouse lifecycle, such as ETL.
Demonstrable experience in managing and reporting from large and small data sets.
Good interpersonal skills and the ability to work effectively in a team.
The ability to design data structures to support reporting needs.
The ability to explain complex information to lay audiences.
Desirable
Microsoft Certified: Power BI Data Analyst Associate (PL-300)
Experience working with commercial data.
Experience running successful data visualisation projects.
Knowledge of Microsoft Fabric or data warehousing.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed)
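The data-consolidation skills this role asks for are usually exercised in Power Query and DAX; as a language-neutral sketch, the same left-join-and-aggregate pattern looks like this using Python's built-in sqlite3 module (the tables and columns are invented for illustration):

```python
# Illustration of consolidating two data sources into a single
# reporting view, here with Python's built-in sqlite3 rather than
# Power Query/DAX. Table and column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    CREATE TABLE targets (region TEXT, target REAL);
    INSERT INTO sales VALUES ('North', 120), ('North', 80), ('South', 60);
    INSERT INTO targets VALUES ('North', 150), ('South', 100);
""")
# One consolidated view: actual sales vs target, per region.
rows = con.execute("""
    SELECT t.region,
           COALESCE(SUM(s.amount), 0) AS actual,
           t.target,
           ROUND(COALESCE(SUM(s.amount), 0) / t.target, 2) AS attainment
    FROM targets t
    LEFT JOIN sales s ON s.region = t.region
    GROUP BY t.region, t.target
    ORDER BY t.region
""").fetchall()
for region, actual, target, attainment in rows:
    print(region, actual, target, attainment)
```

In Power BI the equivalent would be a relationship between the two tables plus a DAX measure dividing total sales by target.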
Role Title: Lead Data Engineer
Location: Sheffield/hybrid (3 days on site)
Duration: 9 months
Rate: £430 per day inside IR35
We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.
Experience required:
Extensive enterprise experience with Hadoop, Spark, and Splunk.
Proficiency in object-oriented and functional scripting, particularly in Python.
Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
Experience integrating large, disparate datasets using modern tools and frameworks.
Strong background in building and optimizing ETL/ELT data pipelines.
Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
Ability to pair program and work effectively with other engineers.
Excellent analytical and problem-solving abilities.
Knowledge of agile methodologies such as Scrum or Kanban is a plus.
Comfortable representing the team in standups and problem-solving sessions.
Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.
Soft Skills (Consultant):
Demonstrated ability and enthusiasm for enhancing team performance.
Strong active listening and effective communication skills.
Self-mastery, with a focus on positive mindsets and professional behaviours.
Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
Engages with industry groups and external vendors to represent and advance HSBC's interests and influence.
Takes accountability for ensuring control and compliance throughout the engineering process.
Champions innovation and the adoption of advanced technologies and best practices within the domain.
If you are interested in this role or wish to apply, please feel free to submit your CV.
Job Title: Lead Data Engineer - Hadoop, Spark, Python
Location: Sheffield - 3 days per week in the office
Salary/Rate: Up to £617 per day inside IR35
Start Date: 02/03/2026
Job Type: Contract until November
We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.
The ideal candidate will possess hands-on expertise in the following areas:
If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.
Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.