Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Senior Data Engineer
Lynx Recruitment Limited
London
Hybrid
Senior
£85,000
RECENTLY POSTED

Are you a Data Engineer passionate about building scalable, cloud-first data platforms that power business insights and innovation? Our client is seeking an experienced engineer to design robust, secure, and future-ready data solutions that support analytics, AI, and strategic decision-making.

What You'll Do:

  • Build, test, and maintain ETL/ELT pipelines for structured and unstructured data.
  • Design and optimise data warehouses, lakes, and cloud-native platforms.
  • Implement monitoring, security, and compliance frameworks in line with regulations (e.g., GDPR, financial standards).
  • Deliver high-quality datasets to analysts, data scientists, and business stakeholders.
  • Work with modern technologies, including cloud platforms, orchestration tools, and streaming frameworks.

What We're Looking For:

  • Strong SQL & Python skills.
  • Hands-on experience with ETL/ELT tools (e.g., Matillion, Talend, Fivetran, Azure Data Factory).
  • Experience designing and managing cloud data platforms (Snowflake essential; AWS/Azure/GCP desirable).
  • Solid data modelling knowledge and performance optimisation skills.
  • Awareness of data governance, security, and regulatory compliance.
  • Ability to work closely with technical and business teams to deliver impactful solutions.

Nice-to-Haves:

  • Orchestration tools (Airflow, dbt, Prefect) or streaming frameworks (Kafka, Kinesis).
  • CI/CD experience and DevOps practices for data workflows.
  • Exposure to data visualisation platforms (PowerBI, Tableau, MicroStrategy).
  • Experience in financial services or other highly regulated industries.
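
By way of illustration only (not part of the job spec), the ETL/ELT pipeline work described above can be sketched in Python using the standard-library sqlite3 module as a stand-in warehouse; the table and field names here are invented for the example:

```python
import sqlite3

# Hypothetical source rows (in a real pipeline these would come from an API or file drop).
raw_orders = [
    {"order_id": 1, "amount": "19.99", "country": "GB"},
    {"order_id": 2, "amount": "5.00", "country": "gb"},
    {"order_id": 3, "amount": None, "country": "FR"},  # bad record
]

def transform(rows):
    """Clean and normalise records, dropping rows that fail validation."""
    out = []
    for r in rows:
        if r["amount"] is None:
            continue  # a production pipeline would quarantine this row, not drop it
        out.append((r["order_id"], float(r["amount"]), r["country"].upper()))
    return out

def load(rows, conn):
    """Load transformed rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

In a real role the load target would be a platform such as Snowflake and the pipeline would be orchestrated by a tool like Airflow, but the extract/transform/load shape is the same.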
Data Platform Lead
Connells Limited
Milton Keynes
In office
Senior
Private salary
RECENTLY POSTED

We are seeking a Data Platform Lead to join our Group Technology team in Milton Keynes. You will play a leading role in delivering the formal requirements associated with the Connells Group Data Platform, including design, capacity, management and configuration management responsibilities. As the Data Platform Lead, you will manage a team of technical specialists across the Connells data estate on a day-to-day basis, liaising with third-party providers, mentoring and developing the team's core skills and expertise whilst maturing the overall processes and procedures in relation to the service. The role also supports the business objectives and strategy through the delivery of secure, supportable and scalable on-premises, cloud and hybrid data platforms.

Key Responsibilities:

  • Line management of a number of direct reports within the team, plus resource management across the team, ensuring that the relevant workloads (design, project delivery, operational and change) are delivered as agreed.
  • Act as the subject matter expert for the Data Platform capability.
  • Maintain the Data Platform for all Connells users, managing critical downtime and the risk of disruption.
  • Lead on ensuring an appropriate and robust response to applicable incidents, and ensure that root cause analysis and resolution occur. In addition, ensure an applicable out-of-hours support model is in place as required for 24x7x365 operational running.

Team Roles and Responsibilities:

  • Owns the Data Platform designs and supplier relationship
  • Incident and Change
  • Support projects where appropriate
  • Undertake proactive monitoring and react to escalations from other IT teams
  • Ensures Patches and Upgrades are implemented in line with operational limits

Experience and Skills Required:

Essential:

  • Demonstrable experience in similar, relevant technical managerial roles
  • Strong experience of incident resolution, requests, changes and problem-solving activities delivered to agreed SLAs.
  • Experience of implementing Cloud Technologies
  • Experience of Microsoft Fabric and SQL Server
  • Extensive experience of implementing, managing and supporting data platforms in a demanding environment
  • Experience of JIRA and Confluence
  • Understanding of FinOps; PaaS; Alerting and Monitoring

Desirable:

  • Experience of managing teams of 5+ people in a complex, challenging environment, setting and managing against SLAs and business objectives.
  • Experience of GitHub, Git Actions, Terraform, Zero Trust architectures and PaaS.
  • Preferably educated to graduate level with a bachelor's degree in business, computer science or a related field

Connells Group UK is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of sex, race, disability, age, sexual orientation, transgender status, religion or belief, marital status, or pregnancy and maternity.

Don't meet every single requirement? Studies have shown that women and people of colour are less likely to apply to jobs unless they meet every single qualification. At Connells Group we are dedicated to building a diverse, inclusive and authentic workplace. So, if you're excited about this role but your experience doesn't fit perfectly with every aspect of the job description, we encourage you to apply anyway. You may be just the right candidate for this or other opportunities.

Azure Data Architect
ARC IT Recruitment Ltd
Brighton
Hybrid
Mid - Senior
£90,000
RECENTLY POSTED

Brighton, East Sussex, £80 - £90k

An Azure Data Architect is required by our client, a fast-growing, technically advanced, multi-award-winning company going through an extended period of growth.

Key responsibilities:

  • Design, develop, and maintain data architectures using Azure Databricks and Synapse.
  • Collaborate with cross-functional teams to understand data requirements and translate them into robust architectural solutions.
  • Optimize data workflows and ensure seamless integration with various data sources.
  • Implement data governance and security best practices.
  • Provide technical leadership and guidance to the development team.
  • Conduct performance tuning and optimization of data processes.

Skills required:

  • Proven experience as a Data Architect working within Azure.
  • Expertise in Azure Databricks and Azure Synapse.
  • Strong understanding of data modelling, ETL processes, and data warehousing concepts.
  • Proficiency in SQL, Python, and other relevant programming languages.
  • Experience with data governance, data quality, and data security best practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and stakeholder management skills.

Worthing-based hybrid opportunity, easily commutable from Portsmouth (Hampshire), Guildford (Surrey) or Brighton (East Sussex).
Azure, Synapse, Databricks
Brighton, Hybrid (3 days in the office)

Data Engineer (Automation)
Network IT
Milton Keynes
Hybrid
Mid - Senior
£55,000
RECENTLY POSTED

Role: Data Engineer (Automation)

Location: Milton Keynes (Hybrid 3 Days In-Office Weekly)

Salary: £45,000 - £55,000

Network IT are supporting a large, enterprise-scale organisation as they continue to evolve and modernise their Data Analytics and Automation Platform; we're looking for an experienced Data Engineer to design, build, and optimise secure, automated data pipelines that enable scalable analytics, business intelligence, and data-driven decision-making across multiple business units.

This is a highly technical role with strong exposure to cloud and on-prem data platforms, automation, and emerging AI-driven capabilities, and is specifically suited to candidates with strong, hands-on data engineering experience, particularly in building, operating, and optimising complex data pipelines, data models, and integration workflows at scale.

Role Overview and Responsibilities

As a Data & Automation Engineer, you will be responsible for the end-to-end delivery, operation, and continuous improvement of enterprise data pipelines and analytics platforms. You'll work closely with architects, application managers, and international teams to ensure data solutions are reliable, scalable, and aligned with data governance standards.

Key responsibilities include:

Designing, developing, and maintaining automated end-to-end data pipelines across cloud and on-premise source systems.

Delivering reliable data ingestion, transformation, and delivery processes using technologies such as Azure Data Factory, Databricks, SSIS, and SQL.

Reducing manual interventions through automation and standardisation, including data preparation, feature engineering, and training-data pipelines.

Preparing data models and datasets (DWH / Lakehouse) to support business intelligence, analytics, and operational reporting.

Monitoring and supporting live data pipelines, resolving issues in line with ITIL best practices, and implementing proactive alerting and self-healing mechanisms.

Identifying and implementing performance optimisations across data pipelines, queries, and reporting workloads.

Supporting data governance processes, including data archiving, masking, encryption, and versioning, with opportunities to integrate AI-driven automation.

Contributing to CI/CD processes for data pipelines, ensuring releases are tested, compliant, and deployed with minimal operational impact.

Collaborating with cross-functional and international teams to support aligned, scalable data operations.

Supporting the ongoing evolution and execution of the organisation's data strategy.

Essential Skills and Experience

To be successful in this role, you will bring:

Strong commercial experience delivering end-to-end data engineering and automation solutions in complex environments.

Advanced SQL skills, including performance tuning and optimisation across large datasets.

Knowledge of Python or R for data processing or analytics.

Hands-on experience with data integration and ingestion tools such as Azure Data Factory and Databricks.

Proven experience in data modelling, data warehousing, and relational database platforms (e.g. MS SQL Server).

Experience designing cloud-native and/or on-prem data solutions.

Exposure to automation and AI-enabled data processes, with awareness of LLM use cases within data workflows.

Experience working in Agile delivery environments (Scrum, Kanban, DevOps).

Strong analytical and problem-solving skills, with the ability to translate complex data into meaningful insights.

Excellent communication skills and the ability to work effectively with both technical and non-technical stakeholders.

Desirable Experience

Experience coordinating or supporting AI / ML initiatives within data platforms.

Experience working within large, regulated, or international organisations.
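
As a rough sketch of the "proactive alerting and self-healing mechanisms" mentioned in the responsibilities above (illustrative only; the function names are invented, not from the job spec), a pipeline step can be wrapped in a retry-with-backoff helper that raises an alert only once retries are exhausted:

```python
import time

def run_with_retries(task, retries=3, base_delay=0.01, alert=print):
    """Run a pipeline step, retrying with exponential backoff; alert on final failure."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == retries:
                alert(f"pipeline step failed after {retries} attempts: {exc}")
                raise
            # back off before the self-healing retry
            time.sleep(base_delay * 2 ** (attempt - 1))

# Demo: a flaky step that succeeds on the third attempt.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source outage")
    return "loaded"

result = run_with_retries(flaky_step)
```

In practice the `alert` callback would post to a monitoring channel (e.g. an incident tool) rather than print, and orchestrators such as Azure Data Factory or Airflow provide this retry behaviour as configuration.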


Hedge Fund - Senior C++ Quant Developer - Equities - Linux - Python - Data/Algos/Low latency
Scope AT Limited
London
In office
Senior
Private salary
RECENTLY POSTED

Hedge Fund background essential

C++ (C++11 onwards), Linux, Python (nice to have). Trading systems experience, ideally within the equities space.
Ideally the candidate has experience with algo implementation.

Quantitative Developer - Equities Technology

We are in search of a Quantitative Developer to join our team who is passionate about designing, architecting, and implementing low latency C++ systems that are not only robust, resilient, and accurate, but also exceptionally fast. Our team works directly with the firm’s central trading teams. By constructing and maintaining this high-performance infrastructure used by these teams, this developer will enable new trading opportunities across businesses and regions, allowing the best possible execution performance.

Job Duties

  • Development of execution algorithms, order management systems, strategy containers, connectivity, and messaging systems.
  • Work directly with central trading teams to optimize the firm’s overall execution performance.
  • Enhance the platform’s efficiency by utilizing network and systems programming, along with other advanced techniques to reduce latency.
  • Create systems, interfaces, and tools for historical market data and trading simulations to boost research productivity and system testability.
  • Assist in building and maintaining our automated tests, performance benchmark framework, and other tools
  • Collaborate closely with trading teams to gather requirements and develop solutions in a fast-paced environment

Qualifications

  • 5+ years of professional experience in a Front Office, financial services environment as a senior contributor
  • 10+ years cumulative, professional experience
  • Strong background in data structures, algorithms, and object-oriented programming in C++

Permanent role - Central London based - 5 days a week in the office

By applying to this job you are sending us your CV, which may contain personal information. Please refer to our Privacy Notice to understand how we process this information. In short, in order to supply you with work-finding services, we will hold and process your personal data, and only with your express permission will we share this personal data with a client (or a third party working on behalf of the client) by email or by upload to the client's or third party's vendor management system. By giving us permission to send your CV to a client, this constitutes permission to share the personal data that would be necessary to consider your application, interview you (phone/video/face-to-face) and, if successful, hire you.

Scope AT acts as an employment agency for Permanent Recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the Terms and Conditions, Data Protection Policy, Privacy Notice and Disclaimers which can be found at our website.

Lead Big Data Ops Engineer
Hunter Bond
London
Hybrid
Senior
£90,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

My leading Tech client are looking for a talented and motivated individual to ensure the resilience, performance, and cost-effectiveness of their Azure-based data platform. This role is essential to their data ecosystem, combining platform reliability, incident response, SLA management, cost optimization (FinOps), and deployment oversight.

You will be the single point of contact for operational issues, driving rapid resolution during outages, leading communications with stakeholders, and shaping the processes that keep their platform running smoothly and efficiently.

This is a newly created role in a growing business. A brilliant opportunity!

The following skills/experience is required:

  • Proven operational leadership for large-scale data platforms.
  • Expertise in incident management, SLA enforcement, and stakeholder communication.
  • Hands-on experience with Azure Synapse, Databricks, ADF, Power BI.
  • Familiarity with CI/CD and automation.
  • Strong FinOps mindset and cost management experience.
  • Knowledge of monitoring and observability frameworks.

Salary: Up to £90,000 + bonus + package

Level: Lead Engineer

Location: London (good work from home options available)

If you are interested in this Lead Big Data Ops Engineer position and meet the above requirements please apply immediately.

Lead DataOps Engineer - Big Data
Hunter Bond
London
Hybrid
Senior
£90,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

My leading Tech client are looking for a talented and motivated individual to ensure the resilience, performance, and cost-effectiveness of their Azure-based data platform. This role is essential to their data ecosystem, combining platform reliability, incident response, SLA management, cost optimization (FinOps), and deployment oversight.

You will be the single point of contact for operational issues, driving rapid resolution during outages, leading communications with stakeholders, and shaping the processes that keep their platform running smoothly and efficiently.

This is a newly created role in a growing business. A brilliant opportunity!

The following skills/experience is required:

  • Proven operational leadership for large-scale data platforms.
  • Expertise in incident management, SLA enforcement, and stakeholder communication.
  • Hands-on experience with Azure Synapse, Databricks, ADF, Power BI.
  • Familiarity with CI/CD and automation.
  • Strong FinOps mindset and cost management experience.
  • Knowledge of monitoring and observability frameworks.

Salary: Up to £90,000 + bonus + package

Level: Lead Engineer

Location: London (good work from home options available)

If you are interested in this Lead DataOps Engineer (Big Data) position and meet the above requirements please apply immediately.

Scala Data Engineer
Sky
Multiple locations
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

We believe in better. And we make it happen.

Better content. Better products. And better careers.

Working in Tech, Product or Data at Sky is about building the next and the new. From broadband to broadcast, streaming to mobile, SkyQ to Sky Glass, we never stand still. We optimise and innovate.

We turn big ideas into the products, content and services millions of people love.

And we do it all right here at Sky.

What you’ll do

  • Design and implement scalable APIs and backend services, primarily in Scala, to integrate ML models into production systems and deliver personalised experiences.
  • Real time data processing and gRPC microservices (Typelevel stack).
  • Take end-to-end ownership of services, from development to production operations
  • Optimising the performance of the application in the cloud environments
  • Creating/improving automated pipelines that support our Continuous Delivery process
  • Build, scale and maintain large scale cloud-based services
  • Work closely with data scientists, ML engineers, and product teams to align technical solutions with business goals.
  • Refining team processes to integrate continuously and working towards a continuously deliverable application.
  • Championing best practices to develop clean, resilient code that performs at serious scale.
  • Coaching and providing feedback to fellow developers.

What you’ll bring

  • Strong software engineering skills with experience in Scala, ideally the typelevel stack (bonus if you have exposure to Golang and Python).
  • Interest in machine learning, personalisation systems and cloud technology - even if you haven’t worked extensively in ML before.
  • Demonstrated experience designing, implementing, deploying, and maintaining production-grade APIs and backend services, including responsibility for reliability, performance, and on-call support.
  • Hands-on experience working with data processing frameworks and distributed systems used to ingest, process, and store large-scale datasets, with an understanding of scalability, fault tolerance, and performance considerations.
  • Practical experience with modern software development practices, including automated CI/CD pipelines, containerisation technologies (e.g., Docker), and deploying applications to cloud environments (e.g., AWS or GCP).
  • Ability to collaborate effectively across teams and communicate technical concepts clearly.
  • A problem-solving mindset and eagerness to learn new technologies and approaches.
  • Ability to challenge technical choices, architecture, tools and processes.

Team overview

Global OTT Technology

Our team develops and supports market-leading video streaming services, underpinned by state-of-the-art engineering principles. We do this at huge scale: for over 50 million customers globally, spanning NBCUniversal Peacock in the US and Sky, NOW and SkyShowtime across Europe. No matter the device, the time or the place, we make sure that our diverse audiences can easily find and enjoy whatever they want to watch, choosing from the world’s best entertainment, news and sport.

The rewards

There’s one thing people can’t stop talking about when it comes to working at Sky: the perks. Here’s a taster:

Sky Q, for the TV you love all in one place

The magic of Sky Glass at an exclusive rate

A generous pension package

Private healthcare

Discounted mobile and broadband

A wide range of Sky VIP rewards and experiences

Inclusion & how you’ll work

We are a Disability Confident Employer, and welcome and encourage applications from all candidates. We will look to ensure a fair and consistent experience for all, and will make reasonable adjustments to support you where appropriate. Please flag any adjustments you need to your recruiter as early as you can.

We’ve embraced hybrid working and split our time between unique office spaces and the convenience of working from home. You’ll find out more about what hybrid working looks like for your role later on in the recruitment process.

Your office space

Osterley

Our Osterley Campus is a 10-minute walk from Syon Lane train station. Or you can hop on one of our free shuttle buses that run to and from Osterley, Gunnersbury, Ealing Broadway and South Ealing tube stations. There are also plenty of bike shelters and showers.

On campus, you’ll find 13 subsidised restaurants, cafes, and a Waitrose. You can keep in shape at our subsidised gym, catch the latest shows and movies at our cinema, get your car washed, and even get pampered at our beauty salon.

We’d love to hear from you

Inventive, forward-thinking minds come together to work in Tech, Product and Data at Sky. It’s a place where you can explore what if, how far, and what next.

But better doesn’t stop at what we do, it’s how we do it, too. We embrace each other’s differences. We support our community and contribute to a sustainable future for our business and the planet.

If you believe in better, we’ll back you all the way.

Just so you know: if your application is successful, we’ll ask you to complete a criminal record check. And depending on the role you have applied for and the nature of any convictions you may have, we might have to withdraw the offer.

Data Engineer (Scala)
Sky
Multiple locations
Hybrid
Mid - Senior
Private salary

We believe in better. And we make it happen.

Better content. Better products. And better careers.

Working in Tech, Product or Data at Sky is about building the next and the new. From broadband to broadcast, streaming to mobile, SkyQ to Sky Glass, we never stand still. We optimise and innovate.

We turn big ideas into the products, content and services millions of people love.

And we do it all right here at Sky.

What you’ll do

  • Design and implement scalable APIs and backend services, primarily in Scala, to integrate ML models into production systems and deliver personalised experiences.
  • Real time data processing and gRPC microservices (Typelevel stack).
  • Take end-to-end ownership of services, from development to production operations
  • Optimising the performance of the application in the cloud environments
  • Creating/improving automated pipelines that support our Continuous Delivery process
  • Build, scale and maintain large scale cloud-based services
  • Work closely with data scientists, ML engineers, and product teams to align technical solutions with business goals.
  • Refining team processes to integrate continuously and working towards a continuously deliverable application.
  • Championing best practices to develop clean, resilient code that performs at serious scale.
  • Coaching and providing feedback to fellow developers.

What you’ll bring

  • Strong software engineering skills with experience in Scala, ideally the typelevel stack (bonus if you have exposure to Golang and Python).
  • Interest in machine learning, personalisation systems and cloud technology - even if you haven’t worked extensively in ML before.
  • Demonstrated experience designing, implementing, deploying, and maintaining production-grade APIs and backend services, including responsibility for reliability, performance, and on-call support.
  • Hands-on experience working with data processing frameworks and distributed systems used to ingest, process, and store large-scale datasets, with an understanding of scalability, fault tolerance, and performance considerations.
  • Practical experience with modern software development practices, including automated CI/CD pipelines, containerisation technologies (e.g., Docker), and deploying applications to cloud environments (e.g., AWS or GCP).
  • Ability to collaborate effectively across teams and communicate technical concepts clearly.
  • A problem-solving mindset and eagerness to learn new technologies and approaches.
  • Ability to challenge technical choices, architecture, tools and processes.

Team overview

Global OTT Technology

Our team develops and supports market-leading video streaming services, underpinned by state-of-the-art engineering principles. We do this at huge scale: for over 50 million customers globally, spanning NBCUniversal Peacock in the US and Sky, NOW and SkyShowtime across Europe. No matter the device, the time or the place, we make sure that our diverse audiences can easily find and enjoy whatever they want to watch, choosing from the world’s best entertainment, news and sport.

The rewards

There’s one thing people can’t stop talking about when it comes to working at Sky: the perks. Here’s a taster:

Sky Q, for the TV you love all in one place

The magic of Sky Glass at an exclusive rate

A generous pension package

Private healthcare

Discounted mobile and broadband

A wide range of Sky VIP rewards and experiences

Inclusion & how you’ll work

We are a Disability Confident Employer, and welcome and encourage applications from all candidates. We will look to ensure a fair and consistent experience for all, and will make reasonable adjustments to support you where appropriate. Please flag any adjustments you need to your recruiter as early as you can.

We’ve embraced hybrid working and split our time between unique office spaces and the convenience of working from home. You’ll find out more about what hybrid working looks like for your role later on in the recruitment process.

Your office space

Osterley

Our Osterley Campus is a 10-minute walk from Syon Lane train station. Or you can hop on one of our free shuttle buses that run to and from Osterley, Gunnersbury, Ealing Broadway and South Ealing tube stations. There are also plenty of bike shelters and showers.

On campus, you’ll find 13 subsidised restaurants, cafes, and a Waitrose. You can keep in shape at our subsidised gym, catch the latest shows and movies at our cinema, get your car washed, and even get pampered at our beauty salon.

We’d love to hear from you

Inventive, forward-thinking minds come together to work in Tech, Product and Data at Sky. It’s a place where you can explore what if, how far, and what next.

But better doesn’t stop at what we do, it’s how we do it, too. We embrace each other’s differences. We support our community and contribute to a sustainable future for our business and the planet.

If you believe in better, we’ll back you all the way.

Just so you know: if your application is successful, we’ll ask you to complete a criminal record check. And depending on the role you have applied for and the nature of any convictions you may have, we might have to withdraw the offer.

AI Prompt Engineer
Harvey Nash
Edinburgh
Remote or hybrid
Mid - Senior
Private salary

AI Prompt Engineer - 12 Month Contract - Outside IR35

Role Description:

Harvey Nash’s Pub Sec client are seeking an AI Prompt Engineer to design, develop, and optimise prompt-based solutions for AI systems.

This role will design and optimise AI prompts for extracting data from forms, fine-tune models for accuracy, automate end-to-end workflows, and manage production deployments. The ideal candidate will ensure robust, scalable, and secure AI solutions that streamline document processing.

The initial focus of the role will be building on an existing solution to extract structured data from various forms and documents. This role combines expertise in natural language processing (NLP), prompt engineering, and workflow automation to enable accurate and efficient data processing.

Key Responsibilities:

  • Prompt Design & Optimisation:

  • Develop and refine AI prompts to accurately extract data fields from structured and semi-structured forms.

  • Test and iterate prompts for different document types and languages to maximize accuracy and reliability.

  • Ensure prompt development is reusable and scalable across different use case scenarios.

  • Work closely with developers, data scientists, and business analysts to align solutions with business needs.

  • Fine-tune AI models to improve performance across diverse document types and languages.

  • Workflow Automation

  • Develop automated pipelines for document ingestion, data extraction, and validation.

  • Integrate AI solutions with OCR tools and enterprise systems for seamless processing.

  • Deployment & Support

  • Manage deployment of AI solutions into production environments.

  • Monitor system performance, troubleshoot issues, and provide ongoing support.

  • AI Model Integration:

  • Collaborate with data scientists and developers to integrate prompt-based solutions into existing AI/ML pipelines.

  • Ensure compatibility with OCR tools and document processing systems.

  • Data Quality & Validation:

  • Implement validation logic to ensure extracted data meets quality standards.

  • Work with QA teams to identify and resolve extraction errors.

  • Research & Innovation:

  • Stay updated on advancements in prompt engineering, LLMs, and document AI technologies.

  • Experiment with new techniques for improving extraction performance and reducing manual intervention.

  • Collaboration & Documentation:

  • Partner with business analysts to understand form structures and data requirements.

  • Document prompt strategies, workflows, and best practices for internal knowledge sharing.

  • Document guidance to support deployment approach and ongoing support and maintenance activities

Required Skills & Qualifications

  • Strong understanding of Large Language Models (LLMs) and prompt engineering principles.
  • Experience with document AI, OCR technologies, and data extraction workflows.
  • Proficiency in Python or similar languages for automation and integration tasks.
  • Familiarity with APIs and cloud-based AI services (e.g., Azure OpenAI, AWS, Google AI).
  • Excellent problem-solving skills and attention to detail.
  • Background in NLP, machine learning, or data science.
  • Experience with form processing systems in finance, healthcare, or enterprise environments.
  • Knowledge of data privacy regulations and secure handling of sensitive information.
  • Demonstrated success in delivery across both distributed and hybrid on-premises and cloud technology estates.
  • Experience in managing both agile and waterfall projects.
  • Excellent interpersonal skills, influencing and communication skills.
  • Ability to apply a broad understanding of IT infrastructure and interdependencies to create effective, risk-minimising migration plans.
  • The candidate will have an awareness of Digital First Service Standards and Government Digital Services (GDS).
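
To illustrate the "validation logic to ensure extracted data meets quality standards" responsibility above (a sketch only; the schema, field names, and patterns are invented for the example, not taken from the role), fields extracted by an LLM/OCR step can be checked against per-field rules before entering downstream systems:

```python
import re

# Hypothetical per-field patterns for data extracted from a form.
SCHEMA = {
    "invoice_number": re.compile(r"^INV-\d{6}$"),
    "date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),   # ISO 8601 date
    "total": re.compile(r"^\d+\.\d{2}$"),          # no thousands separators
}

def validate(extracted):
    """Return (valid_fields, errors) for one extracted record."""
    valid, errors = {}, []
    for field, pattern in SCHEMA.items():
        value = (extracted.get(field) or "").strip()
        if pattern.match(value):
            valid[field] = value
        else:
            errors.append(f"{field}: {value!r} failed validation")
    return valid, errors

# One extracted record; the total uses a comma, so it should be flagged.
record = {"invoice_number": "INV-004217", "date": "2025-03-14", "total": "1,299.00"}
valid, errors = validate(record)
```

Rejected fields would typically be routed back for re-extraction or human review rather than silently discarded, which keeps the automated workflow auditable.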
Data Architect
Fruition Group
Multiple locations
Hybrid
Senior
£100,000

Liverpool (Hybrid - 2 days per week in office)
Basic salary circa £100k

An exciting opportunity for a Data Architect to join a large, established organisation and become part of a growing, centralised data function. This role offers a genuine mix of data strategy and hands-on data architecture, giving you the chance to influence how data is designed, governed, and leveraged across the wider business.

As a Data Architect, you’ll operate within a broader Data and Analytics team, working alongside data engineers, governance specialists, and technology architects. You’ll help set the direction for the organisation’s data landscape while remaining close to delivery, ensuring architectural decisions translate into practical, scalable solutions that teams can build against.

This is a role for a Data Architect comfortable operating at both a strategic and technical level. You’ll contribute to long-term data direction, define architectural standards, and support delivery teams with clear, well-considered data designs. A strong grounding in data engineering concepts, data governance, and modern cloud-based data platforms is essential, though the focus is on capability and approach rather than specific tools.

Data Architect - Key Requirements:

  • Strong experience designing data architectures within complex or enterprise environments
  • Experience contributing to data strategy as well as hands-on architectural design
  • Understanding of modern data architecture patterns and approaches
  • Solid grasp of data engineering practices, including integration, transformation, and pipelines
  • Good awareness of data governance principles, data quality, and ownership
  • Experience working with modern data tooling and cloud platforms (e.g. Snowflake, AWS, Azure, etc.)
  • Confident working with and influencing stakeholders across engineering and architecture teams
  • Previous experience working in a highly regulated environment would be preferred

Data Architect - Salary & Benefits:

  • Basic salary up to £100k
  • Excellent pension scheme
  • Discretionary bonus
  • 25 days holiday (+/-)
  • Private medical cover
  • Life assurance and income protection
  • Share save scheme
  • Additional flexible benefits, L&D opportunities, and perks

If you’re a Data Architect looking for a role where you can shape data direction, stay close to delivery, and work as part of a collaborative data team, this is a strong opportunity to make a meaningful impact.

We are an equal opportunities employer and welcome applications from all suitably qualified candidates, regardless of race, sex, disability, religion/belief, sexual orientation, or age.

Data Engineer - Highly competitive salary
Anson McCade
Bristol
In office
Mid - Senior
Private salary

Data Engineer - Highly competitive salary

About the Role:

We’re partnering with a leading technology consultancy that helps organisations harness the power of data to modernise platforms and drive business outcomes. As a Data Engineer, you’ll be at the forefront of designing and delivering cloud-native solutions on Google Cloud, turning complex datasets into actionable insights. In this role, you’ll work on diverse projects, from batch and streaming pipelines to data warehouses, data lakes, and AI-powered analytics platforms. This is a hands-on role where your expertise will guide delivery, shape best practices, and mentor other team members.

Key Responsibilities:

  • Lead the design, development, and deployment of scalable data pipelines using BigQuery, Dataflow, Dataproc, and Pub/Sub
  • Automate ETL/ELT workflows and orchestrate pipelines with tools such as Cloud Composer
  • Contribute to architecture and end-to-end solution design for complex data platforms
  • Set engineering standards and ensure high-quality code, deployment, and documentation practices
  • Collaborate with clients and internal teams, translating business requirements into practical solutions
  • Mentor and coach junior engineers to grow their skills and adopt best practices

What They're Looking For:

They're looking for a Data Engineer who can take ownership of complex data solutions while remaining hands-on. You should have:

  • Proven experience building production-ready solutions on Google Cloud
  • Expertise with batch and streaming frameworks like Apache Spark or Beam
  • Strong understanding of data storage, pipeline patterns, and event-driven architectures
  • Experience with CI/CD, version control, automated testing, and Agile delivery
  • Ability to communicate clearly with both technical and non-technical stakeholders
  • Mentoring or coaching experience

Bonus skills: Kafka, enterprise data platform migrations, RDBMS experience (Postgres, MySQL, Oracle, SQL Server), and exposure to ML pipelines.

Security Eligibility

Candidates must be eligible for UK Security Clearance (SC or DV) if required.

Why This Role?

This is a chance to work on high-impact, cloud-native projects as a Data Engineer, taking ownership of technical decisions, shaping delivery practices, and developing your career. You’ll join a supportive environment where mentoring and learning are highly valued, and your work will directly contribute to the success of complex data programmes.

Ok, I’m In. What’s Next?

Please apply with your latest CV.

Asset Manager
Telent Technology Services Ltd
Birmingham
Hybrid
Mid - Senior
Private salary

Hybrid/Birmingham/Remote

Role Purpose

The Asset Manager will report to the Data Architect Manager and is responsible for ensuring that NRTS product and configuration data is accurate, consistent, and complete across all systems. This role manages the Product Catalogue, implements the Information Asset Register, and ensures data governance and obsolescence processes are executed effectively for mainly hardware assets. The post holder provides subject matter expertise across internal and external stakeholders to ensure data assets are fit for purpose and deliver measurable business value. This role underpins the accuracy, reliability, and compliance of NRTS configuration and asset data. The successful candidate will enable the business to make informed decisions, optimise asset lifecycle management, and maintain operational resilience through trusted and governed data.

Key Responsibilities

Product Catalogue & Asset Management

  • Populate and maintain the NRTS Product Catalogue with all known versions, specifications, and associated support asset data.
  • Provide accurate, periodic reports on the supportability and lifecycle status of NRTS assets.
  • Maintain the accuracy and completeness of all asset and service records used on the NRTS programme.

Obsolescence and Configuration Processes

  • Collaborate with the Logistics Manager, CRM Manager, and Release Manager to review and enhance the Obsolescence Management process and related procedures.
  • Identify and manage risks relating to asset end-of-life and end-of-support.
  • Establish and maintain consistent policies and procedures to ensure configuration data is accurate, secure, and contractually compliant.
  • Manage and maintain multiple NRTS datasets including Forward Stock, Support Spares, Test Equipment, and other repositories to ensure alignment and consistency.

Network Configuration Management

  • Maintain the repository for network, device, and software configurations
  • Maintain the process to check that the CMDB accurately reflects the installed firmware and software on network assets
  • Manage discrepancies between recorded data and installed configurations
  • Establish a process with the Provisioning Team for using and recording gold configurations
  • Track configuration changes on network assets

Information Governance and Data Quality

  • Establish and maintain the Asset Information Register (Data Dictionary) to document data sources, ownership, and refresh frequency.
  • Implement data quality routines, metrics, and controls to proactively identify and resolve data issues.
  • Grade and prioritize data quality issues based on business and safety impact to ensure that high-value risks are addressed promptly.
  • Ensure data management processes meet governance standards and audit requirements.

Analytics and Reporting

  • Deliver bespoke analytics and dashboards using Qlik Sense to provide insights into asset lifecycle, data quality, and configuration status.
  • Support the migration of reporting from QlikView to Qlik Sense, ensuring improved visualization and accessibility.
  • Produce periodic and ad-hoc reports on product lifecycle, supportability, and configuration compliance for key stakeholders.

Stakeholder Engagement and Continuous Improvement

  • Work collaboratively with internal and external stakeholders to define standard methods of recording support contract details within Remedy ITSM.
  • Ensure continuous alignment between business, data, and technical teams regarding data requirements and standards.
  • Drive ongoing improvements in asset and data management processes through the Continual Service Improvement framework.
  • Provide internal subject matter expertise (SME) for data, configuration, and product catalogue management.

Skills, Knowledge and Experience

  • Systems & Tools - Hands-on experience with Remedy CMDB/ServiceNow/ITSM, Qlik Sense/PowerBI and data migration projects.
  • Data Quality - Strong background in data validation, profiling, and data governance.
  • Analytics - Capable of creating visual reports and insights in Qlik Sense (or similar BI tools).
  • Stakeholder Management - Proven ability to work cross-functionally across business, technology, and supplier teams.
  • Domain Experience - Telecommunications, transport infrastructure, or technology environment preferred.
  • Certifications (Desirable) - ITIL Foundation, Data Management, or Information Governance qualification.

Personal Attributes

  • Analytical and detail-oriented, with strong documentation skills.
  • Self-starter capable of working independently and managing priorities.
  • Strong communicator, able to engage and influence technical and business stakeholders.
  • Able to work under pressure and deliver to tight deadlines.
  • Committed to continuous improvement and maintaining data excellence.

What do we offer:

A career at Telent can span many sectors, roles, technologies and clients, giving you the opportunity to develop, learn new skills and make an impact. We are growing and we rely on our committed team to deliver.

We nurture the talent that makes this happen, by our on-going commitment to creating an inclusive culture that respects and values differences, that celebrates diverse ideas. We want everyone to feel they can be themselves and to thrive at work.

The additional benefits with this role:

  • 26 days holiday, plus public holidays, and the option to buy or sell days annually
  • Car Allowance
  • Company pension scheme
  • A range of family friendly policies
  • Occupational health support and wellbeing Portal
  • Discounts on Cinema, Restaurants and Shopping with Telent Reward scheme.


We’re passionate about creating an environment that champions diversity and inclusion, where everyone feels they belong, can be themselves and empowered to reach their full potential. People are at the heart of our business, and we believe that our teams should reflect the diverse experiences and backgrounds of the communities we support.

About Telent

Telent is a leading technology company and specialist in the design, build, support and maintenance of the UK’s critical digital infrastructure, drawing on decades of experience in mission-critical communications and technology. The work we do helps connect thousands of people and communities, using the best technology and innovation available. When you join us, you’ll have the opportunity to make a real impact on all our futures by fulfilling your potential and delivering high performance. We work together to make everyday life work better for everyone. You’ll be part of a team of more than 3,000 brilliant, dedicated people committed to getting the job done well.

Brilliance brought together.

We are guided by our values and behaviours:

Be Inclusive

Take Responsibility

Collaborate

Be Customer-focussed

Senior Data Engineer (AWS, Airflow, Python)
Triad
London
Remote or hybrid
Senior
£60,000 - £65,000
+1

Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices.
Salary up to £65k plus company benefits.

About Us

Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years’ experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers.

At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you’re valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you’ll be trusted, challenged, and empowered to grow.

We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you’re passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.

Glassdoor score of 4.7

96% of our staff would recommend Triad to a friend

100% CEO approval

See for yourself some of the work that makes us all so proud:

Helping law enforcement with secure intelligence systems that keep the UK safe

Supporting the UK’s national meteorological service in leveraging supercomputers for next-level weather forecasting

Assisting a UK government department responsible for consumer product safety with systems to track unsafe products

Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport

Role Summary

Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.

Key Responsibilities

Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.

Create and manage data models that support efficient storage, retrieval, and analysis of data.

Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda to architect and maintain cloud data solutions.

Maintain modular Terraform-based IaC for reliable provisioning of AWS infrastructure.

Develop, optimise and maintain robust data pipelines using Apache Airflow.

Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.

Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.

Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.

Identify data quality issues and implement data validation and cleansing processes.

Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.

Work within a continuous integration environment with automated builds, deployments, and testing.
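The transform step described in the responsibilities above (clean, preprocess, and enrich data for analytical use) can be sketched with nothing but the standard library. The CSV sample, field names, and `transform` helper below are hypothetical; a real pipeline at this level would run as Airflow tasks reading from and writing to AWS services such as S3 and Redshift.

```python
import csv
import io

# Hypothetical raw extract with messy whitespace and inconsistent casing.
RAW = """id,amount,currency
1, 19.99 ,gbp
2,5.00,GBP
"""

def transform(raw_csv):
    """Trim whitespace, normalise currency codes, and add a derived pence field."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        amount = float(rec["amount"].strip())
        rows.append({
            "id": int(rec["id"]),
            "amount": amount,
            "currency": rec["currency"].strip().upper(),
            "amount_pence": round(amount * 100),
        })
    return rows

rows = transform(RAW)
```

Deriving and normalising fields at this stage is what keeps the downstream load step (and any validation on it) simple and predictable.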

Skills and Experience

  • Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
  • Excellent proficiency in developing ETL processes and data transformation workflows.
  • Strong SQL skills (PostgreSQL) and advanced Python coding capability (essential).
  • Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda (essential).
  • Understanding of Terraform codebases to create and manage AWS infrastructure.
  • Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
  • Familiarity with distributed data processing systems such as Spark or Databricks.
  • Experience working with high-performing, low-latency, or large-volume data systems.
  • Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
  • Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.

Qualifications & Certifications

  • A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable).
  • Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK.

Triad’s Commitment to You

As a growing and ambitious company, Triad prioritises your development and well-being:

  • Continuous Training & Development: Access to top-rated Udemy Business courses.
  • Work Environment: Collaborative, creative, and free from discrimination.
  • Benefits:
    • 25 days of annual leave, plus bank holidays.
    • Matched pension contributions (5%).
    • Private healthcare with Bupa.
    • Gym membership support or Lakeshore Fitness access.
    • Perkbox membership.
    • Cycle-to-work scheme.

What Our Colleagues Have to Say

Please see for yourself on Glassdoor and our “Day in the Life” videos at the bottom of our Careers Page.

Our Selection Process

After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:

  1. A technical test including numerical, logical and verbal reasoning
  2. A technical interview with our consultants
  3. A management interview to assess cultural fit

We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.

Other Information

If this role is of interest to you or you would like further information, please contact Ryan Jordan and submit your application now.

Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.

Geospatial Software Engineer
ISR RECRUITMENT LIMITED
Manchester
Remote or hybrid
Mid - Senior
£60,000 - £90,000
+5

The Opportunity:

You’ll join an experienced, collaborative consultancy team delivering greenfield, enterprise-scale digital services for high-profile public and private sector clients. This opportunity is ideal for a practical, adaptable Geospatial Full Stack Engineer who enjoys working across disciplines and solving complex problems and challenges that will have a real-world impact.

Collaboration sits at the heart of how our client operates, so you’ll be partnering closely with colleagues across Software Engineering, User-Centred Design, Delivery Management, Data Science and Live Services to deliver outcomes that genuinely make a difference in today’s society.

As a consultancy, they are technology-agnostic by design, focusing on choosing the right tools for each problem rather than forcing one stack everywhere. Their teams regularly work with .NET, Java, Python, Node.js, AWS and Azure, giving you genuine scope to broaden your skills and develop your career across a range of languages and platforms. Many of their projects also involve Geographic Information Systems (GIS) and open-source geospatial technologies, helping clients unlock the value of location-based data through mapping, spatial analysis and data-driven decision making.

Skills and Experience:

Essential:

  • 3+ years’ experience in a Full Stack Engineering role
  • Strong development skills in .NET, Java or Python, alongside modern JavaScript frameworks/libraries
  • Experience working in Agile environments (Scrum, Kanban, TDD)
  • Solid understanding of architectural and design patterns, including microservices and serverless
  • Hands-on experience designing and delivering solutions on AWS or Azure
  • Experience working with GIS systems or geospatial data, and familiarity with tools such as Leaflet, OpenLayers, QGIS, GeoServer, PostGIS, etc.
  • A collaborative mindset and experience working in multi-disciplinary teams

Desirable:

  • Experience working in a consultancy environment
  • Exposure to public sector projects
  • Familiarity with CI/CD tooling (e.g. Jenkins, Terraform)
  • Awareness of the Digital Service Standard and Technology Code of Practice, particularly in geospatial or public sector contexts

Role and Responsibilities:

This is a varied role suited to someone who enjoys the pace, responsibility and collaboration of consultancy. You will be involved with the following types of activity:

  • Design and deliver high-quality solutions: building, enhancing and maintaining software, infrastructure and deployment pipelines that are robust, secure and scalable. Projects may include solutions involving geospatial data, GIS platforms and open-source mapping tools.
  • Work collaboratively across disciplines: partnering with Senior and Lead Engineers, Delivery Managers, Designers and Data Scientists to shape solutions, contribute to technical documentation and deliver against agreed plans.
  • Apply standards and best practice: follow established engineering approaches, contribute accurate technical estimates and proactively identify and escalate risks or issues.
  • Communicate clearly and build relationships: present ideas, prototypes and progress updates to stakeholders, while building strong working relationships with colleagues, clients and partner organisations.

Applications:

Please contact Edward here at ISR to learn more about our client and how they are leading the way in developing the next generation of technical solutions through innovation and transformational technology.

SQL Database Administrator
Allied Vehicles Ltd
Glasgow
In office
Mid - Senior
£52,000

SQL Database Administrator - Glasgow

At Allied Vehicles we design, develop, and manufacture a wide range of specialist vehicles, including wheelchair accessible vehicles, taxis, and minibuses.

In addition to manufacturing vehicles, we also offer a range of aftersales services (onsite and mobile), including servicing, repairs and maintenance, and we are Scotland’s largest independent parts distributor.

We are a driven, high performance, family business, that achieves our goals through engaging our people and maximising opportunities.

Our commitment to quality and innovation has made us a trusted name in the industry and we seek enthusiastic and dedicated individuals to join our team.

We are now recruiting for a highly experienced SQL Database Administrator to join our IT department.

You will be responsible for managing, maintaining, and optimising our SQL database environments. The successful candidate will oversee day-to-day database administration, data transfers, and support for business intelligence initiatives.

Hours of work are Monday to Friday, 8am to 4.30pm, based fully on-site, and the salary is up to £52k per annum dependent on experience.

This position provides an excellent opportunity to become part of a forward-thinking and dedicated company.

Why Join Us?

We believe in taking care of our people, and that’s why we offer a fantastic benefits package designed to support your well-being, career growth, and lifestyle:

  • Generous Annual Leave: Enjoy 25 days of holiday, plus 8 bank holidays.
  • Financial Security: Access our group life scheme and annual profit share.
  • Competitive Growth: Annual salary reviews to ensure you’re rewarded for your contributions.
  • 24/7 Health Support: GP24 by HealthHero provides virtual GP services and second opinions for you and your family, 24/7/365.
  • Benefits package: Enjoy industry-leading perks and discounts at your fingertips plus a holiday purchase scheme and EV leasing through OctopusEV.
  • Convenient On-Site Facilities: Free staff parking and an on-site cafeteria for your convenience.
  • Sustainable Travel: Save on your commute with our cycle-to-work scheme.
  • Continuous Development: Frequent learning opportunities to help you grow professionally.
  • Exclusive Discounts: Take advantage of after-sales discounts for yourself, friends, and family.
  • Recognition and Rewards: Celebrate your success with our company values and long-service awards program.

The main duties of the role are:

  • Administer, monitor, support, and maintain the company’s SQL estate to ensure optimal performance, security, and reliability.
  • Troubleshoot and resolve database issues, ensuring minimal downtime and data loss.
  • Collaborate with the Technical Specialist, BI analysts and developers to support reporting, analysis, and analytics.
  • Contribute to the design, development, and support of production and non-production environments.
  • Perform data transfers, migrations, and integrations between systems as required.
  • Implement and maintain backup, recovery, and disaster recovery solutions.
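For the backup and recovery duty above, the core pattern is taking an online copy and then verifying it is readable. This can be sketched with Python’s stdlib `sqlite3` module as a stand-in; on a production SQL Server estate the same drill would use native BACKUP/RESTORE jobs and restore-verification rather than this API. The table and data are invented for illustration.

```python
import sqlite3

# Source database with some sample data (in-memory for the sketch).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE assets (id INTEGER PRIMARY KEY, name TEXT)")
src.execute("INSERT INTO assets (name) VALUES ('pump')")
src.commit()

# Online backup: the source stays available while pages are copied.
dest = sqlite3.connect(":memory:")  # in practice, a file on backup storage
src.backup(dest)

# Verify the backup is readable -- a step in any recovery drill.
count = dest.execute("SELECT COUNT(*) FROM assets").fetchone()[0]
```

An unverified backup is little better than no backup, which is why the verification query belongs in the routine, not just in the incident playbook.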

To be effective in this role, you will have:

  • Degree qualified or equivalent industry experience in a relevant field.
  • Extensive experience in SQL database administration within demanding, fast-moving environments.
  • Knowledge of .NET and SQL development environments.
  • Strong communication skills, able to explain technical concepts to non-technical colleagues.

We are an Equal Opportunities employer and encourage applications from all members of the community. We are committed to the disability confident initiative, and creating an inclusive workplace where all individuals, regardless of disability, have the opportunity to thrive. We encourage applications from candidates with disabilities and will make reasonable adjustments where required to support you through the recruitment process and beyond. We will offer a guaranteed interview to any applicant who considers themselves to be disabled, and who meets the requirements for the post.

We appreciate all applications, but only shortlisted candidates will be contacted for an interview. Thank you for considering Allied Vehicles as your potential employer. We look forward to reviewing your application.

NO AGENCIES PLEASE

Data Analyst
FERROVIAL CONSTRUCTION (UK) LIMITED
London
In office
Mid - Senior
Private salary
TECH-AGNOSTIC ROLE

The Data Analyst plays a key role on a large-scale infrastructure project, focusing on the development and ongoing maintenance of the project’s connected digital environment. The role involves analysing data to support decision-making and ensure project objectives are met.

You will work closely with information management and project controls teams, using data to improve project efficiency and support digital transformation initiatives.

You will join the FBRS (Ferrovial BAM Joint Venture) Information Management Team (IM), where your responsibilities will include ensuring systems integration, designing data modelling processes, and developing algorithms and predictive models to extract the data required by the project. You will also collaborate with teams across the project to support data analysis and share insights.

Candidates need to demonstrate outstanding attention to detail, self-motivation, and the ability to take initiative. They should also have strong Power BI expertise and experience using FME for data integration.

Key Responsibilities:

  • Collect, process, and analyse construction project data from multiple sources.
  • Support project teams with data quality checks.
  • Use FME to support information sharing and provide basic training on ETL tools (FME) to project teams.
  • Drive digital transformation by identifying and implementing process and workflow efficiency improvements.
  • Support the integration of project systems with internal and client platforms.
  • Work closely with digitalisation and project controls teams to ensure accurate data flow and project insights.
  • Analyse datasets to identify trends, patterns and actionable insights.
  • Create and maintain Power BI dashboards, visualisations, and reports for executive and project stakeholders.
  • Work closely with the client, RSA delivery team and Project Information Manager to ensure system stability and improvement.
  • Ensure the project complies with relevant legislation, project standards, and client requirements.

Key Skills and qualifications:

  • Strong organisational skills to manage multiple tasks, projects, and data streams effectively.
  • Ability to perform Quality Assurance checks according to project and industry standards.
  • Ability to coordinate and manage your own workload to support project delivery.
  • Familiarity with BIM, Python/R and UK construction data standards.
  • Familiarity with ETL tools like FME and GIS integrations.
  • Strong communication, stakeholder engagement, and problem-solving skills.
  • Experience in large infrastructure projects.

Location: London

Please note that this job description does not represent a comprehensive list of activities and employees may be requested to undertake other reasonable duties.

The Ferrovial BAM Joint Venture (FBJV) has a successful history of delivering critical infrastructure for the UK on time and to budget together in joint venture partnership. They first worked together in 2010 as BFK, delivering three Crossrail contracts, including the longest stretch of tunnelling works between Royal Oak and Farringdon, and Farringdon Station, the first central station to be completed on the Elizabeth Line. The team is also delivering the Silvertown Tunnel project together in East London and has been delivering excellence at each stage of HS2, such as Fusion JV for the Enabling Works packages, EKFB for the central Main Works Contract, and now delivering the track infrastructure across the entire HS2 route.

Seize the challenge. Move the world together! Innovative, creative, respectful, and diverse are some of the ways we describe ourselves. We are motivated by challenges, and we collaborate across our business units to move the world together. Your journey to a fulfilling career starts here!

Ferrovial is an equal opportunity employer. We treat all job applications equally, regardless of gender, color, race, ethnicity, religion, national origin, age, disability, pregnancy, sexual orientation, gender identity and expression, covered veteran status or protected genetic information (each, a “Protected Class”), or any other protected class in accordance with applicable laws.

Senior Machine Learning Engineer
Tate
London
Hybrid
Senior
£85,000 - £95,000

Senior Machine Learning Engineer - Data Science Focus

Based in London (Hybrid - 2 days onsite)

Permanent, Full-Time

Salary: Up to £95,000 (depending on experience)

We are seeking a Senior Machine Learning Engineer to design and deliver production-grade ML systems for a leading digital gaming and gambling platform. This is a hands-on role combining data science expertise with engineering skills - you’ll build models, optimise algorithms, and deploy solutions at scale to enhance customer engagement and decisioning.

You’ll work closely with Data Scientists to translate prototypes into robust applications, ensuring performance, governance, and reliability. If you’re passionate about applied AI, data-driven problem solving, and building ML systems that deliver measurable impact, this is the role for you.

Role and Responsibilities

  • Data science & modelling: Develop, validate, and optimise predictive models using advanced ML algorithms (e.g., gradient boosting, logistic regression, ensemble methods).
  • End-to-end ML engineering: Deploy models as APIs, batch jobs, and streaming services; implement CI/CD, monitoring, and rollback strategies.
  • Feature engineering & pipelines: Build scalable data workflows and feature stores for ML applications.
  • Infrastructure & tooling: Containerise applications with Docker, orchestrate with Kubernetes, and deploy securely in AWS.
  • Model governance: Apply best practices for evaluation, drift monitoring, and compliance.
  • Collaboration: Partner with Data Scientists and business stakeholders to translate insights into production-ready solutions.

Key Skills and Experience

  • Master’s degree in a STEM or quantitative discipline (PhD nice to have).
  • 3+ years of industrial ML engineering experience (not purely academic; not focused on Generative AI).
  • Strong data science fundamentals: supervised learning, evaluation metrics, feature engineering, and experimentation.
  • Production-grade Python proficiency and ability to write clean, maintainable code.
  • Comfortable with complex SQL queries.
  • Hands-on experience with AWS (ECR/ECS/EKS, Lambda, S3, IAM, CloudWatch), ideally AWS-certified.
  • Experience with Docker and Kubernetes in production environments.
  • Strong communication skills and ability to explain technical concepts clearly.

Apply now with your most up-to-date CV and a short note highlighting your experience with Python, SQL, AWS, Docker, Kubernetes, and data science projects.

Please be aware this advert will remain open until the vacancy has been filled. Interviews will take place throughout this period, therefore we encourage you to apply early to avoid disappointment.

Tate is acting as an Employment Business in relation to this vacancy.

Tate is committed to promoting equal opportunities. To ensure that every candidate has the best experience with us, we encourage you to let us know if there are any adjustments we can make during the application or interview process. Your comfort and accessibility are our priority, and we are here to support you every step of the way. Additionally, we value and respect your individuality, and we invite you to share your preferred pronouns in your application.

Data Engineer - TV Advertising Data (FAST)
Datatech
London
Hybrid
Mid - Senior
£75,000 - £85,000

Location: London - 3 days onsite
Salary: £75,000 - £85,000, negotiable depending on experience
Reference: J13057

Note: Full and current UK working rights required for this role

We’re currently seeking a Data Engineer to build the data foundations behind rapidly growing FAST (Free Ad-Supported Streaming TV) channels. This is a pioneering opportunity to be involved in direct-to-consumer advertising for a global player in the field. We’re looking for someone who is passionate about how data drives the industry and can help optimise campaigns, measure performance, and monetise content.

Key Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines that transform raw data into reliable, analytics-ready datasets
Ingest, integrate, and manage new data sources across advertising, audience, platform, and content data within Fremantle’s Microsoft Fabric environment
Deliver robust data flows that underpin global FAST dashboards, monetisation insights, and audience viewing metrics
Work closely with the central Data & Analytics team to enable high-quality Power BI reporting and analysis
Ensure strong data governance, integrity, and security across the Azure/Fabric ecosystem
Optimise data pipelines for performance, scalability, and efficiency, following best-practice engineering standards including version control and code reviews
Monitor pipeline health, data freshness, and quality, implementing proactive alerting and issue resolution
Translate business and analytical needs into well-structured data models and technical solutions
Automate data workflows to minimise manual processes and improve operational reliability
Maintain clear documentation of pipelines, datasets, and data flows to support collaboration and smooth handovers
Stay current with data engineering best practices, particularly within the Microsoft technology stack

Skills & Experience
5+ years’ experience working as a Data Engineer or in a similar role
Proven experience with cloud-based data platforms (Azure, AWS, SQL, Snowflake, SpringServe); Microsoft Fabric experience is a strong plus
Strong proficiency in Spark SQL and PySpark, including complex transformations
Experience building ETL/ELT pipelines using tools such as Azure Data Factory or equivalent
Ability to write efficient, reusable scripts for transformation, validation, and automation
Hands-on experience integrating data from APIs (REST, JSON), including automated data collection
Solid understanding of data modelling best practices for analytics and dashboards
Confidence working with large, complex datasets across multiple formats (CSV, JSON, Parquet, databases, APIs)
Strong problem-solving skills and the ability to diagnose and resolve data issues
Excellent communication skills and experience working with cross-functional teams
Genuine curiosity about how data drives content performance, audience behaviour, and monetisation

If this sounds like the role for you then please apply today!

Data Engineer
Answer Digital
Leeds
Hybrid
Mid - Senior
Private salary

Answer Digital is looking to recruit a Data Engineer to play a vital role in supporting the digital transformation journeys of our clients.

What you’ll be doing

We’re looking for someone with experience guiding teams and supporting clients to use data to make better decisions. We’re working on some amazing projects across both our Health and Private clients, so if you’re interested in being involved in work that delivers real value then we’d love to hear from you.

What you’ll bring to Answer

  • Designing and implementing complex data pipelines, leveraging both NoSQL and relational databases.
  • Developing and managing cloud-based data solutions, ensuring scalability and security.
  • Leading and mentoring junior data engineers, promoting knowledge sharing and professional growth.
  • Implementing and overseeing CI/CD pipelines, ensuring efficient deployment of data services.
  • Proficiency in Python, SQL, and other relevant backend languages.
  • Using backend programming languages effectively for data processing and manipulation.

It would be great if you also had experience in some of these, but if not we’ll help you with them:

  • Actively participating in the organisation and delivery of our Data Engineer Academy.
  • Applying advanced scripting skills to automate data processes and integrate ETL workflows.
  • Engaging with business intelligence (BI) tools and techniques to deliver actionable insights.
  • Advocating and applying best practices in software design patterns, principles, and architecture.
  • Staying abreast of the latest trends in data engineering, with a focus on healthcare data standards (SNOMED, FHIR, HL7).
  • Understanding the role of a consultancy and sharing a passion for helping solve our customers’ problems.

The perks of being @ Answer

  • We’re an Employee-Owned Company. After 12 months, through the Employee Ownership Trust you will be part of the ownership of Answer; a major factor in driving engagement, retention and growth for our people.
  • Flexible annual leave (buy/sell and carry forward).
  • Company-wide bonus, paid twice a year (and it’s income tax free too!).
  • Continuous training and development - if you want to learn, we’ll provide the support you need.
  • Flexible Pension - we match your own contributions up to 5%.
  • Regular tech catch-ups/hack events - we also encourage external tech events!
  • A packed social calendar including the Christmas party (partners invited), Summer away days, and monthly and quarterly company team socials.
  • Free parking at Head Offices in central Leeds, plus Cycle2Work & Green Car Lease schemes to help get you here.
  • The chance to give back - get involved nationally and regionally with partnerships to get people from different backgrounds into tech, as well as lots of charity and community events.
  • Hybrid and flexible working - you can vary when and where you work, to allow you to collaborate better, feed your creativity, and take the time and space to focus when you need to.

Diversity and Inclusion

At Answer we proudly embrace diversity and inclusion - we want to create a safe environment for everyone to bring their true selves to work. We will do everything we can to support your application; if you require any adjustments to your application or interview process, please speak to our recruitment team.

A bit more about us

Answer Digital is a successful digital transformation consultancy headquartered in Leeds. We’re a company on a fast growth trajectory with a reputation for delivering large-scale, operationally critical solutions. People are at the heart of everything we do - so much so that we’re owned by our people. It sets us apart from most digital businesses you’ll meet and defines our culture and values. Our people are invested in everything we do, because we are invested in them. Don’t believe us? Check out our Glassdoor and hear what our people have to say!

Spotlight
Technology Consulting Academy
Answer Digital
Multiple locations
Hybrid
Graduate - Junior
£27,000 - £30,000
TECH-AGNOSTIC ROLE

About the job

Who are Answer Digital?

Answer Digital is a proudly Employee-Owned digital transformation consultancy headquartered in Leeds. We’ve delivered large-scale, cutting-edge and life-saving digital solutions in both the private and healthcare sectors. After a period of huge growth, we’re looking to invest in our people with our first hybrid Technology Consulting Academy. We’re passionate about investing in our people, so much so that “Nurturation” is one of our core values (our way of nurturing growth through collaboration!).

Our Academy

We’re looking for motivated and ambitious people to join our Technology Consulting Academy. You’ll be employed with us from Day 1, and the programme has been designed by our talented consultants to accelerate the professional development of great people. The Academy will cover all our technical capabilities to a high level, specialising further as it progresses. By the time the Academy is complete, you’ll have the skills and knowledge to play a pivotal role on one of our projects, helping build solutions for our clients across Software and DevOps Engineering, Data Engineering and Quality Assurance.

Academy structure: what to expect

With a blend of workshops, mentored coaching sessions and self-guided learning, the Academy will cover:

  • The fundamentals of the Software Development Lifecycle.
  • Proficient application of Agile methodologies within a professional environment (e.g. Scrum, Kanban, Lean).
  • Training in core technologies used across our technical capabilities, including programming languages (.Net, Javascript, Python); database and data technology (SQL, PySpark, ETL, Snowflake, data architecture, warehouses and lakehouses); DevOps and cloud technologies (Azure and AWS, CI/CD, Terraform and Docker); and test frameworks (Cypress, accessibility and performance testing).
  • Upskilling in emerging AI technologies and their application in software, data and quality assurance engineering.
  • Strategies for fostering personal development and navigating career progression.

You will also be integrated into the broader Answer Digital community, engaging with professionals across various disciplines and industry sectors, with plenty of opportunity to learn. As the Academy approaches its conclusion, you will complete a series of analytical tasks and presentations for your Academy Coach. The Academy culminates in a 2-3 week assignment undertaken alongside an experienced consultant on a live project, followed by a formal presentation of your work to the wider Answer Digital community. After completing the Academy, you will join one of our technical capabilities, supporting our clients to deliver great outcomes.

What you’ll bring to Answer

For us, this isn’t about the jobs you’ve had but the experience you’ve gained along the way. To succeed you’ll need:

  • Either hands-on experience or a qualification in a technology field (such as a tech bootcamp, a computer science degree, a tech apprenticeship, etc.) or a data field (a role involving processing/manipulating large data sets, a mathematics degree, etc.).
  • A demonstrated interest and experience within the Information Technology sector.
  • A habit of constantly challenging assumptions, solving problems and focusing on the details to get things right.
  • Great communication skills, being able to articulate problems to our clients and how we can solve them.

This could come from working in tech already, being a career changer with lots of transferable skills, or coming from academia with plenty of experience of the above - either way, we’d love to hear from you!

It would be great if you also had experience in some of these, but if not we’ll help you with them:

  • Experience working in multidisciplinary teams to build something great.
  • Practical, hands-on experience working on Agile projects with .Net, Javascript, Python, SQL, etc.

About you

This isn’t just about your experience; we’re investing in you as a person, and you’ll be set up for success if you have:

  • A genuine commitment to developing a career in technology and digital transformation.
  • A passion for continuous learning and professional development.
  • Great communication skills, encompassing the ability to interact confidently with diverse individuals and groups at all organisational levels, and to adapt how you communicate to suit your audience.
  • A strong focus on achieving positive outcomes for both end-users and clients.
  • A collaborative and team-oriented approach to work.
  • Commercial experience or an understanding of the commercial impacts of project work.
  • Effective time management skills and the ability to prioritise.

The perks of being at Answer

  • We’re an Employee-Owned Company. After 12 months, through the Employee Ownership Trust you will be part of the ownership of Answer; a major factor in driving engagement, retention and growth for our people.
  • Competitive salary, starting at £27,500 and increasing to £30,000 after the Academy.
  • Flexible annual leave (buy/sell and carry forward).
  • Twice-a-year income-tax-free bonus if company-wide performance targets are met - we’re all in this together.
  • Continuous training and development - if you want to learn, we’ll provide all the support you need.
  • Flexible Pension - we match your own contributions up to 5%.
  • A flexible Healthcare cash plan so you can fund the care you value most.
  • A packed social calendar including the end-of-year party, Summer away days, and monthly and quarterly company team socials.
  • Free parking at Head Offices in central Leeds, plus Cycle2Work & Green Car Lease schemes to help get you here.
  • The chance to give back - get involved nationally and regionally with partnerships to get people from different backgrounds into tech, as well as lots of charity and community events.
  • Hybrid and flexible working - you can vary when and where you work, to allow you to collaborate better, feed your creativity, and take the time and space to focus when you need to.

Diversity and Inclusion

At Answer we proudly embrace diversity and inclusion - we want to create a safe environment for everyone to bring their true selves to work. We will do everything we can to support your application. If you require any adjustments to your application (whether that’s support with our interview questions, adjusting how we interview, or financial support with our hiring process), please speak to Jonny Hiles, TA Lead (jonny.hiles@answerdigital.com).

Our Interview Process

We pride ourselves on having a fair but flexible recruitment process; we want to create a platform where you can show us your best. For us, keeping it simple means you can focus on understanding whether our people, our values and the work we do are right for you. Our typical process can be broken down into the following stages:

  • Application process: Depending on the volume of applications, our process may involve a couple of assessments. We focus on your inherent ability rather than your CV, so the application includes an analytical thinking and problem-solving exercise to assess skills that are crucial for a successful career with us.
  • Recruiter call: This will be with a member of the recruitment team and normally lasts 15-30 minutes. We will talk to you about Answer Digital, the type of work we do, and an overview of the role, and we’ll ask some questions about your current situation and why you’re interested in working here. Here, we’d advise asking any questions about our people, such as our culture, career progression, or approach to flexible working.
  • Skills-based interview: This interview will be with two people, normally from the team you’re applying to work with, and lasts roughly an hour. It will involve questions around your experience, the skills you’ve gained and the work you’ve done, as well as the role, the challenges you might face, and how you’d face them. Here, we’d recommend asking questions about the role, the day-to-day responsibilities and what life in the team is like.
  • Cultural/value interview: The final stage is an hour-long discussion with two members of our Leadership team (either Directors or Capability Leads). This interview is a two-way conversation, as much about you assessing whether Answer is the right fit for you as it is about us assessing your fit with Answer. The focus will be on our culture and values: we’ll discuss what our values mean to us, and we want to understand what values are important to you and how you like to work. We strongly encourage you to ask any questions you have about Answer as a business, including our business model and our ambitious growth plans.

Frequently asked questions
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.