Make yourself visible and let companies apply to you.
Roles
Data Engineer Jobs
Overview
Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Machine Learning Engineer
Halian Technology Limited
Reading
In office
Mid - Senior
£90,000
RECENTLY POSTED

JOB: Machine Learning Engineer

LOCATION: Reading

TERM: Permanent

SALARY: £60,000 to £90,000 per annum (plus benefits)

My client is a technology-focused organisation developing advanced AI and data-driven solutions across a range of industries. They are looking to expand their technical team by hiring a Machine Learning Engineer to support the development and deployment of intelligent systems from their offices in Reading.

The Machine Learning Engineer will ideally have the following attributes:

Knowledge / experience in Machine Learning, Artificial Intelligence, or Data Science

A degree qualification (BSc, MSc, PhD etc.) in Computer Science, Data Science, Mathematics, Statistics, Artificial Intelligence or similar

Strong programming skills in Python

Experience with machine learning libraries such as TensorFlow, PyTorch, or Scikit-learn

Experience working with large datasets and building data pipelines

Knowledge of cloud platforms such as AWS, Azure, or GCP would be beneficial

Experience deploying machine learning models into production environments

Understanding of APIs, data processing frameworks, and software development best practices

Ability to work both independently and as part of a collaborative technical team

Excellent communication skills (both verbal and written)

Keen to develop technically and grow within a fast-paced environment

The Machine Learning Engineer role will involve:

Working within the existing data and engineering teams to develop machine learning solutions

Designing, building, and deploying machine learning models

Preparing and processing datasets for model training and evaluation

Improving model performance through experimentation and optimisation

Collaborating with software engineers to integrate ML solutions into applications

Researching and implementing new machine learning techniques where appropriate
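
For candidates wanting a feel for the prepare → train → evaluate loop the responsibilities above describe, here is a minimal sketch using only the Python standard library. The dataset, split ratio, and choice of a simple least-squares model are illustrative assumptions, not this employer's actual stack (which, per the ad, would typically involve TensorFlow, PyTorch, or Scikit-learn).

```python
# A minimal prepare → train → evaluate sketch in plain Python.
# Dataset and model choice are illustrative assumptions only.
import random

def train_test_split(rows, test_fraction=0.25, seed=42):
    """Shuffle rows deterministically and split into train/test sets."""
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_fraction))
    return rows[:cut], rows[cut:]

def fit_linear(data):
    """Ordinary least squares for y = a*x + b on (x, y) pairs."""
    n = len(data)
    sx = sum(x for x, _ in data)
    sy = sum(y for _, y in data)
    sxx = sum(x * x for x, _ in data)
    sxy = sum(x * y for x, y in data)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def mse(model, data):
    """Mean squared error of the fitted line on held-out data."""
    a, b = model
    return sum((a * x + b - y) ** 2 for x, y in data) / len(data)

# Synthetic dataset: y = 2x + 1 with no noise, so the fit is near-exact.
dataset = [(x, 2 * x + 1) for x in range(100)]
train, held_out = train_test_split(dataset)
model = fit_linear(train)
print(round(model[0], 3), round(mse(model, held_out), 6))
```

In a real role the model and evaluation metric would come from a library, but the shape of the loop — split, fit on train, score on held-out data — is the same.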

If you are interested in this position, please apply with an up-to-date CV as soon as possible, along with your availability and salary expectations.


RPA & Data Automation Developer
Adecco
Surrey
Hybrid
Mid - Senior
£600/day - £700/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE

Job Title: RPA & Data Automation Developer
Contract Type: 6 month contract
Inside IR35 - £550-£700 per day (umbrella rate)
Location: Hybrid working - Surrey

Are you ready to take your career to the next level in the world of Robotic Process Automation (RPA) and Data Automation? We’re on the lookout for a passionate RPA & Data Automation Developer to drive efficiency, improve data quality and deliver actionable insights. If you thrive in a collaborative environment and enjoy working with cutting-edge technologies, we want to hear from you!

Key Responsibilities:

  • Data Design & Preparation: Design processes for preparing, enriching, and documenting data using semantic models, Lakehouses and data warehouses to enable insightful analysis.
  • Automation Proficiency: Utilize multiple automation technologies, including AI, ML, Power Automate and Power Apps, to streamline data access and empower developers and analysts.
  • Transform & Test Data: Transform and rigorously test data using dataflows, procedures and notebooks to design user-friendly visualizations that uncover valuable insights.
  • Data Storage Solutions: Implement robust storage and querying strategies for Lakehouses and data warehouses, ensuring a single version of the truth across the organization.
  • Stakeholder Communication: Engage with both technical and non-technical stakeholders to understand business requirements and communicate potential insights effectively.
  • Quality Assurance: Conduct careful testing of data lists and aggregations, creating UAT parameters and checklists to ensure accuracy and enable business sign-off.
  • Collaborative Governance: Work alongside other team members to design and document solutions while establishing strong governance and control processes.
  • Data Flow Analysis: Analyze and document data flows to meet corporate standards, ensuring reusability and maximizing insights for informed decision-making.
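
As a rough illustration of the quality-assurance responsibility above — testing aggregations against the detail rows they were built from — here is a hedged pure-Python sketch. The column names, the tolerance, and the per-region grouping are assumptions for the example, not the client's actual schema or tooling.

```python
# A sketch of an aggregation reconciliation check: verify a reported
# summary table against the detail rows it should have been built from.
# Column names and tolerance are illustrative assumptions.
from collections import defaultdict

def summarise(detail_rows):
    """Aggregate detail rows into per-region totals."""
    totals = defaultdict(float)
    for row in detail_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

def reconcile(detail_rows, reported_summary, tolerance=0.01):
    """Return regions whose reported totals drift from the detail data."""
    expected = summarise(detail_rows)
    failures = []
    for region, reported in reported_summary.items():
        if abs(expected.get(region, 0.0) - reported) > tolerance:
            failures.append(region)
    return failures

detail = [
    {"region": "North", "amount": 100.0},
    {"region": "North", "amount": 50.0},
    {"region": "South", "amount": 75.0},
]
summary = {"North": 150.0, "South": 80.0}  # South is wrong on purpose
print(reconcile(detail, summary))  # → ['South']
```

Checks like this are the kind of thing that would back the UAT parameters and sign-off checklists the ad mentions.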

What We’re Looking For:

  • Experience with RPA tools.
  • Strong knowledge of data automation technologies, including Power Automate, Power Apps and data storage solutions.
  • Excellent analytical skills with a keen eye for detail.
  • Ability to communicate complex concepts clearly to a diverse audience.
  • A collaborative spirit with a commitment to achieving business objectives.

If you’re excited about the possibility of making a difference through RPA and Data Automation, we’d love to hear from you! Apply now and become a vital part of our mission to enhance public services through data-driven insights.

Let’s innovate together!

Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.

Senior Data Engineer
VC Talent
London
Hybrid
Senior
£59,000 - £64,000
RECENTLY POSTED

A successful FTSE-listed organisation with a friendly, close-knit culture is looking for a Senior Data Engineer to help shape and deliver its evolving data platform. This is a hands-on, high-impact role in a smaller organisation where you will act as both a senior engineer and strategic data partner, designing solutions that support both operational and long-term business goals.

You will play an important role in the company’s migration to Microsoft Fabric, while continuing to build and optimise its existing SQL Server-based data warehouse environment. While primarily a backend engineering role, you will also support reporting via SSRS/Microsoft Power BI, with an increasing focus on AI-driven capabilities over time. As the Senior Data Engineer, you will work closely with the Solutions Architect and collaborate with teams across the business, with occasional travel to France and Germany. This opportunity suits someone with in-depth SQL experience who is looking to work with modern tools like Microsoft Fabric.

Essential Skills

  • Strong SQL Server experience (T-SQL, SSIS, SSRS, stored procedures, functions, triggers)
  • Data warehouse architecture, build and maintenance
  • Excellent communication skills and ability to gather requirements from non-technical stakeholders at all levels
  • Degree in Computer Science or a STEM subject
  • Comfortable working 4 days per week onsite in Vauxhall

Desirable

  • Microsoft Fabric, Azure Synapse, Azure Data Factory, Snowflake, Databricks
  • Python or any other relevant language
  • Experience with tools such as Power BI, Qlik or Tableau

The role offers a package of £64k+, plus a bonus of up to 15%, along with a generous pension, 28 days' holiday, private medical insurance, permanent health insurance, study support, and a modern office with an onsite gym. If your experience aligns, apply with an up-to-date CV as soon as possible, as this is expected to be a popular opportunity.

Azure Data Engineer
Opus Recruitment Solutions
Bristol
Hybrid
Mid - Senior
£400/day - £500/day
RECENTLY POSTED

Azure Data Engineer | £400 - £500 Outside IR35 | Bristol | Hybrid | 6‑Month Initial Term |

A large‑scale data transformation programme is underway, and our client is looking for an experienced Azure Data Engineer to support the rebuild of their cloud data platform. This role is hands‑on and delivery‑focused — you’ll be designing and developing Azure‑native data pipelines, working extensively with Databricks, and shaping scalable data models across the Microsoft ecosystem. The role requires you to be on site in Bristol 4 days per week; please only apply for this position if you are local enough to do this without relocating.

What you’ll be doing

Build, enhance and maintain data pipelines using Azure Databricks, Data Factory, and Delta Lake

Develop and optimise Lakehouse components and cloud‑based data flows

Create robust data models to support analytics, MI and downstream reporting

Assist in migrating legacy warehouse assets into a modern Azure environment

Contribute to cloud architecture decisions, data standards and best‑practice engineering patterns

Develop reliable Python and PySpark code to support data ingestion, transformation, and end‑to‑end processing.

What you’ll bring

Strong hands‑on experience across Azure Data Services (ADF, ADLS, Synapse, Databricks)

Excellent SQL skills, with experience in performance tuning and optimisation

Solid understanding of data modelling (star schema, medallion, ETL frameworks)

Ability to work with complex, inconsistent or legacy data sources

Experience building scalable, production‑ready pipelines in a cloud environment
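
To make the medallion pattern mentioned above concrete, here is a toy bronze → silver → gold flow in plain Python rather than the Databricks/Delta Lake stack the role actually uses. Field names and cleaning rules are assumptions for the sake of the example.

```python
# Toy medallion-architecture illustration: raw (bronze) records are
# cleaned into a silver layer, then aggregated into a gold layer for
# reporting. Field names and rules are illustrative assumptions.
from collections import defaultdict

def to_silver(bronze_rows):
    """Bronze → silver: drop malformed rows, normalise types and casing."""
    silver = []
    for row in bronze_rows:
        if row.get("amount") is None or not row.get("customer"):
            continue  # quarantine malformed records
        silver.append({
            "customer": row["customer"].strip().lower(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Silver → gold: business-level aggregate for reporting."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["customer"]] += row["amount"]
    return dict(totals)

bronze = [
    {"customer": " Acme ", "amount": "10.5"},
    {"customer": "acme", "amount": 4.5},
    {"customer": "", "amount": 99},        # dropped: no customer
    {"customer": "Beta", "amount": None},  # dropped: no amount
]
print(to_gold(to_silver(bronze)))  # → {'acme': 15.0}
```

In the role itself each layer would live in Delta tables and the transforms would run as Databricks/ADF pipelines, but the layering logic is the same.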


Software Engineer - AI and Data, Perm, Midlands
BMR Associates Limited
Birmingham
Hybrid
Junior - Mid
£50,000
RECENTLY POSTED

Software Engineer AI and Data, Permanent, West Midlands. Artificial Intelligence, AI, Machine Learning, ML, MLOps, LLM, Statistical Analysis, Software, Python, Data, Prompt Engineering.

Due to continued growth and investment in tech, this leading-edge organisation is looking to the market for an AI Developer to join an exciting team looking to scale their AI development capability. I am seeking a creative Software Engineer with a passion for AI and Machine Learning who is a team player and has the attitude to grow as a developer, to support this client on their AI journey.

Working in an Agile environment, you will help transform high-quality data into AI-powered products, including conversational and agent-based solutions. This position will be hybrid, and you will be required onsite in their Birmingham office 2 days per week.

You will ideally have 2 years' software development experience, developing and maintaining software components/services using C# and/or Python. If you have previously supported AI-powered features, this will add considerable weight to your application.

At this level you will be expected to demonstrate superb communication skills, both verbal and written, along with the ability to work collaboratively in an agile environment and cross-functional teams, as well as to develop excellent working relationships with key stakeholders at all levels. This role requires someone who is reliable, motivated, and enthusiastic about AI and Machine Learning, with the desire to learn and grow with the business.

If you are interested, please send your CV and call me for more information.

Lead PySpark Engineer
Randstad Technologies Recruitment
London
Remote or hybrid
Senior
£281/day - £292/day
RECENTLY POSTED

Lead PySpark Engineer

As the Technical Lead, you will drive the high-stakes migration of legacy SAS analytics to a modern, cloud-native PySpark ecosystem on AWS. This isn’t just a lift and shift: you will refactor complex procedural logic into scalable, production-ready distributed pipelines for a Tier-1 financial services environment.

Core Responsibilities

  • Engineering Leadership: Design and develop complex ETL/ELT pipelines and Data Marts using PySpark, EMR, and Glue.
  • Legacy Modernisation: Architect the conversion of SAS Base/Macros into modular, testable Python code using SAS2PY and manual refactoring.
  • Performance Tuning: Optimise Spark execution (partitioning, shuffling, caching) to ensure cost-efficient processing of massive financial datasets.
  • Quality & Governance: Implement rigorous CI/CD, unit testing, and data reconciliation frameworks to ensure “penny-perfect” accuracy.

Technical Stack

  • Engine: PySpark (Expert), Python (Clean Code/SOLID principles).
  • AWS: EMR, Glue, S3, Athena, IAM, Lambda.
  • Data Modeling: SCD Type 2, Fact/Dimension tables, Data Vault/Star Schema.
  • Legacy: Proficiency in reading/debugging SAS (Base, Macros, DI Studio).
  • DevOps: Git-based workflows, Jenkins/GitLab CI, Terraform.
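
The data modelling line above names SCD Type 2 — keeping history in a dimension table by expiring the current row when an attribute changes and appending a new version. Here is a hedged pure-Python sketch of that pattern; in this role it would be a PySpark/Delta MERGE, and the record shape and dates below are illustrative assumptions.

```python
# Minimal SCD Type 2 sketch: expire the current dimension row on change
# and append a new current version. Record shape is an assumption.

def apply_scd2(dimension, updates, load_date):
    """Expire changed rows and append new current versions in place."""
    current = {r["key"]: r for r in dimension if r["is_current"]}
    for upd in updates:
        existing = current.get(upd["key"])
        if existing and existing["value"] == upd["value"]:
            continue  # no attribute change, nothing to do
        if existing:
            existing["valid_to"] = load_date
            existing["is_current"] = False
        dimension.append({
            "key": upd["key"], "value": upd["value"],
            "valid_from": load_date, "valid_to": None, "is_current": True,
        })
    return dimension

dim = [{"key": 1, "value": "Bristol", "valid_from": "2024-01-01",
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"key": 1, "value": "London"}], "2025-06-01")
print([(r["value"], r["is_current"]) for r in dim])
# → [('Bristol', False), ('London', True)]
```

The same expire-and-append logic is what a Delta Lake `MERGE` with a change predicate implements at scale, with the reconciliation frameworks the ad mentions verifying the row counts either side.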

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Data Scientist - SC Cleared
Hays Technology
London
Hybrid
Mid - Senior
£500/day - £550/day
RECENTLY POSTED

Your new company
One of the most influential Central Government Organisations in the current economic climate
Your new role
Data Scientist - SC Cleared - SQL, Python & R
What you’ll need to succeed
My client is looking for an Analytical Data Scientist, leading/working alongside a team of data scientists to deliver key outputs for commissioned projects (use cases).
You will also support the development of GSCIP through developing tools, data visualisations, and data available for analysis.
You will have the opportunity to work on bespoke data science projects to improve understanding and interpretation of the data, and enhance use case delivery capability.
This role can only be offered to candidates with Active and Existing SC or DV Clearance.

Essential Criteria:

  • Experience of delivering high-quality coding projects that make use of at least two of: SQL, Python & R
  • Experience of engaging with stakeholders across government to scope and deliver impactful analysis
  • Experience of leading teams through complex data science projects - a track record of delivering complex data science projects on time to meet user needs, and overcoming any challenges
  • A demonstrable commitment to developing your knowledge and expertise
  • Significant technical data science knowledge

Desirable Criteria:

  • Understanding of supply chain data sources
  • Experience with Pyspark/Spark SQL
  • Experience delivering interactive visualisations in Python and SQL
  • Graph data experience, including with graph query languages and knowledge graphs
  • Experience working on technical commercial procurements

This is a hybrid role with 40% office attendance, and Monday is a mandatory team day. Successful candidates will start as soon as possible; interviews commence from 16/3/26.

What you’ll get in return
This is an excellent role to join the GSCI Programme as an experienced Data Scientist, ensuring existing delivery and data standards are maintained and services scaled up!
What you need to do now

If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed)

RPA Developer
Adecco
Guildford
In office
Mid - Senior
£700/day
RECENTLY POSTED

Job Advertisement: RPA Developer

Are you ready to make a significant impact in the world of public services? Our client, Surrey Police, is seeking a talented Robotics Consultant to join their dynamic team in Guildford on a temporary full-time basis. With a daily rate of £700, this is your opportunity to showcase your expertise in Robotic Process Automation (RPA) and drive innovation within the public sector!

PLEASE NOTE DUE TO THE POLICE VETTING CRITERIA YOU MUST HAVE RESIDED WITHIN THE UK CONTINUOUSLY FOR AT LEAST 5 YEARS AT THE TIME OF APPLICATION.

Key Responsibilities:

Testing, Deployment & Maintenance:

  • Validate bot functionality and ensure optimal performance under diverse conditions.
  • Configure bots for production use, monitor their operations, and implement updates as processes evolve.

Communication & Collaboration:

  • Clearly articulate technical solutions to stakeholders and non-technical teams.
  • Collaborate across business units and IT teams to enhance automation outcomes.

Strategic Planning:

  • Plan automation initiatives, set achievable goals, and anticipate future scaling requirements.

What You’ll Need:

Technical Skills:

  • Basic programming knowledge in languages like Python, Java, or C#.
  • Proficiency in RPA tools such as UiPath, Blue Prism, or Automation Anywhere.
  • SQL and database integration skills to support data-driven automation.
  • Experience in API integration and UI automation.

Analytical & Problem-Solving Abilities:

  • Strong process analysis and mapping skills to identify automation opportunities.
  • Troubleshooting and debugging skills to resolve bot errors and ensure smooth operations.

Workflow & UX Design:

  • Experience in designing workflows using UML or BPMN to optimize processes.
  • User experience awareness to create intuitive bot interfaces for seamless adoption.

Soft Skills:

  • Exceptional communication and teamwork abilities.
  • Strategic thinking to align automation solutions with business goals.

Note: This role is temporary, and applicants must be available for full-time work in Guildford. Only successful candidates will be contacted for interviews.

Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.

Adecco acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. The Adecco Group UK & Ireland is an Equal Opportunities Employer.

By applying for this role your details will be submitted to Adecco. Our Candidate Privacy Information Statement explains how we will use your information - please copy and paste the following link into your browser (url removed)

Data Scientist
Vermillion Analytics
Nottingham
Hybrid
Junior - Mid
£40,000 - £60,000
RECENTLY POSTED

Job Opportunity: Data Scientist
Performance Intelligence Startup · Remote-first · England or Spain
£45,000 – £60,000 depending on experience

About the Company

This is a really exciting opportunity with a fast-growing performance intelligence startup with a genuinely clever concept — think elite sports coaching, but applied to the workplace. The company helps employees, teams, and organisations reach peak performance by uncovering the root causes that hold people back, delivering personalised, data-driven insights and actionable solutions. They drive measurable business impact across things like sales growth, AI adoption, employee retention, burnout reduction, and risk management. It is, in short, a platform that takes data seriously and actually does something useful with it. Refreshing, right? The company is building toward an agent-first platform where AI becomes the primary interface for users — and they are looking for a talented Data Scientist to help make that a reality.

The Role

(No, you won't just be cleaning CSV files for eternity — though fair warning, some of that is inevitable in any data job.) This is a hands-on, high-impact position where the successful candidate will:

  • Build, train, and deploy machine learning models — from exploratory analysis through to production-ready solutions that power the platform. Real models, in the real world. Not just PowerPoints about models.
  • Develop and deploy AI agents using Databricks, Google AI Studio and/or Azure AI Foundry. These agents are the primary user interface for the product — so this is genuinely central to what the company does.
  • Apply statistical rigour to psychometric and survey data, validating the company's proprietary measurement framework and identifying correlations that drive actionable insights.
  • Advance analytical capabilities beyond aggregation — into predictive modelling, deep learning, and generative AI applications. The ambition is there; they just need the right person to execute it.
  • Work closely with the CTO and tech team to integrate models into the platform. Collaboration is key — this is not a lone-wolf role.
  • Translate complex findings into clear, actionable insights for product decisions and client-facing reports. The ability to explain results to non-technical stakeholders is genuinely valued here.

✅ What They're Looking For

The must-haves (non-negotiable, unfortunately — they checked):

  • 2–4 years of hands-on experience in data science, machine learning, or applied statistics.
  • Strong Python skills with experience in ML frameworks such as scikit-learn, PyTorch, TensorFlow, or similar.
  • Solid grounding in statistics — hypothesis testing, regression, correlation analysis, and experimental design.
  • Experience with SQL and data manipulation at scale.
  • Comfortable with ambiguity and working independently in a fast-moving startup environment. This is not a company with 47 layers of sign-off. Decisions happen quickly.
  • Professional English proficiency (B2+ level).

⭐ Nice to Have

Bonus points — none of these are dealbreakers, but they would make a hiring manager very happy:

  • Experience with GenAI, LLMs, SLMs, prompt engineering, or agentic AI workflows. Given the direction of the product, this is increasingly relevant.
  • Cloud AI platform experience — GCP, Azure, Databricks, or similar.
  • Startup experience. You know what wearing multiple hats feels like, and you didn't hate it.
  • A background in behavioural science, organisational psychology, or psychometrics — the company works heavily with survey data and measurement validation.
  • Spark, Databricks, or distributed computing experience.
  • Published research or an academic background in statistics, ML, or a quantitative field.

What's on Offer

  • Salary of £45,000–£60,000 depending on experience, with bi-annual salary reviews.
  • Remote-first, flexible working environment. The company trusts its people to get the job done.
  • A genuine opportunity to join a high-potential startup at an exciting stage and directly influence the product, technical direction, and culture.
  • High ownership — ideas actually get built here, rather than disappearing into a backlog black hole.
  • A high-growth career trajectory as the team and product scale.
  • A team culture that values passion, autonomy, continuous learning, and collaborative problem-solving. Small egos encouraged.

Location

Valencia, Spain or East Midlands, England is preferred — but the company is open to candidates across England or Spain. The role requires willingness to meet in person once a month (and more as needed). Worth it for the Valencia weather alone, frankly.

Data Engineer
TEAM
Chandler's Ford
Hybrid
Mid - Senior
£500/day
RECENTLY POSTED

A Data Engineer is needed for a contract where your work will directly shape how a business trusts, structures, and uses its data. If you enjoy building reliable pipelines, improving models, and turning messy data into dependable assets, this is the kind of project where your impact is felt quickly.

This role focuses on practical delivery. You’ll be strengthening the foundations of analytics and reporting by building dependable solutions that teams across the organisation rely on every day.

What’s in it for you

  • £500 per day contract with immediate impact on a growing environment
  • Hybrid working with a balanced onsite and remote setup
  • A delivery-focused project where practical engineering skills are valued
  • The opportunity to improve and shape core assets used across the business
  • A collaborative environment working closely with technical teams and stakeholders
  • Real ownership over the reliability and structure of pipelines and models

What you’ll be getting stuck into as a Data Engineer

  • Building and maintaining scalable pipelines that support analytics, reporting, and operational data use
  • Developing and refining warehouse models that align with real business requirements
  • Writing and optimising SQL for transformation, integration, and performance improvements
  • Strengthening quality through validation, governance, and structured data workflows
  • Delivering reliable, accessible datasets for reporting and decision-making
  • Supporting monitoring, testing, and continuous improvement across data processes

What you’ll bring to the table as a Data Engineer

  • Strong hands-on experience delivering practical solutions
  • Strong SQL capability for transformation, modelling, and optimisation
  • Previous experience designing and working with data warehouse models
  • Experience building and maintaining production pipelines
  • Exposure to platforms such as Databricks, Synapse, or Microsoft Fabric

If you're a Data Engineer ready to step into a contract where you can quickly add value by building dependable pipelines and models, apply now to learn more.

Candidate Source Ltd is an advertising agency. Once you have submitted your application it will be passed to the third party Recruiter who is responsible for processing your application. This will include holding and sharing your personal data; our legal basis for this is legitimate interest, subject to your declared interest in a job. Our privacy policy can be found on our website, and we can be contacted to confirm who your application has been forwarded to.

Senior Data Analyst
Greater London Authority (GLA)
London
Hybrid
Senior
Private salary
RECENTLY POSTED

Communities & Skills

Collaborative, open, inclusive and fair - we work with and through partners to ensure Londoners can shape healthy, empowered and productive lives. Communities and Skills is led by Executive Director Tunde Olayinka and comprises the following units: Civil Society & Sport, Communities & Social Policy, Group Public Health Unit, Skills & Employment and Health, Children & Young Londoners.

About the team

The Skills & Employment Unit is responsible for overseeing adult skills delivery in London following delegation of the Adult Skills Fund from the DfE to the Greater London Authority in 2019 and the introduction of Skills Bootcamps in 2022.

The Skills & Employment Unit’s Funding Policy & Systems Team is responsible for data collection and processing related to London’s adult education and skills programmes and produces a range of data products to support delivery of the Mayor’s priorities in this area.

About the role

Sitting in the wider Funding Policy & Systems Team, the role will lead and support a small team of data analysts to deliver software and data systems to manage our adult skills programmes.

Working mainly in PostgreSQL and Python, alongside the Microsoft Office Suite and Power BI, the role will involve implementing change controls through updated code, using our tools to produce new reports, investigating and implementing new technologies, designing and implementing quality assurance tests, reviewing the work of colleagues, and helping with training.

This is a hybrid working opportunity. The team is based at 169 Union Street, SE1 0LL.

What your day will look like:

  • Support the team to deliver software and data systems to collect, store and process programme data and to deliver services and data products required to manage skills programmes and pay providers.
  • Design new and adapt existing data solutions to meet programme needs.
  • Implement a robust approach to testing and quality assurance for all software changes prior to release.
  • Investigate data processing requirements for new programmes and data collections.
  • Review workflows and adjust priorities to ensure deadlines are met.

Provide analysis and data processing required to operate key business processes or develop policy, including support for the ASF and Bootcamps data publication, the London Learner Survey and evaluation programmes, and wider skills programmes as necessary. Provide ad hoc analysis, incorporating statistically robust methodology as needed and working with policy and delivery colleagues, to help ensure ASF funding can best support the Mayor’s priorities.

Skills, knowledge and experience:

To be considered for the role you must meet the following essential criteria:

  • Ability to read and understand Python and SQL code (or similar languages with demonstrated ability to learn new programming languages), and set up and support others to use the appropriate environments and tools.
  • Ability to use version control tools such as GitHub to review code and provide feedback to developers.
  • Strong proficiency in analysing data and building reproducible processes using code.
  • Ability to review code and provide feedback in a constructive manner.
  • Ability to explain technical issues to non-technical colleagues.
  • A knowledge of adult skills programmes and the national data collection system and key dataset, the Individualised Learner Record, or demonstrated ability to learn new programmes and datasets quickly.

Behavioural competencies

Research and analysis is gathering intelligence (information, opinion and data) from varied sources, making sense of it, testing its validity and drawing conclusions that can lead to practical benefits.

Level 3 indicators of effective performance:

  • Expands networks to gain new information sources for research and policy development
  • Identifies and implements methods to ensure intelligence is of a high quality
  • Encourages others to analyse data from different angles, using multiple perspectives to identify connections and new insights
  • Tailors research investment in line with likely impact for Londoners and policy priorities
  • Retains a bigger picture view, ensuring research recommendations are appropriate and practical for the GLA and its stakeholders

Problem solving is analysing and interpreting situations from a variety of viewpoints and finding creative, workable and timely solutions.

Level 3 indicators of effective performance:

  • Clarifies ambiguous problems, questioning assumptions to reach a fuller understanding
  • Actively challenges the status quo to find new ways of doing things, looking for good practice
  • Seeks and incorporates diverse perspectives to help produce workable strategies to address complex issues
  • Initiates consultation on opportunities to improve work processes
  • Supports the organisation to implement innovative suggestions

Strategic thinking is using an understanding of the bigger picture to uncover potential challenges and opportunities for the long term and turning these into a compelling vision for action.

Level 3 indicators of effective performance:

  • Translates GLA vision and strategy into practical and tangible plans for own team or delivery partners
  • Consistently takes account of the wider implications of team’s actions for the GLA
  • Encourages self and others to think about organisation’s long term potential
  • Informs strategy development by identifying gaps in current delivery or evidence
  • Takes account of a wide range of public and partner needs to inform team’s work

Communicating and influencing is presenting information and arguments clearly and convincingly so that others see us as credible and articulate and engage with us.

Level 2 indicators of effective performance:

  • Communicates openly and inclusively with internal and external stakeholders
  • Clearly articulates the key points of an argument, both in verbal and written communication
  • Persuades others, using evidence based knowledge, modifying approach to deliver message effectively
  • Challenges the views of others in an open and constructive way
  • Presents a credible and positive image both internally and externally

Stakeholder focus is consulting with, listening to and understanding the needs of those our work impacts and using this knowledge to shape what we do and manage others’ expectations.

Level 2 indicators of effective performance:

  • Seeks to understand requirements, gathering extra information when needs are not clear
  • Presents the GLA positively by interacting effectively with stakeholders
  • Delivers a timely and accurate service
  • Understands the differing needs of stakeholders and adapts own service accordingly
  • Seeks and uses feedback from a variety of sources to improve the GLA’s service to Londoners

Planning and organising is thinking ahead, managing time, priorities and risk, and developing structured and efficient approaches to deliver work on time and to a high standard.

Level 2 indicators of effective performance:

  • Prioritises work in line with key team or project deliverables
  • Makes contingency plans to account for changing work priorities, deadlines and milestones
  • Identifies and consults with sponsors or stakeholders in planning work
  • Pays close attention to detail, ensuring team’s work is delivered to a high standard
  • Negotiates realistic timescales for work delivery, ensuring team deliverables can be met

The GLA Competency Framework Guidelines further detailing each competency and the different level indicators can be found here: GLA Competency Framework

How to apply

If you would like to apply for the role you will need to submit the following:

  • Up to date CV
  • Personal statement with a maximum of 1500 words.
  • Please ensure you address how you demonstrate the essential criteria outlined above in the advert.

Please ensure your CV and Personal Statement have a maximum file size of 1.5MB each and upload your Personal Statement to the ‘CV and Cover Letters’ section of the form, ensuring you address the technical requirements and competencies in your Personal Statement.

Word or PDF format is preferred; do not include any photographs or images. Please ensure your CV and Personal Statement are saved with the job reference number as part of the naming convention (e.g., "CV - applicant name - 012345").

As part of GLA’s continuing commitment to be an inclusive and equal opportunity employer we will be removing personal identifiable information from CVs and Personal Statements that could cause discrimination.

We may close this advert early if we receive a high volume of suitable applications.

If you have questions about the role

If you wish to talk to someone about the role, the hiring manager, Phil Vabulas, would be happy to speak to you. Please contact them at (email removed).

If you have any questions about the recruitment process, contact the team who support the GLA with recruitment.

Is this role eligible for sponsorship?

This role DOES NOT meet the criteria for sponsorship for external candidates. It may meet the criteria for sponsorship for some internal candidates. Click apply for full job details.

Senior Power BI Developer
Hays Technology
London
Hybrid
Senior
£80,000 - £120,000
RECENTLY POSTED

Your new company
This is an opportunity to join an expanding project controls and reporting function supporting a nationally significant major project. You will play a key role in delivering high quality digital reporting, analytics, and performance insights across a complex, large scale environment.

Your new role
As a Power BI Developer, you will lead the development of dashboards, data models and analytical insights that support programme performance and decision-making. Core responsibilities include:

  • Developing, publishing and scheduling Power BI dashboards and reports to support project and programme performance monitoring
  • Translating business needs into robust data models aligned with WBS, CBS, and OBS structures
  • Analysing complex data sets and presenting clear visual insights for senior stakeholders
  • Delivering digital reporting solutions through the project’s enterprise data platform
  • Supporting monthly reporting cycles with intelligent analysis and performance insights
  • Producing training materials and supporting users to ensure effective dashboard adoption
  • Liaising with internal IT teams and external suppliers to maintain high quality reporting capability

What you’ll need to succeed

  • Advanced Power BI experience (DAX, Power Query/M) and strong Excel skills
  • Experience working within large-scale project environments, ideally infrastructure, utilities, nuclear, defence, or other high-complexity engineering sectors
  • Ability to design sophisticated dashboards and data models that support project controls functions
  • Strong analytical and problem solving skills, with excellent communication and stakeholder engagement abilities

What you’ll get in return

  • Hybrid working
  • Competitive benefits package

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed).

Data Engineer - SC Cleared
Sanderson Government and Defence
London
Hybrid
Mid - Senior
£70,000
RECENTLY POSTED

Data Engineer
Salary: £40-72K + Benefits
Location: London or Manchester (aligned to office for client on-site requirements)
Working Pattern: Hybrid / On-site depending on client needs
Security Clearance: SC Clearance required

You will join a people-focused digital consultancy supporting data-driven services across the UK public sector. The organisation values collaboration, inclusion, and work-life balance, and actively supports continuous learning and professional development through access to training, modern engineering tools, and supportive multidisciplinary teams.

The consultancy works closely with government departments and public sector organisations to design, build, and operate secure, scalable data platforms that enable advanced analytics, data science, and machine learning. Diversity and inclusion are core values, and hiring decisions are based on skills, experience, and potential. Empowering individuals and building strong teams are central to delivering meaningful outcomes for clients and citizens.

This role is suited to Data Software Engineers with a strong technical foundation and an interest in data engineering, data science, and machine learning. You will work within agile, multidisciplinary teams alongside data scientists, platform engineers, and stakeholders to build robust data processing systems in secure environments.

Role Responsibilities

  • Design, build, and maintain scalable data processing and integration systems to support data science and analytics workloads.
  • Develop high-quality, well-tested software using a Test-Driven Development (TDD) approach.
  • Collaborate closely with data scientists to enable effective use of data for analytics and machine learning.
  • Build and operate cloud-based solutions, with a strong focus on AWS services.
  • Work with messaging, streaming, or data flow technologies to support real-time and batch data processing.
  • Contribute to infrastructure and platform automation using Infrastructure as Code.
  • Participate in agile ceremonies, technical design discussions, and code reviews.
  • Ensure solutions meet security, performance, and reliability requirements within public sector environments.

What You Will Bring to the Team

  • A strong interest in data, particularly data engineering, data science, or machine learning.
  • A solid technical background, with experience in Java, Python, TypeScript, or similar languages.
  • Experience developing software using TDD or a strong willingness to adopt TDD practices.
  • Strong problem-solving skills and the ability to work collaboratively within multidisciplinary teams.
  • Good communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
  • A proactive mindset with attention to detail and a commitment to quality.

Desirable Skills and Experience

  • Experience working with cloud platforms, ideally AWS.
  • A strong Linux background.
  • Experience with data integration and messaging technologies such as Apache NiFi, Apache Kafka, RabbitMQ, or similar tools.
  • Experience using Infrastructure as Code tools such as Terraform or CloudFormation.
  • Previous experience working in a consultancy or public sector delivery environment.
  • Familiarity with secure or regulated environments.

Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients. If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

School Information Systems (SIS) and Data Systems Specialist
The American School in London
London
In office
Mid
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE

The School Information Systems (SIS) and Data Systems Specialist is a non-teaching, operational role within the Technology, Information Literacy, and Media (TILM) department. The primary responsibility of the role is the day-to-day administration, configuration, and effective operation of the Student Information System (Veracross), which is a core platform underpinning the school’s academic and operational processes.

Alongside this core responsibility, the role contributes to the development, maintenance, and improvement of applications, data workflows, and automations that support the school’s operations. While application development and automation form an important part of the role, particularly over time, stewardship of Veracross, data integrity, and reliable system operation are the priority, especially during the initial period in post.

The SIS and Data Systems Specialist reports to and works under the direction of the Network and Data Integration Manager. All development, configuration, and integration work is undertaken within agreed priorities and architectural oversight, ensuring alignment with departmental strategy, security expectations, and school needs.

Summary of Major Responsibilities

Veracross Administration and Development:

  • Manage the day-to-day administration and configuration of the Veracross Student Information System.
  • Configure schedules, calendars, parent conferences, access controls, and user permissions.
  • Maintain data quality, data integrity, and appropriate access across Veracross modules.
  • Produce reports, data extracts, and imports to support academic and operational teams.
  • Use Veracross APIs, data structures, and tools to develop integrations, scripts, and automations.

Applications and Automation:

  • Develop and maintain scripts, workflows, and lightweight applications that improve operational efficiency.
  • Support the design and development of in-house applications to meet school needs.
  • Migrate legacy solutions, including FileMaker Pro databases and applications, to modern and sustainable platforms.

Data and Systems Support:

  • Support data flows and integrations between Veracross and other school systems under the guidance of the Digital Integration Manager.
  • Lead and support the data rostering (students and staff) for internal and external assessments including NWEA MAP, ACER ISA, and aimswebPlus in coordination with the Office of Teaching & Learning.
  • Troubleshoot application, data, and integration issues and escalate where appropriate.
  • Maintain clear technical documentation for configurations, scripts, and applications.

Security:

  • Apply secure development and administration practices in all work.
  • Follow data protection, access control, and safeguarding requirements when handling systems and data.
  • Identify and escalate risks, data issues, or security concerns appropriately.

Collaboration and Professional Growth:

  • Work collaboratively with colleagues across academic and operational teams to understand requirements and deliver effective technical solutions.
  • Respond to and resolve assigned Helpdesk tickets related to Veracross, applications, data, and integrations, in line with departmental service expectations.
  • Escalate issues appropriately and keep stakeholders informed of progress.
  • Participate in team meetings, planning, and professional development.
  • Perform other duties within the scope, spirit, and purpose of the role, as requested by the Digital Integration Manager or Director of Technology.

Essential Qualifications/Experience:

  • Experience working with application platforms, databases, or information systems.
  • Demonstrable experience with scripting, automation, or application development.
  • Experience working with structured data and reporting.
  • Strong problem-solving skills and attention to detail.
  • Ability to work collaboratively and take technical direction.
  • Strong written and verbal communication skills.
  • A proven commitment to safeguarding and the welfare of children and young people.

Desirable Qualifications / Experience:

  • Experience working with Veracross or a comparable Student Information System.
  • Experience with FileMaker.
  • Experience with AppSheet.
  • Experience with APIs, data integrations, and workflow automation.
  • Experience migrating or modernizing legacy systems.
  • Experience working in a school or similar complex organization.

This position description is current at the date shown but following consultation may be changed to reflect or anticipate changes in the role that are commensurate with the job title and salary.

The American School in London is committed to safeguarding and promoting the welfare of children and young people and expects all trustees, employees and volunteers to share this commitment. All new appointments will be subject to appropriate checks: Disclosure and Barring Service (DBS enhanced), Disqualification by Association Self-Declaration, Declaration of Criminal Record, checks against the Teaching Regulation Agency (TRA) Prohibition List (Teacher Status Checks) including Identity, Address, Date of Birth, a Full Employment History, Right to Work in the UK, overseas checks where applicable, at least 2 references (one with current or most recent employer, where appropriate) and original documentation of Qualifications (where appropriate). For positions into Senior Management a Prohibition from Management Check (s128 Directive) will also be undertaken.

All posts involving direct contact with children are exempt from the Rehabilitation of Offenders Act 1974. However, amendments to the Exceptions Order 1975 (2013 & 2020) provide that certain spent convictions and cautions are ‘protected’. These are not subject to disclosure to employers and cannot be taken into account. Guidance and criteria on the filtering of these cautions and convictions can be found on the Ministry of Justice website. Shortlisted candidates will be asked to provide details of all unspent convictions and those that would not be filtered, prior to the date of the interview. You may be asked for further information about your criminal history during the recruitment process. If your application is successful, this self-disclosure information will be checked against information from the Disclosure & Barring Service before your appointment is confirmed.

ASL is dedicated to fostering courageous global citizenship in a diverse and inclusive school environment. In our international community, we aspire for the cultures and backgrounds of our employees to mirror those of our families and student body, and we enthusiastically welcome applications from candidates who bring diverse life experiences, perspectives and skills. Educators with knowledge of global education and prospective applicants for any position who are committed to diversity and inclusion are particularly welcome to apply. The American School in London will not discriminate against an applicant or employee based on race, color, religion, creed, national origin or ancestry, sex, age, physical or mental disability, genetic information, gender identity or expression, sexual orientation, marital status, maternity or parental status, or any other legally recognised protected basis under local law. Read our Diversity, Equity and Inclusion statement here.

Data Engineer
Meritus
London
Hybrid
Mid - Senior
£550/day - £600/day
RECENTLY POSTED

Contract Data Engineer - Azure / Databricks
Location: London (2 days onsite)
Rate: £550-£600 per day (Inside IR35)
Contract: 6 months

A leading UK financial institution is seeking an experienced Data Engineer to support the development and enhancement of a modern cloud-based data platform. This role will focus on building scalable data pipelines and supporting the evolution of a cloud-first data architecture.

Key Responsibilities

  • Design and develop scalable data pipelines using modern cloud technologies.
  • Build and optimise distributed data processing solutions using Databricks, Spark and Python.
  • Develop and maintain data integration workflows using Azure Data Factory.
  • Work with large datasets stored in Azure Data Lake environments.
  • Collaborate with architects, analysts and engineering teams to deliver reliable and secure data solutions.
  • Contribute to improving data quality, performance and operational monitoring across the platform.

Key Skills & Experience

  • Strong experience with Azure Databricks, Azure Data Factory and Azure Data Lake.
  • Advanced Python, SQL and Spark (PySpark) development experience.
  • Experience building and optimising ETL / data pipelines in cloud environments.
  • Knowledge of CI/CD and version control (Azure DevOps, GitHub or similar).
  • Experience working with large-scale distributed data processing systems.

Contract Details

  • 6-month initial contract
  • £550-£600 per day (Inside IR35)
  • Hybrid working: 2 days per week onsite in London

If you’re an experienced Data Engineer with strong Azure and Databricks expertise and are available for a new contract, please apply or get in touch to discuss further.

Data Scientist - UKIC DV Clearance Required
Matchtech
London
In office
Mid - Senior
Private salary
RECENTLY POSTED

Our client, a prominent entity in the Defence & Security sector, is seeking a meticulous Data Scientist with a strong understanding of Linux, Data Science, and AWS to join their team. This is a contract position located in London for a duration of 12 months, requiring a UKIC DV clearance to undertake sensitive and impactful work.

Key Responsibilities:

  • Develop and deploy data science solutions to support national security missions.
  • Build and optimise data pipelines for processing large, complex datasets.
  • Apply machine learning and statistical techniques to extract actionable insights.
  • Create clear visualisations to communicate findings to stakeholders.
  • Collaborate in agile teams to deliver robust, scalable solutions.
  • Support cloud-based deployments and integration into operational environments.

Job Requirements:

  • Proficiency with scripting languages like Python for data exploration, cleansing, and manipulation.
  • Knowledge of machine learning models and statistical techniques, including validation.
  • Understanding of data analytics and data visualisation techniques.
  • Ability to process large datasets via batch or stream processing using Apache Spark or similar tools.
  • Exposure to techniques used for acquiring and fusing data.
  • Experience with cloud platforms (preferably AWS) or implementing cloud-based data science solutions.
  • Knowledge of, or willingness to learn, DataOps.
  • Experience with structured or unstructured databases.
  • Experience with container technologies, including Docker and Kubernetes.
  • Familiarity with agile ways of working.
  • Understanding of software best practices including version control, CI/CD pipelines for automated testing, and deployment.
  • Proficiency in Linux.
  • BPSS & Current UKIC DV clearance.

Additional Details:

  • Location: 5 days per week onsite - London
  • Duration: 12 months

If you are a dedicated Data Scientist with the necessary clearances and skills, and are eager to contribute to mission-critical projects in the realm of national security, we want to hear from you. Apply now to take the next step in your career with our client.

TM1 Planning Developer
Square One Resources
England
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED

Job Title: TM1 Planning Developer
Location: Remote (Inside IR35)
Start Date: April
Job Type: Contract

We’re looking for a TM1/IBM Planning Analytics Developer to support the development and optimisation of enterprise planning solutions within a complex data environment.

You will be responsible for designing and maintaining TM1 models and cubes, developing business rules and processes, and supporting financial and operational planning workflows. The role involves working closely with finance and business stakeholders to deliver scalable, high-performance planning and reporting solutions.

This is a 3 month initial contract, remote and Inside IR35.

Key requirements:

  • Strong experience with IBM TM1/Planning Analytics
  • Development of cubes, dimensions, rules and TurboIntegrator (TI) processes
  • Experience supporting financial planning, forecasting and reporting
  • Performance optimisation and troubleshooting of TM1 models
  • Strong stakeholder engagement skills

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given on the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.

Data Modeller
Robert Walters
Manchester
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

Location: Manchester
Contract: Consultant
Work Setup: Hybrid - 2 days onsite (moving to 3 days in September)

Who We Are
We are a consultancy operating within Robert Walters, the world’s most trusted talent solutions business. Across the globe, we deliver recruitment, outsourcing, and talent advisory services for businesses of all sizes, opening doors for people with diverse skills, ambitions, and backgrounds.

The Role
We have an exciting new opportunity for a Data Modeller to join Robert Walters as a Consultant.

As a consultant, you will benefit from permanent employment with Robert Walters and will be deployed on an assignment within our clients’ organisations. In return, we will provide you with the opportunity to develop your skills with ongoing training and professional support.

This role offers an exciting opportunity to join a global business, providing top-tier service to our blue chip clients.

What you’ll do

  • Design, build and maintain scalable data pipelines and models in Databricks using Python to deliver reliable datasets for reporting and key business metrics.
  • Develop efficient, well-structured code while adhering to technical standards, reconciliation checks, and version control practices using Git and DevOps tools.
  • Partner with visualisation analysts to ensure data models are structured effectively for dashboards, reporting and insight generation.
  • Work within Agile delivery teams to scope work, contribute to sprint planning and deliver outputs within agreed timelines.
  • Engage with stakeholders to clarify requirements, provide progress updates and communicate technical concepts clearly to non-technical audiences.
  • Continuously develop knowledge of insurance data and emerging analytics technologies to improve data solutions and support business decision-making.

What you bring

  • Strong hands-on experience with Databricks, Python and Power BI, with the ability to contribute quickly in an established environment.
  • Background as a Data Modeller, Analytics Engineer, or Data Analyst with strong modelling experience.
  • Experience designing scalable data models and pipelines within cloud-based data platforms.
  • Proficiency with Git version control and development best practices.
  • Strong analytical mindset with the ability to interpret complex datasets and produce actionable insight.
  • Insurance or financial services experience preferred, with understanding of business reporting and operational data.

What’s Next?
If you are ready to take the next step, apply now. Successful applicants will be contacted directly by a recruiter to discuss the role more.

We are committed to creating an inclusive recruitment experience. If you require support or adjustments to the recruitment process, our Adjustment Concierge Service is here to help. Please feel free to contact us at (see below) to discuss how we can support you.

This position is being recruited on behalf of our client through our Outsourcing service line. Resource Solutions Limited, trading as Robert Walters, acts as an employment business and agency, partnering with top organizations to help them find the best talent. We welcome applications from all candidates and are committed to providing equal opportunities.

Senior KDB+ Developer
Korn Ferry
London
Hybrid
Senior
£1,000/day
RECENTLY POSTED

Rate: up to £1000 a day inside IR35

Location: 3 days at London Office

We are working with a leading global financial institution on a senior hire within their real-time market data engineering team. This role is focused on building and operating low-latency, high-performance KDB+ platforms that support mission-critical trading, analytics and monitoring use cases.

What You’ll Be Doing

  • Design, develop and maintain large-scale KDB+/q systems for real-time and historical market data
  • Build and operate tickerplants (TP), real-time processes (RTP), and HDBs, including recovery and log replay
  • Implement performant time-series data models, schemas, and APIs
  • Optimize q code for latency, throughput, and memory efficiency
  • Develop real-time and batch pipelines for tick data ingestion, normalization, and enrichment
  • Work closely with quants and stakeholders to productionise analytics and trading signals
  • Support and troubleshoot production KDB+ systems on Linux, including participation in on-call rotations

What We’re Looking For

  • Extensive hands-on experience with KDB+/q in a production environment
  • Proven experience designing or operating real-time tick data systems
  • Strong knowledge of:
    • Tickerplant architectures and recovery models
    • Time-series joins (e.g. as-of joins)
    • Attributes, iterators/adverbs, and performance internals
  • Experience building low-latency systems where performance matters
  • Strong Linux/Unix skills, including debugging running processes
Applied AI Engineer
Oscar Associates (UK) Limited
London
Hybrid
Mid - Senior
£100,000
RECENTLY POSTED

Applied AI / Automation Engineer

Location: London (Hybrid)

Salary: £70k+ (DOE)

The Role

A London-based financial services firm operating within the hedge fund and asset management space is seeking an Applied AI / Automation Engineer to join its Data Science function.

This role will focus on designing and building an intelligent reporting and automation platform to support fund operations and administration teams. The successful candidate will work closely with both technical and operational stakeholders to deliver measurable efficiency gains and scalable automation solutions.

The overall objective is to modernise reporting production and reconciliation processes through the application of modern data engineering practices, AI-enabled automation, and agent-based workflows.

This is a hands-on engineering role focused on building reliable, production-grade systems within a complex financial environment.

Key Responsibilities

  • Lead the design and development of an end-to-end automated reporting platform.
  • Build intelligent workflows to automate data ingestion, validation, reconciliation, and reporting processes, replacing manual operational workflows with scalable, controlled systems.
  • Design exception-handling and human-in-the-loop mechanisms where full automation is not appropriate.
  • Ensure the platform delivers high levels of accuracy, auditability, and operational resilience.
  • Work closely with product and operations stakeholders to translate operational complexity into structured technical solutions.
  • Develop the system as a maintainable internal product, rather than a collection of standalone scripts or tools.
  • Contribute to the broader evolution of AI-driven operational capabilities across the business.

Experience & Skills

  • Strong automation engineering capability with the ability to apply AI in real operational environments.
  • Strong hands-on experience with Python and SQL (3+ years).
  • Experience designing and implementing data pipelines and automation systems.
  • Experience integrating systems via APIs and structured data feeds.
  • Experience building reliable, monitored, and maintainable production systems.
  • Familiarity with version control (Git) and CI/CD pipelines.
  • Experience implementing testing frameworks to ensure system reliability and accuracy.
  • Strong systems thinking with the ability to design end-to-end technical solutions.
  • Strong communication skills and the ability to work cross-functionally with operations teams.

Desirable

  • Exposure to financial services workflows such as reconciliation, accounting, or operational controls.
  • Experience building systems in regulated or audit-sensitive environments.
  • Familiarity with workflow orchestration tools.
  • Experience turning internal automation into scalable internal products, including dashboards or tools (e.g. TypeScript, React, Next.js or similar).

Opportunity

This is an opportunity to design and build an AI-driven operational platform from the ground up within a sophisticated financial environment.

Through the application of automation, AI agents, and modern data engineering, the role will help shape how AI augments and transforms operational workflows across the organisation.

Oscar Associates (UK) Limited is acting as an Employment Agency in relation to this vacancy.

To understand more about what we do with your data please review our privacy policy in the privacy section of the Oscar website.


Machine Learning Engineer / MLOps Engineer
ARCA Resourcing Ltd
Oxford
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED

Machine Learning Engineer (MLOps)
Location: Oxfordshire, UK

Permanent

HYBRID 3 days per week onsite

ARCA Resourcing is partnering with an innovative, established but scaling technology company in Oxfordshire to recruit a Machine Learning Engineer (MLOps). This role offers the opportunity to work at the forefront of advanced computing and emerging technologies, applying modern machine learning techniques to complex scientific and engineering challenges.

As a Machine Learning Engineer, you will develop advanced ML-driven applications that enhance the performance, stability, and sensitivity of next-generation technologies. Working closely with experimental scientists and hardware engineers, you will translate complex physical system data into actionable improvements through intelligent data modelling and automation.

Key Responsibilities

  • Develop and implement custom machine learning models for signal processing, sensor fusion, and system optimisation
  • Design models for applications such as noise suppression, drift compensation, anomaly detection, and adaptive calibration
  • Build and maintain robust data pipelines to process high-dimensional experimental and time-series datasets
  • Train, validate, and optimise machine learning models using modern Python-based frameworks
  • Integrate ML inference into real-time or near-real-time control environments
  • Stay up to date with the latest developments in machine learning and sensor fusion techniques
  • Interpret and communicate cutting-edge research in both theory and experiment with internal teams
  • Contribute to the wider scientific community through publications, conferences, or technical collaboration
  • Support and mentor colleagues, contributing to a collaborative research and engineering environment

Essential Skills & Experience

  • BSc or MSc in Computer Science, Mathematics, Statistics, Physics, Quantum Technologies, or a closely related discipline
  • 2+ years of industry experience developing and deploying machine learning models
  • Strong programming skills in Python for machine learning and scientific computing
  • Experience developing, training, and optimising machine learning models
  • Familiarity with common ML tools and frameworks such as PyTorch, pandas, and scikit-learn
  • Experience working with large datasets and high-performance computing (HPC) environments
  • Strong analytical and problem-solving skills in complex technical environments
  • Ability to communicate effectively with both technical and non-technical stakeholders
  • Comfortable working in collaborative, cross-functional teams within a fast-paced R&D environment
  • Ability to learn complex topics quickly and translate research into practical solutions

Desirable Skills & Experience

  • PhD in Computer Science, Mathematics, Statistics, Physics, or a related field
  • Experience deploying machine learning models to edge or embedded compute hardware
  • Experience with MLOps workflows, including continuous training, testing, and deployment pipelines
  • Experience with real-time systems, sensor data processing, or advanced signal processing
  • Exposure to quantum technologies, quantum information science, or quantum machine learning

This is a unique opportunity to apply cutting-edge machine learning techniques within a highly technical environment, working alongside experts developing breakthrough technologies.

ARCA Resourcing welcomes applications from engineers passionate about using machine learning to solve complex scientific and engineering problems.

Please apply via the link for immediate consideration!

Frequently asked questions

What skills and qualifications does a Data Engineer need?
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.

What types of Data Engineer jobs are available on Haystack?
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.

How can I improve my chances of landing a Data Engineer role?
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.

Does Haystack list entry-level Data Engineer roles?
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.

Can I find remote or flexible Data Engineer jobs on Haystack?
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.