Remote Data Engineer Jobs
Overview
Find top remote Data Engineer jobs with Haystack, your go-to IT job board for flexible, work-from-anywhere opportunities. Explore the latest openings in data engineering, build scalable data pipelines, and work with cutting-edge technologies—all from the comfort of your home. Start your remote Data Engineer career today!
Data Quality Manager
McCabe & Barton
London
Remote or hybrid
Senior - Leader
£95,000
RECENTLY POSTED

A leading Financial Services organisation undergoing a large-scale data transformation is looking to hire an experienced Data Quality Manager on a permanent basis. The role offers a salary of £95,000 plus a strong benefits package and flexible working.

This role will suit a technically credible Data Quality leader with a genuine passion for data quality, accuracy and trust. You will work closely with data engineers and platform teams to embed pragmatic governance and quality controls into delivery, influencing stakeholders across the business and bringing a commercial mindset.

This is a hands-on technical leadership role, combining data quality and governance ownership with practical engineering input. You will lead a small team and partner with data engineers and operational SMEs to embed best practice across data quality, governance and data management.

Role remit

  • Own and evolve the data governance framework within an engineering-led environment
  • Define governance standards, guardrails, data contracts and SLAs
  • Partner with Risk, Audit, Data Protection and Legal to meet compliance requirements
  • Work with data engineering teams to embed data quality into pipelines and workflows
  • Provide hands-on guidance on data modelling, reconciliation, metadata and best practice

Experience required

  • Strong background in Data Quality, Data Governance and Data Management within a modern data engineering environment
  • Hands-on experience with cloud data platforms, Azure, SQL, Python and orchestration tools
  • Proven experience embedding data quality controls across data pipelines and ETL transformation workflows
  • Good understanding of modern data architectures and quality control patterns
  • Experience with data profiling, lineage analysis, reconciliation and metadata management
  • Strong stakeholder communication skills with the ability to influence engineering teams

If you are an experienced Data Quality Manager with the required background, please respond with an up-to-date CV for review.

Lead Developer - Applied AI Engineering
HM Treasury
UK
Remote or hybrid
Senior
Private salary
RECENTLY POSTED

Do you have a strong background in software development? Would you like to become part of a growing community of data and digital professionals, champion excellence in AI, data and digital, and help grow data and digital skills across HM Treasury? If so, we would love to hear from you!

About the Team

The Chief Secretary to the Treasury (CST) has outlined the government's ambition to rewire the state (see the Institute for Government speech). Central to this vision is more collaboration and transparency between departments and the centre of government on spending, requiring a greater level of sharing and harmonising of key data sets (finance, outcome and performance data). To meet the spending challenges of the future, HM Treasury is committed to developing an integrated data solution which will enable a single version of the truth, providing real-time, standardised data on finance, outcomes and performance. This will allow greater autonomy for departments, more open conversations between departments and HMT, and more effective, data-driven decision-making, ultimately leading to better outcomes for the public.

The Finance and Performance Data Integration Service (FPDIS) is a key part of the government's ambition to rewire the state. The new compact between departments and the centre requires more and better data, and this programme is the means by which the Treasury will get that data. The team is building, and every role will bring vital perspectives and insight to the programme. We are currently developing our approach, business case and early thinking about what the future could look like. You would be joining us at the start of an exciting journey.

About the Job

The key responsibilities of the post holders will be:

Technical Leadership

  • Lead the end-to-end technical design, development, and implementation of AI solutions. This will involve development and maintenance of analytic products in our preferred tech stack (Python, Plotly Dash and Azure) and experimentation with and use of other applications.
  • Provide technical guidance and mentoring to data engineers, analysts and non-technical staff working on the broader FPDIS programme.
  • Document AI architectures, models, and agent behaviours to ensure transparency, governance, and continuous improvement.

Solution Design & Delivery

  • Lead technical delivery of an experimental Agile project to extract finance and performance data from PDFs and other documents.
  • Identify opportunities to apply AI to optimise public spending business processes, improve user experiences and deliver public value.
  • Integrate AI capabilities with enterprise platforms and services, including low-code environments, APIs, data pipelines, and cloud-based data integration platforms.

Technology Evaluation

  • Assess and select appropriate AI models, platforms, and tools (e.g. OpenAI, Copilot Studio).
  • Stay current with emerging AI technologies, particularly developments in agent-based systems, and evaluate their applicability to FPDIS.

Collaboration & Partner Engagement

  • Work closely with partners to translate business needs into AI-enabled solutions, incorporating agent-based architectures where appropriate.
  • Support the training and upskilling of HMT staff in AI literacy, responsible use of intelligent systems, and adoption of AI-enabled tools.

Governance & Compliance

  • Ensure all AI solutions are ethical, secure, and aligned with HMT's strategic objectives, regulatory obligations, and wider DSIT guidance.

About You

  • Technical leadership of applied AI projects
  • Design of AI solutions to business problems
  • Technical application of LLMs in a digital product
  • Working as a team

Some of the benefits our people love!

  • 25 days annual leave (rising to 30 after 5 years), plus 8 public holidays and the King's birthday (unless you have a legacy arrangement as an existing Civil Servant). Additionally, we operate flexitime systems, allowing employees to take up to an additional 2 days off each month.
  • Flexible working patterns (part-time, job-share, condensed hours)
  • Generous parental and adoption leave packages
  • Access to a generous Defined Benefit pension scheme with employer contributions of 28.97%
  • Access to a cycle-to-work salary sacrifice scheme and season ticket advances
  • A range of active staff networks, based around interests (e.g. analysts, music society, sports and social club) and diversity

For more information about the role and how to apply, please follow the apply link. If you need any reasonable adjustments to take part in the selection process, please tell us about this in your online application form, or speak to the recruitment team.

Lead Data Platform Engineer - Databricks - IAC - Terraform - Azure Data Factory - Data Lakehouse
Nexere Consulting Limited
London
Remote or hybrid
Senior
£80,000 - £85,000
RECENTLY POSTED

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities

  • Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
  • Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
  • Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
  • Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
  • Develop and maintain infrastructure-as-code (eg, Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
  • Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
  • Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.
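As a loose illustration of the last responsibility, here is a minimal, stdlib-only Python sketch of deriving operational KPIs from pipeline run records. The pipeline name, field names and figures are invented for the example; in this role the equivalent logic would sit behind Azure-based dashboards and observability tooling, not a script like this.

```python
# Hypothetical pipeline run records, of the kind an observability
# dashboard might aggregate; field names are invented for illustration.
runs = [
    {"pipeline": "ingest_trades", "status": "Succeeded", "duration_s": 412},
    {"pipeline": "ingest_trades", "status": "Succeeded", "duration_s": 389},
    {"pipeline": "ingest_trades", "status": "Failed",    "duration_s": 97},
    {"pipeline": "ingest_trades", "status": "Succeeded", "duration_s": 455},
]

def run_kpis(runs):
    """Summarise success rate and typical duration for a batch of runs."""
    succeeded = [r for r in runs if r["status"] == "Succeeded"]
    durations = sorted(r["duration_s"] for r in succeeded)
    return {
        "runs": len(runs),
        "success_rate": len(succeeded) / len(runs),
        "median_duration_s": durations[len(durations) // 2],
    }

kpis = run_kpis(runs)
```

The same shape of summary (run counts, success rate, duration percentiles) is what typically feeds the "operational KPIs" dashboards the responsibility describes.
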

Knowledge, Skills & Experience

  • Degree in Computer Science, Data Engineering, or a related field.
  • Proven experience designing and building cloud-based data platforms, ideally within Azure.
  • Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
  • Solid understanding of Data Lakehouse architecture and modern data platform design.
  • Proficiency in Python for data engineering, automation, and data processing.
  • Experience developing and integrating REST APIs for data services.
  • Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
  • Experience with Infrastructure as Code tools such as Terraform or ARM templates.
  • Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
  • Familiarity with monitoring, logging, and alerting tools (eg, Azure Monitor).

Desirable

  • Experience with additional Azure services (eg, Fabric, Azure Functions, Logic Apps).
  • Knowledge of cloud cost optimisation for data platforms.
  • Understanding of data governance and regulatory compliance (eg, GDPR).
  • Experience working in regulated or professional services environments.

Strategic Market Data Lead
Experis
London
Remote or hybrid
Senior
£10,000 - £11,000
RECENTLY POSTED

This is a strategic, hands-on, standalone role that blends market data vendor management, technical capability, and business engagement.

I need someone with Financial Services experience, ideally in Wealth Management. Below are the key components of the role.

Market Data Vendor Oversight

  • Identify and catalogue current market data feeds managed by business teams
  • Engage business stakeholders to understand data needs.
  • Assess overlapping vendor feeds
  • Drive cost savings, synergies, and vendor consolidation where possible.
  • Support decommissioning or renegotiation of feeds.

Procurement & Contractual Understanding

  • Work closely with procurement teams.
  • Understand contract obligations and typical market data vendor operating models.

Technical & Data Capability

The role sits in the data department, so the candidate must be technically capable:

  • Understand data architecture and how feeds land in Snowflake.
  • Ability to run SQL queries, investigate data, and compare feeds.
  • Familiarity with concepts like EDP, data lakes, ingestion of PDFs, etc.
  • Should be able to use AI/tools to automate comparisons.
  • Not reliant on data engineers/analysts for basic tasks.
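To illustrate the feed-comparison task above: a hedged, stdlib-only Python sketch of reconciling two overlapping vendor price feeds. The identifiers, prices and tolerance are invented; in this role the equivalent check would be run as SQL against the Snowflake tables the feeds land in.

```python
# Hypothetical extracts from two overlapping vendor price feeds,
# keyed by instrument identifier. In practice these rows would come
# from SQL queries against Snowflake rather than Python dicts.
feed_a = {"GB00B03MLX29": 2458.5, "US0378331005": 189.9, "US5949181045": 402.1}
feed_b = {"GB00B03MLX29": 2458.5, "US0378331005": 190.4}

def compare_feeds(a, b, tolerance=0.01):
    """Report instruments priced by both vendors that disagree beyond a
    tolerance, plus instruments covered by only one vendor."""
    shared = a.keys() & b.keys()
    return {
        "mismatched": {k: (a[k], b[k]) for k in shared if abs(a[k] - b[k]) > tolerance},
        "only_in_a": sorted(a.keys() - b.keys()),
        "only_in_b": sorted(b.keys() - a.keys()),
    }

result = compare_feeds(feed_a, feed_b)
```

The "only in one feed" buckets are what surfaces overlap and consolidation opportunities; the mismatch bucket is the starting point for a data-quality conversation with the vendors.
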

Lead Platform Engineer
Synapri
Not Specified
Remote or hybrid
Senior
£75,000 - £85,000
RECENTLY POSTED

Lead Data Platform Engineer - Databricks - IAC - Terraform - Azure Data Factory - Data Lakehouse

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities

  • Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
  • Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
  • Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
  • Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
  • Develop and maintain infrastructure-as-code (eg, Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
  • Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
  • Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.

Knowledge, Skills & Experience

  • Degree in Computer Science, Data Engineering, or a related field.
  • Proven experience designing and building cloud-based data platforms, ideally within Azure.
  • Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
  • Solid understanding of Data Lakehouse architecture and modern data platform design.
  • Proficiency in Python for data engineering, automation, and data processing.
  • Experience developing and integrating REST APIs for data services.
  • Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
  • Experience with Infrastructure as Code tools such as Terraform or ARM templates.
  • Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
  • Familiarity with monitoring, logging, and alerting tools (eg, Azure Monitor).

Desirable

  • Experience with additional Azure services (eg, Fabric, Azure Functions, Logic Apps).
  • Knowledge of cloud cost optimisation for data platforms.
  • Understanding of data governance and regulatory compliance (eg, GDPR).
  • Experience working in regulated or professional services environments.

Data Analysis Starter Course (Cardiff)
ITonlinelearning Recruitment
Cardiff
Fully remote
Graduate - Junior
£28,000 - £38,000

Data Analyst Course Programme – Job Guarantee Included

Complete online training designed to take you from zero experience to your first data analyst role. Study part-time, build fundamental skills, and get dedicated job placement support until you’re hired. Flexible financing options available, with payment plans starting from as low as £142 per month.
The Programme
Complete this 10-week online training with just 10-15 hours per week of study time. You’ll learn industry-standard tools, including Excel, SQL, Python, and Power BI, while building a professional portfolio with workplace projects. The programme includes earning BCS and CompTIA certifications recognised by UK employers, expert tutor support throughout your studies, and dedicated job placement support with CV help, interview preparation, and direct employer introductions.
The Outcome
93% of graduates secure data analyst roles within 3 months.
Starting salaries: £28,000 – £38,000
Who This Is For
The programme is completely beginner friendly, so no experience needed. Career changers are welcome, and you can study at your own pace.
*This programme is available to UK-based learners only.
Ready to start earning in data? Limited spaces available. Apply now for the next available cohort.

Junior Data Analyst
Newto Training
Multiple locations
Remote or hybrid
Junior
Private salary

Ready to start your career as a Data Analyst?

The demand for skilled data professionals in the UK is booming - and organisations are searching for people who can turn raw data into meaningful insight. If you’re looking for a career with purpose and strong growth, our Data Analyst Career Programme is built for you, with a job guarantee on completion.

Why this programme matters

We focus on equipping you with both the tools and the real-world experience you need to hit the ground running. With industry-recognised certifications, live instruction and project work you’ll be ready for business challenges from day one.

What you’ll get:

  • Seven training modules, covering Excel, SQL, Python, Tableau, Power BI and more.
  • Three official certifications: Microsoft Azure Data Fundamentals, Microsoft Power BI Data Analyst Associate and Microsoft Azure AI Fundamentals.
  • Real-world project work to enhance your CV and show our end employers you can deliver.
  • Job guarantee: If you complete the programme and don’t receive a job offer, we’ll refund 100% of your course fee.

Your investment:

  • Course cost: £2,795
  • Payment plan: From £232.91 per month (interest-free)

No prior tech-job experience? No problem.

You don’t need to come from a data background. If you bring curiosity, communication skills, and a willingness to learn, this programme will equip you for a transition into a demanding and rewarding role.

Take the next step now.

Click ‘Apply Now’ and embark on a career where data drives decisions, and you drive your future.

Data Engineer
Salutem LD Bidco Ltd/ Salutem LD Bidco II Ltd/ Sal
Windsor
Remote or hybrid
Mid
£40,000

About The Company
We are a fast-growing, acquisitive group built from diverse businesses with unique backgrounds and processes. Our mission is to harness the power of data to deliver actionable insights that improve care and performance.

About The Role
As a Data Engineer, you will play a key role in developing and maintaining our analytics platform, enabling data-driven decision-making across the organisation. You’ll collaborate with stakeholders to streamline workflows, automate processes, and ensure the accuracy and reliability of data. This is an exciting opportunity for someone who thrives in a dynamic environment and is passionate about leveraging data to create business value.
Key Responsibilities

  • Maintain and optimise our Azure environment for performance, security, and scalability.
  • Design and build scalable data pipelines using Spark, PySpark, and SQL within a Databricks environment.
  • Structure data in Delta Lake format to support medallion architecture layers.
  • Develop and refine Power BI dashboards, reports, and apps that are insightful and user-friendly.
  • Automate reporting processes using Power Platform, Power Apps, and SharePoint.
  • Ensure robust data governance, security, and compliance with regulatory standards.
  • Provide training and documentation to empower colleagues to access and interpret data independently.
  • Troubleshoot and resolve issues related to Power Query, SQL, Power BI, and other systems.
  • Collaborate with operational teams to generate actionable insights and influence business activities.

About You
We’re looking for someone with:

  • Strong experience in data modelling, ETL processes, and business intelligence tools.
  • Proficiency in Python or SQL for data analysis and processing.
  • Familiarity with cloud platforms such as Azure, AWS, or Google Cloud.
  • Knowledge of data governance, security, and compliance standards (e.g., GDPR).
  • Excellent communication skills for both technical and non-technical audiences.
  • A proactive, solutions-focused mindset with the ability to work under pressure and manage competing priorities.

Desirable:

  • Experience with Databricks, Delta Lake, and medallion architecture.
  • Knowledge of CI/CD practices and Agile methodologies.
  • Understanding of Microsoft Fabric services.

Our Core Values:
➡️Supportive: Helping everyone reach their full potential. 
➡️Ambitious: Striving for the best outcomes. 
➡️Loyal: Prioritising our staff and the people we support.
➡️Unique: Innovating without compromising quality.
➡️Transparent: Fostering openness and mutual respect.
➡️Engaging: Partnering with everyone involved.
➡️Meaningful: Offering fulfilling opportunities. 
Why Choose Us? 
✅Emotional Support: 24/7 Employee Assistance, mental health resources, meditation apps, and bereavement support. 
✅ Medical Support: Free Online GP access, Health Cash Plan, Cancer Cover, and Menopause support. 
✅ Financial Support: Flexible pay with Wagestream, utility bill savings, Money Helper, and Life Assurance. 
✅ Physical Support: Online workouts, Cycle to Work scheme, gym discounts, and National Trust activities. 
Still not convinced? 
We have been recognised as a Top Employer 2025 in the United Kingdom. 
We have been named as a Top 50 Inspiring Workplace in the UK & Ireland.
We are a Disability Confident Committed company.
Salutem Care and Education is dedicated to protecting and promoting the well-being of children, young adults, and vulnerable individuals. As part of our safer recruitment process, the successful candidate will be required to complete thorough pre-employment checks, including an enhanced DBS and, where applicable, overseas criminal record checks.

Machine Learning Engineer - £110k – £130k – Geospatial Tech 4 Good
Opus Recruitment Solutions
London
Fully remote
Mid - Senior
£110,000 - £130,000

Machine Learning | Deep Learning | Time Series | Climate | Remote Sensing | PyTorch | scikit‑learn | Geospatial | AWS | MLOps | Python | Risk Modelling | FinTech

Do you want to work with a business building AI‑native data systems that bring clarity and credibility to nature‑based assets? A business tackling complex, real‑world environmental challenges, helping organisations make high‑impact decisions around risk, resilience and commercial performance? This is the chance to join as a Machine Learning Engineer with a climate‑tech scale‑up applying cutting‑edge Machine Learning to satellite data, weather models and environmental signals, reshaping how nature is valued in real‑world decision‑making. Joining their AI team, you’ll design and deploy models that forecast climate volatility, detect vegetation stress, and generate risk‑driven insights from remote sensing and time‑series data. You’ll work across AI, climate science, geospatial modelling and scalable pipelines, contributing meaningfully from day one.

What you’ll be working on:

  • Building and evaluating Machine Learning/DL models for satellite, weather and climate data
  • Forecasting environmental and risk‑related signals (volatility, vegetation stress, land‑surface change)
  • Developing geospatial and remote‑sensing models (Sentinel‑1/2, GEDI, optical, radar, LiDAR)
  • Creating time‑series and forecasting models for environmental change
  • Translating business questions into robust modelling problems
  • Turning research prototypes into scalable, reproducible AI pipelines
  • Communicating assumptions, uncertainty and results clearly

The must‑haves:

  • Strong background in Machine Learning, DL and Applied Statistics
  • Time‑series modelling and backtesting
  • Experience with geospatial and climate datasets
  • Python stack: PyTorch, scikit‑learn, SciPy
  • Reproducible workflows (Git, AWS/cloud, W&B)

Nice‑to‑haves:

  • Risk modelling, financial time series, portfolio optimisation (great for FinTech/quant backgrounds)
  • Climate/weather datasets (CMIP, forecast data)
  • Geospatial tools: rasterio, xarray, geopandas, GDAL
  • Remote sensing (optical, radar, LiDAR)
  • MLOps: CI/CD, containerisation, monitoring
  • Startup or fast‑paced product environment

The role offers £110k–£130k, a global team environment, and the chance to shape the future of AI‑powered environmental and risk intelligence. If it ticks those boxes, don’t hang about; message me: (url removed)
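For readers unfamiliar with the "time-series modelling + backtesting" requirement, here is a minimal, stdlib-only sketch of a walk-forward backtest using a naive persistence forecaster. The series and warm-up length are invented for illustration; in practice the PyTorch or scikit-learn model would slot in where `forecast` is defined.

```python
# Walk-forward backtest: at each step, fit/predict using only data seen
# so far, then score the prediction against the next actual observation.
# The "model" here is a persistence baseline (tomorrow looks like today).
series = [10.0, 10.5, 10.2, 11.0, 10.8, 11.4, 11.1]  # invented toy series

def forecast(history):
    """Persistence baseline: predict the last observed value."""
    return history[-1]

def walk_forward_mae(series, warmup=3):
    """Mean absolute error over an expanding-window walk-forward split."""
    errors = []
    for t in range(warmup, len(series)):
        pred = forecast(series[:t])           # only past data is visible
        errors.append(abs(pred - series[t]))  # score on the held-out point
    return sum(errors) / len(errors)

mae = walk_forward_mae(series)
```

The key property being illustrated: the model never sees future observations when predicting, which is what makes the resulting error estimate honest for time-series data.
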

Data Engineer (Azure)
Hays Technology
London
Remote or hybrid
Mid - Senior
£450/day - £550/day

Your new company

Working for a renowned financial services organisation.

Your new role

Seeking a Data Engineer to help design and maintain scalable batch and near‑real‑time ingestion pipelines, modernising legacy ETL/ELT processes into Azure and Snowflake, and implementing best‑practice patterns such as CDC, incremental loading, schema evolution, and automated ingestion frameworks. They build cloud‑native solutions using Azure Data Factory/Synapse, Databricks/Spark, ADLS Gen2, and Snowflake capabilities including stages, file formats, COPY INTO, and Streams/Tasks to support raw‑to‑curated data modelling. The role involves creating reusable components and Python libraries to accelerate delivery across teams, enforcing data quality through validation, observability, and robust pipeline design, and ensuring strong security, governance, and documentation standards. Collaboration within agile workflows (including CI/CD, code reviews, and iterative planning) is also key to delivering consistent, reliable, and secure data solutions.

What you'll need to succeed

  • Strong hands-on data engineering experience, with a strong focus on data ingestion
  • Experience building production pipelines using Azure Data Factory, Databricks, Synapse
  • Solid SQL skills and experience working with modern cloud data warehouses, ideally Snowflake
  • Proficiency in Python for data processing, automation, and pipeline utilities
  • Good understanding of data lake/lakehouse concepts and ingestion patterns
  • Infrastructure-as-Code exposure (Terraform) and CI/CD (Azure DevOps)
  • Able to prototype quickly while adhering to Group standards and controls
  • Communicates clearly with business stakeholders and technical teams
  • Familiarity with orchestration frameworks (Dagster) - desirable
  • Energy commodity trading experience is a real advantage

What you'll get in return

Flexible working options available.
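The incremental-loading pattern mentioned above can be sketched in a few lines. This is a hedged, stdlib-only illustration of high-watermark ingestion (the rows and timestamps are invented); in the stack this role describes, the watermark would live in a control table and the extract would be a filtered query or COPY INTO in Azure Data Factory/Snowflake, not a Python list scan.

```python
# High-watermark incremental load: each run pulls only rows whose
# modified timestamp is later than the watermark recorded by the
# previous run, then advances the watermark. ISO-8601 strings are used
# because they compare lexicographically in chronological order.
source_rows = [  # invented source extract
    {"id": 1, "modified": "2024-01-01T09:00:00"},
    {"id": 2, "modified": "2024-01-02T12:30:00"},
    {"id": 3, "modified": "2024-01-03T08:15:00"},
]

def incremental_load(rows, watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

batch, wm = incremental_load(source_rows, "2024-01-01T23:59:59")
```

A second run against an unchanged source returns an empty batch and leaves the watermark where it is, which is the property that makes the pattern safe to schedule repeatedly.
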
What you need to do now If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)

Senior Data Engineer
Hays Technology
Abingdon
Remote or hybrid
Senior
£60,000 - £67,500

Your new company

An established and fast‑growing technology organisation is on a mission to transform digital connectivity across the UK. With a focus on building and operating high‑speed fibre networks, the business is committed to delivering world‑class broadband services to communities and supporting a data‑driven future. You'll be joining a forward‑thinking environment that values innovation, collaboration, and continuous improvement.

Your new role

As a Senior Data Engineer, you will play a pivotal role in shaping and enhancing the organisation's enterprise data platform. Working within a specialist Data Analytics & AI team, you'll be responsible for designing, building, and maintaining scalable data pipelines and models within Snowflake to support analytics, reporting, and data‑led decision‑making across the business.

You will translate data architecture strategies into high‑quality technical solutions, optimise performance and cost, and ensure the data platform is reliable, secure, and well‑structured. This includes developing ELT/ETL pipelines using tools such as dbt and Argo Workflows, implementing data quality and governance practices, and leveraging advanced Snowflake features to drive automation and efficiency.

Collaboration is key: you'll work closely with analysts, data consumers, and business stakeholders, enabling them through well‑designed data models and providing technical support where needed. You'll also contribute to monitoring, CI/CD processes, and ongoing improvements to engineering standards across the team.

What you'll need to succeed

  • Proven experience delivering cloud‑based data engineering solutions, ideally centred around Snowflake
  • Strong skills in SQL, Python, and dbt for data modelling and transformation
  • Experience with Snowflake RBAC and performance optimisation
  • Familiarity with ingestion/replication tools such as Airbyte, Fivetran, Hevo, or similar
  • Understanding of cloud technologies (AWS preferred)
  • Knowledge of data modelling, governance principles, and best‑practice engineering standards
  • Experience supporting BI/reporting tools such as Power BI
  • Solid grounding in version‑controlled development and CI/CD practices (git)

Desirable:

  • Exposure to enterprise systems like Salesforce, BSS/OSS, telephony, or call‑centre data
  • Experience in data platform migrations, data validation, and quality assurance
  • Background in enabling business teams through training, documentation, or adoption support
  • Familiarity with Terraform or Infrastructure‑as‑Code
  • A mindset for continuous learning and staying up to date with modern data stack tooling

What you need to do now

If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)

Geospatial Software Engineer
ISR RECRUITMENT LIMITED
Manchester
Remote or hybrid
Mid - Senior
£60,000 - £90,000

The Opportunity:

You'll join an experienced, collaborative consultancy team delivering greenfield, enterprise-scale digital services for high-profile public and private sector clients.

This opportunity is ideal for a practical, adaptable Geospatial Full Stack Engineer who enjoys working across disciplines and solving complex problems and challenges that will have a real-world impact.

Collaboration sits at the heart of how our client operates, so you'll be partnering closely with colleagues across Software Engineering, User-Centred Design, Delivery Management, Data Science and Live Services to deliver outcomes that genuinely make a difference in today's society.

As a consultancy, they are technology-agnostic by design, focusing on choosing the right tools for each problem, rather than forcing one stack everywhere.

Their teams regularly work with .NET, Java, Python, Node.js, AWS and Azure, giving you genuine scope to broaden your skills and develop your career across a range of languages and platforms.

Many of their projects also involve Geographic Information Systems (GIS) and open-source geospatial technologies, helping clients unlock the value of location-based data through mapping, spatial analysis and data-driven decision making.

Skills and Experience:

Essential

  • 3+ years' experience in a Full Stack Engineering role
  • Strong development skills in .NET, Java or Python, alongside modern JavaScript frameworks/libraries
  • Experience working in Agile environments (Scrum, Kanban, TDD)
  • Solid understanding of architectural and design patterns, including microservices and serverless
  • Hands-on experience designing and delivering solutions on AWS or Azure
  • Experience working with GIS systems or geospatial data, and familiarity with tools such as Leaflet, OpenLayers, QGIS, GeoServer, PostGIS, etc.
  • A collaborative mindset and experience working in multi-disciplinary teams

Desirable

  • Experience working in a consultancy environment
  • Exposure to public sector projects
  • Familiarity with CI/CD tooling (e.g. Jenkins, Terraform)
  • Awareness of the Digital Service Standard and Technology Code of Practice, particularly in geospatial or public sector contexts

Role and Responsibilities:

This is a varied role suited to someone who enjoys the pace, responsibility and collaboration of consultancy.

You will be involved with the following types of activity:

  • Design and deliver high-quality solutions: building, enhancing and maintaining software, infrastructure and deployment pipelines that are robust, secure and scalable. Projects may include solutions involving geospatial data, GIS platforms and open-source mapping tools.
  • Work collaboratively across disciplines: partnering with Senior and Lead Engineers, Delivery Managers, Designers and Data Scientists to shape solutions, contribute to technical documentation and deliver against agreed plans.
  • Apply standards and best practice: follow established engineering approaches, contribute accurate technical estimates and proactively identify and escalate risks or issues.
  • Communicate clearly and build relationships: present ideas, prototypes and progress updates to stakeholders, while building strong working relationships with colleagues, clients and partner organisations.

Applications:

Please contact Edward here at ISR to learn more about our client and how they are leading the way in developing the next generation of technical solutions through innovation and transformational technology.

Junior Analytics Engineer
Hillarys HR
Nottingham
Remote or hybrid
Junior
Private salary

We’re looking for an experienced Analytics Engineer to help shape our data foundations and deliver high quality, trusted insights across the business. You’ll design and maintain scalable analytics models, ensure data quality, and work closely with teams across Finance, Commercial, Operations, and IT to turn complex requirements into intuitive, reliable datasets.

This role sits at the heart of our analytics function, balancing immediate reporting needs with long-term data architecture, and ensuring our organisation can make confident, data driven decisions.

Key Responsibilities

  • Design, build, and maintain analytics models that convert raw data into trusted, business-ready datasets.
  • Own the full modelling lifecycle (staging → core → marts).
  • Define and maintain consistent metrics, dimensions, and business logic.
  • Implement testing, data quality checks, and SLAs to ensure reliability.
  • Translate business requirements into scalable, well-structured data solutions.
  • Deliver self-service datasets and support reporting/dashboards where needed.
  • Continuously improve analytics engineering processes and standards.
  • Provide robust data foundations to support strategic initiatives, including cost, pricing, and operational analytics.
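
The staging → core → marts lifecycle named above is typically built as layered dbt/Dataform SQL models; the idea can be sketched in plain Python as a hedged illustration (all data and function names here are hypothetical, not the team's actual project):

```python
# Hypothetical sketch of a staging -> core -> marts flow in plain Python.
# In a real project each layer would be a dbt/Dataform SQL model.

raw_orders = [
    {"order_id": "A1 ", "amount": "120.50", "region": "uk"},
    {"order_id": "A2",  "amount": "80.00",  "region": "UK"},
    {"order_id": "A3",  "amount": "200.00", "region": "de"},
]

def staging(rows):
    """Staging: clean types and trim strings, one output row per source row."""
    return [
        {"order_id": r["order_id"].strip(),
         "amount": float(r["amount"]),
         "region": r["region"].strip().upper()}
        for r in rows
    ]

def core(rows):
    """Core: apply shared business logic once, e.g. a consistent flag."""
    return [{**r, "is_domestic": r["region"] == "UK"} for r in rows]

def mart_revenue_by_region(rows):
    """Mart: aggregate into a business-ready dataset for BI tools."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

mart = mart_revenue_by_region(core(staging(raw_orders)))
print(mart)  # {'UK': 200.5, 'DE': 200.0}
```

The value of the layering is that cleaning happens exactly once (staging), business logic is defined exactly once (core), and each mart is a thin aggregation that BI tools can trust.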

What We Are Looking For

We’re seeking someone with strong analytical and technical capability, who enjoys solving complex data problems and can confidently partner with stakeholders across the business. You’ll bring the ability to design robust data models, communicate clearly, and ensure outputs are accurate, consistent, and actionable.

  • A degree in Computer Science, Engineering, Maths, Economics, or Data Science is desirable.
  • Advanced SQL skills and experience with analytical/dimensional modelling.
  • Practical experience with modern ELT tools (Dataform/dbt) and Git workflows.
  • Knowledge of cloud data warehouses, ideally BigQuery.
  • Familiarity with orchestration tools such as Airflow.
  • Experience designing BI-friendly datasets, particularly for Power BI.
  • Strong communication skills, able to translate business needs into data models.
  • Experience with enterprise systems such as SAP BW/HANA.
  • A track record of building reusable datasets or maintaining a metrics/semantic layer.
  • Basic Python for automation or analytics engineering tooling.
  • Experience collaborating in cross-functional analytics or data product teams.

Why Join Us

  • Help shape the future of analytics at a global, design-led market leader.
  • Work collaboratively across Finance, Commercial, Operations, IT, and more.
  • Have real influence on how data is defined, trusted, and used across the organisation.
  • Join a supportive, innovative culture known for empowering teams and encouraging growth.
  • Develop your skills with modern cloud, modelling, and automation technologies.

We understand there’s no one size fits all approach. We’re proud to offer an inclusive workplace where every colleague feels valued, supported, and empowered to be their true self. If you require any reasonable adjustments throughout the recruitment process, please let us know and we’ll be happy to accommodate.

Everyone who applies will receive a response.

BI / Datawarehouse Lead / Manager Home / Prestigious Client
Integrity Recruitment Solutions Ltd
Birmingham
Remote or hybrid
Senior
Private salary

Prestigious in their sector, my client has an excellent market image and continues to make a significant impact. They are heavily involved in an ambitious business and systems transformation, and we have an excellent opportunity for an established Data Warehouse Lead to join their team.

The successful candidate will play a lead role in the technical development and management of their global Data Warehouse and its supporting team, and will have a proven background in full-cycle development of a corporate, Microsoft-based Data Warehouse.

Desired skills: Design, Build, Test, Azure, Fabric Lakehouse

We are looking to recruit a high-calibre candidate, so the client is happy to consider applicants from both contract and permanent backgrounds. Please forward your most recent CV to be considered for telephone screening.

SQL / BI / ETL / DATAWAREHOUSE / AZURE / FABRIC / SPARK / PYSPARK / LEAD / MANAGER / HOME / REMOTE / MIDLANDS / BIRMINGHAM

AI, Automation and Integration Engineer
SALVATION ARMY HOUSING ASSOCIATION
Bolton
Remote or hybrid
Mid - Senior
£55,000

About The Role
This is a brand-new role with a big remit and even bigger opportunity.
As our AI, Automation and Integration Engineer, you'll be at the forefront of how Salvation Army Homes uses technology to work smarter, faster and more effectively. This isn't about maintaining the status quo - it's about designing the future.
You'll lead the charge on automation, system integration and the responsible adoption of AI, helping us move towards streamlined processes, joined-up data, and genuinely intelligent digital services. From low-code automation and cloud integration to exploring practical AI use cases, you'll have the space and backing to experiment, innovate and deliver real impact.
Working within our Digital, Data and ICT team, you'll collaborate closely with colleagues across the organisation to turn ideas into working solutions. You'll help create a single version of the truth across our systems, reduce duplication and manual effort, and enable better decision-making through clean, connected data.
Because this role is new, you'll play a key part in shaping how it operates - setting standards, defining approaches, and influencing how we use emerging technologies across the organisation. If you're excited by greenfield work, modern platforms and meaningful outcomes, this is a rare chance to make a role your own.

About The Candidate
You're a hands-on technologist with a strong track record of delivering automation, integration and modern digital solutions in real-world environments - and you're ready to step into a role where you can shape both the technology and the approach.
You'll bring proven experience of designing and delivering process automation, ideally using the Microsoft Power Platform (Power Automate, Power Apps and Power BI), alongside experience building and supporting integrations between business-critical systems. You're comfortable working across data, workflows and APIs to reduce manual effort and create seamless, joined-up services.
You'll have a strong technical foundation, including:

  • Experience working with cloud platforms, particularly Microsoft Azure
  • Solid SQL Server and database skills
  • Experience developing solutions using modern development languages and tools
  • A good understanding of data integration, data quality and governance principles

You don't just build solutions - you think about how they're used, governed and scaled. You understand the importance of security, compliance and responsible data use, and you can balance innovation with control. Experience working in complex or multi-system environments is important, as is the ability to document, standardise and improve what you deliver.
You're also excited by what's next. You may already have experience applying AI concepts in a business context, or you may be keen to develop this further - but either way, you're motivated to explore how AI and emerging technologies can be applied practically, ethically and at scale to improve services and decision-making.
Just as importantly, you're a strong collaborator and communicator. You can translate complex technical ideas into plain English, influence stakeholders at all levels, and work closely with analysts, data specialists and business teams to turn ideas into delivered outcomes. You're organised, proactive, and comfortable managing multiple priorities in a fast-moving, evolving environment.
Above all, you're motivated by the opportunity to build something new, take ownership of a greenfield role, and play a leading part in an organisation's journey towards automation, integration and AI-enabled services - while staying aligned with strong values and a clear social purpose.
The benefits on offer
In return for helping to transform lives, well give you access to some great benefits. These include:

  • 26 days annual leave rising to 31 days
  • An extra day off on your birthday
  • A High Street discount scheme (great savings both on and off-line)
  • Pension with life assurance
  • Discounted private medical insurance
  • Loans available for financial emergencies
  • Occupational Sick Pay
  • A full Induction package and training relevant to the role
  • Support to learn and develop your career

About The Company
A registered social landlord and one of the leading providers of supported housing in the UK, Salvation Army Homes is dedicated to transforming lives by providing accommodation and support for some of the most vulnerable members of society - mainly people with complex needs and/or experiencing homelessness.
Our aim is to work with individuals to build on their strengths, creating person-centred, individualised strategies and plans that transform lives, support recovery and enable positive behaviour. In order to succeed, however, we need the right people in place. Our workforce is one of our greatest assets, but only by recruiting the very best can we continue to deliver comprehensive, good quality housing, support and resettlement services to our residents. That's where you come in.
As an equal opportunities employer, Salvation Army Homes is committed to the equal treatment of all current and prospective employees and does not condone discrimination on the basis of age, disability, sex, sexual orientation, pregnancy and maternity, race or ethnicity, religion or belief, gender identity, or marriage and civil partnership. We invite and welcome applications to apply for Salvation Army Homes opportunities without concern of bias or discrimination.
We reserve the right to close this vacancy early if we receive sufficient applications for the role. Therefore, if you are interested, please submit your application as early as possible.

Power BI Developer
Vivo Talent
Not Specified
Fully remote
Senior - Leader
£70,000

Lead Power BI Developer (12-month Fixed Term Contract)

Location: Remote

Overview

We are seeking a Lead Power BI Developer to deliver high-quality, scalable reporting solutions within a project-focused environment. This role combines hands-on development, leadership, and strong stakeholder engagement, including interaction with senior leadership and C-suite.

A key focus of this role is to establish and embed Power BI best practices across the organisation, coaching and enabling existing staff to improve capability, consistency, and long-term sustainability of BI solutions.

Key Responsibilities

  • Lead the design and delivery of Power BI dashboards, reports, and data models
  • Translate business requirements into scalable, user-friendly BI solutions
  • Drive best practice in data modelling, DAX optimisation, and report design
  • Own Power BI Service (workspaces, apps, deployment, governance)
  • Implement and manage access control (RLS, Azure AD security)
  • Establish governance, standards, and lifecycle management processes
  • Collaborate with data engineers on pipelines and data model design
  • Lead stakeholder engagement, including presenting to senior leadership and C-suite
  • Support testing, deployment, documentation, and user adoption
  • Mentor team members and provide technical leadership
  • Embed Power BI best practices across the organisation through coaching and enablement
Essential Skills & Experience

  • Proven experience delivering end-to-end Power BI solutions in a lead capacity
  • Strong expertise in Power BI Service, governance, and deployment
  • Deep understanding of access control (RLS, Azure AD groups)
  • Advanced DAX and strong SQL skills
  • Strong data modelling experience (star schema, performance optimisation)
  • Experience with Azure-based platforms and Databricks
  • Strong stakeholder management, including C-suite engagement
  • Excellent communication skills across technical and non-technical audiences
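
Row-level security (RLS), which this role would own, is defined in Power BI as a DAX filter per role; the underlying mechanism can be sketched in Python as a hedged illustration (role and table names here are hypothetical):

```python
# Illustrative sketch of row-level security: each role carries a filter
# predicate, and a user only sees rows that at least one of their roles admits.
# In Power BI each predicate would be a DAX expression such as [region] = "North".

sales = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "North", "amount": 75},
]

# Role -> row filter, analogous to a per-role DAX expression.
roles = {
    "north_managers": lambda row: row["region"] == "North",
    "all_sales":      lambda row: True,
}

def visible_rows(table, user_roles):
    """Return rows visible to a user: the union of their roles' filters."""
    return [row for row in table
            if any(roles[r](row) for r in user_roles)]

print(visible_rows(sales, ["north_managers"]))  # only the two North rows
```

In Power BI the filter is evaluated inside the engine for every query, so a user in `north_managers` can never see South rows no matter how they slice the report.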
Desirable Experience

  • Databricks experience
  • Data warehousing and architecture design exposure
  • Experience building large-scale enterprise semantic models
  • Microsoft Dynamics Business Central or Navision
  • DevOps practices
If you are interested in the role, please apply or reach out.

Database Administrator
Gigaclear
Abingdon
Remote or hybrid
Mid - Senior
£55,000 - £60,000

Are you looking for a role you can make your own, with the autonomy to decide what you do and how you do it?

As we have grown, we have accumulated a diverse database infrastructure, including PostgreSQL, MariaDB, InfluxDB and MongoDB systems.

The time is ripe for an experienced Administrator to take ownership of, mature, upgrade and manage our database servers, while supporting development teams and business operations.

Key Accountability & Responsibilities

Work with teams across the Technology department to install, configure, maintain, and upgrade our database servers across development, testing, and production environments.

Monitor database health and perform routine maintenance tasks including index optimisation, table maintenance, and schema modifications.

Work with Development and Data engineering teams to optimise performance & cost of data pipelines

Manage database capacity planning and storage allocation to ensure adequate resources for current and future needs.

Proactive management of databases, ensuring application and operational performance needs are met.

Implement and maintain high availability solutions including replication, clustering, and failover configurations.

Document database architectures, configurations, procedures and policies.

Implement appropriate security controls to ensure databases and data are protected.

Ensure database backup, validation and disaster recovery capabilities are in place and rehearsed.

Provide insight and recommendation on the adoption and consolidation of database related technologies.

Be part of the 24x7 on-call rota to provide out-of-hours support for our critical systems.

Knowledge & Skills

Proven expertise managing PostgreSQL and MariaDB/MySQL databases

Experience with NoSQL databases.

Experience with cloud database services (AWS).

Deep understanding of relational database concepts, normalisation, and SQL optimisation.

Proficiency in SQL and query optimisation across multiple database platforms.

Experience with database replication, clustering, and high availability configurations.

Familiarity with backup and recovery tools specific to each database platform.

Understanding of database security principles and access control mechanisms.

Experience with monitoring tools and performance analysis techniques.

Knowledge of version control systems (Git) for managing database code and scripts.

Experience with database automation, CI/CD pipelines and tooling.

Gigaclear is a growing Fibre Broadband (FTTP / FTTH) company, developing our fibre-to-the-premises broadband infrastructure to some of the most difficult to reach areas of the UK, empowering those communities with broadband to rival any city.

Staff rewards, benefits and opportunities

We foster a collaborative, engaging culture that empowers staff to grow and maximise their skills. We want to challenge our people in a fair environment where hard work is rewarded and a path for progression is open to all.

  • Generous employer pension; up to 8% matched contribution
  • Income protection & life assurance
  • 25 days holiday (plus bank holidays), holiday purchase scheme and Yay Days!
  • Health cash plan, 24/7 remote GP access and Employee Assistance Programme including counselling & legal advice
  • Unlimited access to online training and development content via our Learning Management System
  • Long service benefits and monthly employee recognition
  • Enhanced maternity and paternity provisions
  • Flexible working environment
  • Health & Wellbeing initiatives and company funded social events

Our approach is to work guided by our mission, vision and values.

Our Mission - Empowering communities with brilliant broadband

Our Vision - Connected Communities

Our Values - Own it, Find the Right Way, Work Together, Win Together

Data Engineer - Data Vault
Randstad Technologies Recruitment
Not Specified
Remote or hybrid
Senior - Leader
£60/hour - £80/hour

Senior Data Engineer (Data Modernization)

We are looking for a proactive, goal-oriented Senior Data Engineer to build and scale high-performance big data pipelines. This role is ideal for a collaborative problem-solver who excels in distributed environments and values technical excellence through CI/CD and Agile.

The Essentials

  • Modern Modeling: Expert skills in Data Vault (mandatory) and Dimensional modeling.
  • Cloud & DevOps: Strong AWS experience with a focus on CI/CD and writing secure code.
  • Data Governance: Proven ability to embed quality, lineage, and monitoring into every pipeline.
  • Leadership: A passion for guiding and coaching fellow team members both technically and procedurally.
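
Data Vault modelling, named as mandatory above, separates business keys (hubs), relationships between them (links), and descriptive history (satellites). A minimal sketch of that structure, with hypothetical table and column names:

```python
# Hedged sketch of core Data Vault structures: hub, link, satellite.
# Entity and column names are illustrative, not a specific client schema.
import hashlib
from datetime import datetime, timezone

def hash_key(*parts):
    """Deterministic hash key from business key parts (a common DV pattern)."""
    return hashlib.md5("||".join(parts).encode()).hexdigest()

load_ts = datetime(2024, 1, 1, tzinfo=timezone.utc)

# Hub: one row per unique business key, nothing else.
hub_customer = {"hk": hash_key("CUST-42"), "customer_id": "CUST-42", "load_ts": load_ts}
hub_order    = {"hk": hash_key("ORD-7"),  "order_id": "ORD-7",      "load_ts": load_ts}

# Link: a relationship between hubs, keyed by the hash of both business keys.
link_customer_order = {
    "hk": hash_key("CUST-42", "ORD-7"),
    "customer_hk": hub_customer["hk"],
    "order_hk": hub_order["hk"],
    "load_ts": load_ts,
}

# Satellite: descriptive attributes tracked over time against the hub;
# a changed attribute becomes a new row rather than an update.
sat_customer = [
    {"hk": hub_customer["hk"], "name": "Acme Ltd", "load_ts": load_ts},
]
```

Because hubs and links are insert-only and keys are deterministic hashes, new sources can be loaded in parallel without coordinating updates, which is why the pattern suits the large-scale pipelines this role describes.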

The Mission

You will lead data modernization efforts, ensuring large-scale systems remain compliant and secure while maintaining a “right tool for the job” mindset. You’ll bridge the gap between complex data engineering and actionable analytics/ML concepts.

Apply

If you have a passion for building well-governed, scalable systems and want to lead a high-impact team, please apply with your CV

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

AI Engineer
Michael Page
Not Specified
Remote or hybrid
Mid - Senior
Private salary

We are seeking an experienced AI Engineer to develop and implement innovative AI solutions within the industrial and manufacturing sector. The role focuses on leveraging data analytics to optimise processes and drive efficiency.

Client Details

A well-known manufacturing company with a global presence.

Description

  • Design and develop AI models to optimise manufacturing processes and workflows.
  • Collaborate with cross-functional teams to identify and solve key operational challenges using analytics.
  • Implement machine learning algorithms and predictive analytics to improve process efficiency.
  • Analyse large datasets to extract actionable insights and present findings to stakeholders.
  • Monitor and maintain AI models to ensure optimal performance and accuracy over time.
  • Contribute to the development of data pipelines and infrastructure to support AI initiatives.
  • Ensure compliance with industry standards and best practices in AI and data analytics.
  • Document processes, methodologies, and project outcomes for internal knowledge sharing.

Profile

A successful AI Engineer should have:

  • Proven experience in developing and implementing AI and machine learning models.
  • Proficiency in programming languages such as Python, R, or similar tools.
  • Machine Vision experience is beneficial.
  • Experience working in a manufacturing business with multiple sites.

Job Offer

Day rate of (Apply online only) per day (outside IR35)

3 month contract with potential extension

Remote role, with occasional travel to site

Senior Data & Analytics Consultant
ITSS Recruitment
Northampton
Fully remote
Senior
£60,000 - £70,000

Senior Data & Analytics Consultant - Fully Remote - Microsoft partner

£60,000 - £70,000 + Bonus and the chance to have shares in the business as they move to an employee-owned model.

A well-established MS partner, experiencing growth through a very strong project pipeline, is looking to add a Senior Data & Analytics Consultant to their team.

The Senior Data & Analytics Consultant will play a key role in helping their customers turn data into meaningful insight and measurable impact.

In this role, you will design and deliver robust data and analytics solutions that strengthen data foundations, enable AI-driven initiatives, and support better operational and strategic decision-making.

You will design and deliver data solutions within the Microsoft ecosystem, including Azure Data Services, Microsoft Dataverse, Power BI, and Dynamics 365 integrations.

Main Duties and Responsibilities:

  • Data Strategy & Advisory
  • Data Governance & Quality
  • Data Analytics & Insights
  • Data Management & Warehousing
  • ETL & Data Integration
  • Data Architecture & Engineering

Knowledge and Experience

  • 5+ years' strong experience in data analytics, engineering, or business intelligence within a consultancy or technology services environment
  • Proven expertise in data modelling, warehousing, and relational database design
  • Experience designing and delivering cloud-based data solutions, particularly within Microsoft Azure
  • Knowledge of ETL processes and data integration across multiple systems
  • Experience integrating Microsoft technologies (e.g., Dynamics 365, Azure Data Services, Power BI) with third-party platforms
  • Understanding of data governance, security, and compliance principles
  • Hands-on experience with SQL and one or more data programming languages
  • Strong understanding of Microsoft Dataverse data structures and integration patterns
  • Experience delivering complex data projects in customer environments
  • Experience working with and understanding public sector data, security and governance

Technical Abilities

  • Proficiency in Microsoft Power BI, SQL, and Azure Data Services
  • Experience with data integration tools, APIs, and middleware
  • Knowledge of ETL tools such as Azure Data Factory, SSIS, or similar
  • Experience with data warehouses, data lakes, and modern data platforms
  • Familiarity with advanced analytics or machine learning concepts
  • Experience with Microsoft Power Platform for process automation
  • Understanding of statistical analysis, reporting, and visualisation techniques

This role will be fully remote with the occasional travel to client site.

You must be eligible for SC clearance due to the nature of their clients, so you will need to be a UK national with a minimum of 5 years' consecutive UK residency.

Please reach out to me on (phone number removed) or (url removed) to find out more about the Senior Data & Analytics Consultant position and get your application moving!

Senior Machine Learning Engineer
DigitalGenius
London
Fully remote
Mid - Senior
£90,000 - £100,000

DigitalGenius is a venture-backed artificial intelligence company bringing practical applications of deep learning and AI to some of the largest ecommerce customer service operations in the world, as well as high-growth companies. We're a dedicated team of thoughtful and hard-working people committed to transforming customer service through the application of artificial intelligence.

Role

The continuous improvement of our products and the range of innovation projects we are committed to require us to scale our Machine Learning team. We are searching for a Machine Learning Engineer to join our core AI Team. This is a highly technical role for an outstanding individual who can take ownership of projects and start new initiatives. As a Machine Learning Engineer at DigitalGenius, you will be responsible for building and improving our Natural Language Processing, Image Recognition, and Recommendation solutions to maximise the product's performance for our customers. Your time will be divided between improving the core product, researching and developing new ML applications, and working closely with our clients. This is an excellent opportunity for those with strong programming capabilities and a deep understanding of AI. We are looking for someone with complementary skills that extend beyond NLP, preferably somebody with experience in ecommerce. The AI team at DigitalGenius owns all ML-related research, implementation and maintenance. In practice, this means keeping up to date with state-of-the-art research, data analysis, and developing scalable production services and infrastructure.
Responsibilities

  • Proactive approach with team members and clients
  • Continuous improvement of core AI services
  • Take ownership of the services within your expertise
  • Contribute to ongoing innovation R&D projects
  • Implement and maintain ML infrastructure

Requirements

  • Degree in a relevant field with 3+ years of industry experience
  • Strong technical skills: Python, production APIs, Infrastructure as Code, AWS or another cloud provider
  • Deep understanding of Natural Language Processing / Generative AI / Image Recognition
  • Extensive experience with machine learning techniques and algorithms such as supervised and unsupervised learning, predictive modelling and statistics
  • Experience with MLOps
  • Excellent organisation skills, working independently and the ability to deliver results to deadlines
  • A proactive, innovative, pragmatic approach to problem-solving and an ability to think critically and objectively
  • Good customer-facing skills and the ability to communicate technical concepts to technical and non-technical audiences
  • Experience in the ecommerce space

Benefits

  • Competitive salary
  • Generous vacation time (25 days of annual leave)
  • Yearly "Reset Week" in addition to annual leave allowance
  • Freedom to experiment with your own ideas
  • Environment to develop your skills without bureaucracy or red tape
  • Monthly fitness stipend of $210 or fully paid Third Space membership

We are an equal opportunity employer and value diversity at our company. We do not discriminate based on race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Frequently asked questions
What types of remote Data Engineer jobs are available on Haystack?
Haystack features a wide range of remote Data Engineer positions, including roles focused on data pipeline development, ETL processes, data warehousing, cloud data engineering, and real-time data processing across various industries.

How do I apply for a remote Data Engineer job on Haystack?
You can browse remote Data Engineer job listings on Haystack, create a profile, upload your resume, and apply directly through the platform to employers offering remote positions.

What employment types are offered for remote Data Engineer roles?
Haystack offers a variety of remote Data Engineer roles including full-time, part-time, contract, and freelance opportunities. You can filter job listings by employment type to find the best fit.

What skills are commonly required for remote Data Engineer jobs?
Common skills for remote Data Engineer jobs include proficiency in SQL, Python or Java, experience with cloud platforms like AWS or Azure, knowledge of big data tools such as Hadoop or Spark, and expertise in data modeling and ETL frameworks.

Does Haystack offer resources to help prepare for remote Data Engineer interviews?
Yes, Haystack offers interview tips, sample questions, and career advice specifically geared towards Data Engineers seeking remote positions to help you prepare effectively.