Remote Data Engineer Jobs
Overview
Find top remote Data Engineer jobs with Haystack, your go-to IT job board for flexible, work-from-anywhere opportunities. Explore the latest openings in data engineering, build scalable data pipelines, and work with cutting-edge technologies—all from the comfort of your home. Start your remote Data Engineer career today!
Infrastructure Data Engineer (Kafka) - Outside IR35 - Remote
Exalto Consulting ltd
London
Fully remote
Mid - Senior
£130,446/day
RECENTLY POSTED

Infrastructure Data Engineer - Kafka Focus | Fully Remote | Outside IR35
Working pattern: Sunday to Thursday, supporting a team based in Israel

Exalto Consulting is supporting a client looking to appoint an experienced Infrastructure Data Engineer with strong Apache Kafka expertise. This is a fully remote contract role for someone who has hands-on experience running and improving Kafka in live production environments. The focus is not on initial setup alone: it is on keeping a high-volume streaming platform reliable, scalable, and well tuned over time.

You would be joining a team responsible for maintaining and developing core data infrastructure that supports a range of business-critical use cases. The work includes improving platform performance, strengthening reliability, and helping internal teams make effective use of streaming capabilities.

What the role involves

You will take ownership of the day-to-day operation and improvement of Kafka infrastructure in production. That includes identifying performance issues, tuning configurations, resolving incidents, and building out the surrounding platform components needed to support a dependable streaming environment. The role also involves close collaboration with engineering, data, and analytics teams, so it is important that you are comfortable working across functions and helping others use the platform effectively.

Key responsibilities

  • Operate, maintain, and optimise Kafka clusters in production environments
  • Design and improve Kafka infrastructure to support a range of streaming use cases
  • Tune Kafka settings including partitions, replication, retention, and throughput
  • Monitor platform performance and address bottlenecks, instability, and capacity issues
  • Build and support related components including Kafka Connect, Schema Registry, and ELK
  • Develop internal tooling and microservices to support self-service platform use
  • Improve monitoring, alerting, and observability using Prometheus and Grafana
  • Investigate production incidents, carry out root cause analysis, and make preventative improvements
  • Work with engineering, data, and analytics teams to ensure reliable data delivery

What we are looking for

  • Proven experience running Kafka in live production environments
  • Strong understanding of Kafka internals and the trade-offs involved in scaling and performance tuning
  • Experience with Kafka Connect and Schema Registry
  • Good scripting and automation skills, ideally using Python
  • Experience with Elasticsearch and Kibana
  • Strong knowledge of Linux environments, shell scripting, and system performance tuning
  • Experience with Docker and Kubernetes, or similar container and orchestration tooling
  • Experience with CI/CD, Git, and infrastructure as code tools such as Terraform or Ansible
  • A solid understanding of distributed systems and streaming architectures
  • Experience supporting platforms where availability, resilience, and scale are important

Desirable experience

  • Exposure to stream processing tools such as Apache Flink
  • Experience with cloud platforms including AWS, Azure, or GCP
  • Knowledge of hybrid environments
  • Experience with RBAC, multi-tenant systems, or usage metering
  • Experience with MSSQL or other relational databases

Working arrangement

This role is fully remote and sits outside IR35. The team is based in Israel, so the working pattern is Sunday to Thursday. There is a two-hour time difference from the UK, and candidates should be comfortable working in line with that team structure.

Important note

This role requires genuine hands-on experience operating and optimising Kafka in production. Candidates whose experience is mainly limited to implementation, provisioning, or setup without ongoing production ownership are unlikely to be the right fit.

To find out more, please get in touch with Exalto Consulting with a copy of your latest CV.
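As an illustration of the partition and throughput tuning the listing describes, here is a common back-of-envelope partition sizing calculation. The per-partition throughput figures are assumptions you would measure on your own cluster, not Kafka defaults:

```python
import math

def partition_count(target_mb_s: float, per_partition_produce_mb_s: float,
                    per_partition_consume_mb_s: float) -> int:
    """Back-of-envelope Kafka partition sizing: enough partitions that
    neither the produce side nor the consume side becomes the bottleneck
    at the target aggregate throughput."""
    produce = math.ceil(target_mb_s / per_partition_produce_mb_s)
    consume = math.ceil(target_mb_s / per_partition_consume_mb_s)
    return max(produce, consume)

# e.g. 300 MB/s target, ~30 MB/s per partition producing, ~20 MB/s consuming
print(partition_count(300, 30, 20))  # -> 15
```

In practice the consume side is often the limiting factor, as here, which is why tuning exercises of the kind the role describes measure both directions before repartitioning.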

Lead Developer - Applied AI Engineering
HM Treasury
UK
Remote or hybrid
Senior
Private salary
RECENTLY POSTED

Do you have a strong background in software development? Would you like to become part of a growing community of data and digital professionals, champion excellence in AI, data, and digital, and help grow data and digital skills across HM Treasury? If so, we would love to hear from you!

About the Team

The Chief Secretary to the Treasury (CST) has outlined the government's ambition to rewire the state (see the Institute for Government speech). Central to this vision is more collaboration and transparency between departments and the centre of government on spending, requiring a greater level of sharing and harmonising of key data sets (finance, outcome, and performance data).

To meet the spending challenges of the future, HM Treasury is committed to developing an integrated data solution which will enable a single version of the truth, providing real-time, standardised data on finance, outcomes and performance. This will allow greater autonomy for departments, more open conversations between departments and HMT, and more effective, data-driven decision making, ultimately leading to better outcomes for the public.

The Finance and Performance Data Integration Service (FPDIS) is a key part of the Government's ambition to rewire the state. The new compact between departments and the centre requires more and better data, and this programme is the means by which the Treasury will get that data. The team is building, and every role will bring vital perspectives and insight to the programme. We are currently developing our approach, business case and early thinking about what the future could look like. You would be joining us at the start of an exciting journey.

About the Job

The key responsibilities of the post holders will be:

Technical Leadership

  • Lead the end-to-end technical design, development, and implementation of AI solutions. This involves development and maintenance of analytic products in our preferred tech stack (Python, Plotly Dash and Azure) and experimentation with and use of other applications.
  • Provide technical guidance and mentoring to data engineers, analysts and non-technical staff working on the broader FPDIS programme.
  • Document AI architectures, models, and agent behaviours to ensure transparency, governance, and continuous improvement.

Solution Design & Delivery

  • Lead technical delivery of an experimental Agile project to extract finance and performance data from PDFs and other documents.
  • Identify opportunities to apply AI to optimise public spending business processes, improve user experiences and deliver public value.
  • Integrate AI capabilities with enterprise platforms and services, including low-code environments, APIs, data pipelines, and cloud-based data integration platforms.

Technology Evaluation

  • Assess and select appropriate AI models, platforms, and tools (e.g. OpenAI, Copilot Studio).
  • Stay current with emerging AI technologies, particularly developments in agent-based systems, and evaluate their applicability to FPDIS.

Collaboration & Partner Engagement

  • Work closely with partners to translate business needs into AI-enabled solutions, incorporating agent-based architectures where appropriate.
  • Support the training and upskilling of HMT staff in AI literacy, responsible use of intelligent systems, and adoption of AI-enabled tools.

Governance & Compliance

  • Ensure all AI solutions are ethical, secure, and aligned with HMT's strategic objectives, regulatory obligations, and wider DSIT guidance.

About You

  • Technical leadership of applied AI projects
  • Designing AI solutions to business problems
  • Technical application of LLMs in a digital product
  • Working as a team

Some of the Benefits Our People Love!

  • 25 days annual leave (rising to 30 after 5 years), plus 8 public holidays and the King's birthday (unless you have a legacy arrangement as an existing Civil Servant). Additionally, we operate flexitime systems, allowing employees to take up to an additional 2 days off each month.
  • Flexible working patterns (part-time, job-share, condensed hours)
  • Generous parental and adoption leave packages
  • Access to a generous Defined Benefit pension scheme with employer contributions of 28.97%
  • Access to a cycle-to-work salary sacrifice scheme and season ticket advances
  • A range of active staff networks, based around interests (e.g. analysts, music society, sports and social club) and diversity

For more information about the role and how to apply, please follow the apply link. If you need any reasonable adjustments to take part in the selection process, please tell us about this in your online application form, or speak to the recruitment team.
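As a hedged sketch of the document-extraction project this listing describes: a real pipeline would first convert PDFs to text with a PDF library, and the label format and figures below are invented for illustration. This simplified stand-in scans already-extracted text for labelled spending figures:

```python
import re

# Hypothetical line format: "<label>: £<amount>m", e.g. "Capital DEL: £340.5m"
LINE = re.compile(r"(?P<label>[A-Za-z &]+):\s*£(?P<amount>[\d,]+(?:\.\d+)?)m")

def extract_figures(text: str) -> dict[str, float]:
    """Return {label: amount in £m} for every labelled figure found."""
    return {m["label"].strip(): float(m["amount"].replace(",", ""))
            for m in LINE.finditer(text)}

sample = "Resource DEL: £1,250m\nCapital DEL: £340.5m"
print(extract_figures(sample))
# -> {'Resource DEL': 1250.0, 'Capital DEL': 340.5}
```

Real departmental documents are far messier than this, which is presumably why the listing frames the project as experimental and AI-assisted rather than purely rule-based.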

Data and Systems Analyst
Ashley Community Housing
Bristol
Remote or hybrid
Mid
£28,000/year
RECENTLY POSTED

Bristol

Who We Are

ACH is a social enterprise with a clear vision, dedicated to empowering refugees and migrants in the UK to lead self-sufficient and ambitious lives. We bring together a diverse team of strategists and researchers, driven by their own lived experiences, to provide tailored integration services.

Why We Do What We Do

Our mission goes beyond individual support; we actively challenge and disrupt the systems that perpetuate inequalities in our society. With a proven track record of delivering effective support services, in 2024, we helped house over 1,400 individuals and provided support to over 500 of our service users, helping them to achieve their personal goals and lead fulfilling lives in their new country.

We are now looking for a Data and Systems Analyst to join us on a full-time basis, for an 18-month fixed-term contract.

Our Commitment to You

  • Salary of £29,500 per annum
  • 25 days annual leave, plus bank holidays
  • Pension
  • Flexible working
  • Employee Recognition Programme
  • Training and development opportunities
  • Employee Assistance Programme
  • Social gatherings and staff retreats
  • A fully stocked staffroom!

This is a purpose-driven opportunity for a data and systems professional with strong Power BI and reporting expertise to join our mission-focused organisation.

You'll get the chance to see your work make a tangible difference, shaping data across our organisation so leaders and teams can make better decisions that directly support refugees and migrants to thrive.

Alongside meaningful work, you'll benefit from an employment package designed to give you the flexibility, security and support to do your best work while being part of a genuinely people-centred organisation.

What You'll Be Doing

As a Data and Systems Analyst, you will turn organisational data into clear insight while shaping and supporting the IT systems that help us deliver our objectives.

Working closely with managers, team leaders and the IT team, you will develop and maintain key performance reports and data visualisations to support performance management and decision-making.

You'll analyse business processes, capture new system requirements, and help design reporting, forms and dashboards that improve how data is recorded, understood and used across the organisation.

You will also act as a champion for data integrity and effective system use, supporting staff at all levels to build confidence with reporting tools and ensuring we meet regulatory and governance expectations.

Additionally, you will:

  • Lead on data integrity across all core IT systems
  • Produce monthly and quarterly KPI reports for managers and the Board
  • Provide training to staff on data entry, reporting and system use
  • Develop data quality reports to support GDPR and regulatory compliance
  • Build and maintain a central data warehouse for business intelligence and self-service reporting
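The data-quality and GDPR reporting duties above might look something like this minimal sketch. The field names (`email`, `consent_date`) are hypothetical, not ACH's actual schema:

```python
import re

# Loose email shape check, illustrative only
EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_report(records: list[dict]) -> dict[str, int]:
    """Count missing and malformed values per field across a record set."""
    issues = {"missing_email": 0, "invalid_email": 0, "missing_consent": 0}
    for r in records:
        email = r.get("email")
        if not email:
            issues["missing_email"] += 1
        elif not EMAIL.match(email):
            issues["invalid_email"] += 1
        if r.get("consent_date") is None:
            issues["missing_consent"] += 1
    return issues

records = [
    {"email": "a@example.org", "consent_date": "2025-01-02"},
    {"email": "not-an-email", "consent_date": None},
    {"email": None, "consent_date": "2025-03-04"},
]
print(quality_report(records))
# -> {'missing_email': 1, 'invalid_email': 1, 'missing_consent': 1}
```

A monthly report built this way gives managers a per-field issue count rather than a vague "data quality is poor", which is what makes the compliance bullet actionable.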

What We're Looking For

To be considered as a Data and Systems Analyst, you will need:

  • Experience delivering day-to-day troubleshooting support and developing new solutions
  • Experience working with suppliers and customers to implement new applications or new modules within existing applications
  • Experience developing dashboards to produce KPI reporting
  • Specialised technical skills, including data visualisation techniques, report building, analytical skills, and knowledge of displaying information visually
  • Competency in the use of Microsoft tools such as Power BI, PowerApps, Power Automate and Excel, as well as Salesforce
  • The ability to manipulate data and produce detailed Business Intelligence

The closing date for this role is 5th April 2026.

Other organisations may call this role Data Analyst, Systems Analyst, Business Intelligence Analyst, BI Analyst, Reporting Analyst, Data and Reporting Analyst, Performance Analyst, Data Insights Analyst, or Business Systems Analyst.

Webrecruit and ACH are equal opportunities employers, value diversity and are strongly committed to providing equal employment opportunities for all employees and all applicants for employment. Equal opportunities are the only acceptable way to conduct business and we believe that the more inclusive our environments are, the better our work will be.

So, if you want to grow your career while contributing to life-changing work as a Data and Systems Analyst, please apply via the button shown. This vacancy is being advertised by Webrecruit. The services advertised by Webrecruit are those of an Employment Agency.

AWS Architect
Develop
London
Remote or hybrid
Mid - Senior
£800/day - £900/day
RECENTLY POSTED

AWS Architect (£800-£900 per day, 12-month contract)

We're partnering with a leading global organisation in the financial data space to find a hands-on AWS Architect. This is a high-impact role focused on building a cutting-edge, cloud-native research environment for advanced analytics, quantitative research, and AI-driven innovation. If you thrive at the intersection of cloud architecture, data science platforms, and AI services, this is an opportunity to shape a next-generation research ecosystem used by top-tier data scientists and researchers.

The Role

As the AWS Architect, you will lead the end-to-end design and delivery of a scalable, secure, and highly connected research platform on AWS. This environment will empower teams to work with large, complex datasets and accelerate experimentation and insights.

Key Responsibilities

  • Architect and deliver a cloud-native research platform on AWS
  • Design environments supporting Jupyter Notebooks and Amazon SageMaker
  • Integrate Amazon Bedrock and emerging AI services to enhance research workflows
  • Build scalable data access and connectivity frameworks for structured and unstructured datasets
  • Develop Model Context Protocol (MCP) services to enable seamless data and notebook integration
  • Implement semantic search and discovery capabilities (e.g. OpenSearch)
  • Define best practices across architecture, security, and scalability
  • Collaborate closely with engineering, data, and research stakeholders

What We're Looking For

  • Deep, hands-on AWS expertise (architect-level with strong delivery experience)
  • Proven track record building data science platforms or research environments
  • Strong grounding in data engineering principles (pipelines, governance, data access)
  • Experience integrating AI/ML services, including Amazon Bedrock
  • Ability to design and expose APIs/services for data and notebook interoperability
  • Experience working in enterprise or regulated environments

Nice to Have

  • Exposure to emerging AWS AI tools (e.g. Amazon Q, Trainium, Bedrock Agents)
  • Experience with agentic AI or MCP-compatible services
  • Knowledge of OpenSearch or similar discovery tools
  • Background in financial services, capital markets, or data-heavy industries

Why Apply?

  • Work on a greenfield, high-impact platform
  • Collaborate with leading data scientists and researchers
  • Shape the future of AI-enabled research environments
  • Competitive compensation and flexible working
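The search-and-discovery capability this listing mentions (OpenSearch in production) can be illustrated, without the OpenSearch API or a running cluster, by a toy term-overlap ranking over dataset descriptions. The dataset names and descriptions are invented:

```python
def rank(datasets: dict[str, str], query: str) -> list[str]:
    """Rank dataset names by how many query terms their descriptions share;
    a crude stand-in for the relevance scoring a search engine provides."""
    q = set(query.lower().split())
    scored = [(len(q & set(desc.lower().split())), name)
              for name, desc in datasets.items()]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

datasets = {
    "equities_ticks": "intraday equity tick prices",
    "macro_rates": "daily macro interest rates",
    "notebooks_index": "jupyter notebook usage logs",
}
print(rank(datasets, "equity prices intraday"))  # -> ['equities_ticks']
```

A real implementation would use OpenSearch's inverted index and BM25 scoring (and possibly vector embeddings for the "semantic" part), but the researcher-facing contract is the same: free-text query in, ranked dataset names out.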

Power BI Developer - FP&A
Robert Walters
London
Remote or hybrid
Mid - Senior
£500/day - £600/day
RECENTLY POSTED

We’re partnering with a leading hospitality and members-club group to hire an experienced Power BI Developer for a 3-month contract. This is a great opportunity to deliver high-impact reporting and shape how the business uses data across membership, F&B, and club operations.

The role sits within the FP&A function, working closely with the Interim CFO, and requires someone who is both technically strong and commercially sharp.

The Opportunity

You’ll be responsible for building a full suite of Power BI dashboards that will become core to the business’s decision-making. This includes reporting across:

  • Membership
  • Food & Beverage
  • Club utilisation

The business is moving towards a more modern data environment, so you’ll be developing reports directly from Microsoft Fabric rather than Excel, with plenty of autonomy to shape the approach.

What You’ll Be Doing

  • Designing, building and delivering end-to-end Power BI dashboards
  • Creating robust data models and managing data flows
  • Developing reports directly from Fabric
  • Working with finance and operational teams to understand reporting needs
  • Turning raw data into clear, actionable insights
  • Ensuring data accuracy, consistency, and best-practice governance

About You

We’re looking for someone with:

  • Strong commercial Power BI development experience
  • Hands-on experience building reports from Fabric (must-have)
  • Solid grounding in data modelling and database concepts
  • Confidence working with finance teams and operational stakeholders
  • Ability to work independently and deliver within tight timeframes

Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates

Snowflake Data Architect
Hirexa Solutions UK
Hemel Hempstead
Remote or hybrid
Senior - Leader
£400/day
RECENTLY POSTED

Experience

12+ years of experience in Data Engineering, Data Warehousing, Cloud Data Platforms, and Enterprise Analytics solutions, with strong expertise in modern cloud data architectures.

Job Summary

We are seeking an experienced Data Architect with strong expertise in Snowflake on Amazon Web Services and DBT to design, architect, and optimize scalable enterprise data platforms.

The role involves defining data platform architecture, governance standards, and scalable data transformation frameworks, while ensuring high performance, security, and cost efficiency. The architect will provide technical leadership to data engineering and analytics teams and ensure the platform supports enterprise reporting, advanced analytics, and AI/ML initiatives.

The ideal candidate should also have exposure to AI/ML data platforms and experience in the hospitality domain, supporting systems such as reservations, guest management, and operational analytics.

Key Responsibilities

  • Define and lead the architecture and design of enterprise data platforms using Snowflake on AWS.
  • Architect scalable data ingestion frameworks for integrating multiple source systems into the cloud data platform.
  • Design and govern data transformation frameworks using DBT.
  • Define and enforce data modelling standards including dimensional modelling, star schema, and enterprise data models.
  • Lead architecture reviews and solution design discussions for new data initiatives.
  • Optimize Snowflake performance, workload management, and cost governance.
  • Establish data governance frameworks including access control, data security, and compliance standards.
  • Design and support AI/ML-ready data architecture for advanced analytics and predictive modelling.
  • Provide architectural guidance to data engineering, BI, and analytics teams.
  • Design architecture to support data consumption for reporting systems, operational applications, and analytics platforms.
  • Implement automation, orchestration, and scalable pipeline frameworks using tools such as Apache Airflow.
  • Collaborate with business stakeholders and technical teams to align the data platform with enterprise data strategy.
  • Support hospitality analytics use cases, including guest behaviour analysis, booking trends, revenue analytics, and operational reporting.

Required Technical Skills

Data Platform

  • Strong expertise in Snowflake
  • Deep knowledge of Snowflake architecture, performance tuning, data sharing, security, and workload optimization.

Cloud Platform

Strong experience with Amazon Web Services, including:

  • S3
  • IAM
  • AWS Glue
  • Lambda
  • CloudWatch

Data Transformation

  • Strong experience with DBT for enterprise-scale data modelling, testing, and transformation pipelines.

Programming / Query

  • Strong expertise in SQL for data transformation and performance optimization
  • Python (preferred) for automation and data engineering tasks.

Data Engineering

  • Enterprise ETL / ELT pipeline architecture
  • Data warehousing and enterprise data modelling
  • Dimensional modelling (Star Schema, Snowflake Schema)
  • Data pipeline scalability and reliability design.
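The dimensional-modelling requirements above (star schemas, fact and dimension tables) can be sketched as a toy build over in-memory booking records, fitting the hospitality domain the listing names. Field names are illustrative, not the employer's schema:

```python
def build_star(raw: list[dict]):
    """Split raw booking rows into a guest dimension (name -> surrogate key)
    and a fact table that references the dimension by key."""
    guest_dim, facts = {}, []
    for row in raw:
        # Assign a new surrogate key the first time each guest appears
        key = guest_dim.setdefault(row["guest"], len(guest_dim) + 1)
        facts.append({"guest_key": key, "amount": row["amount"]})
    return guest_dim, facts

raw = [{"guest": "Ana", "amount": 120.0},
       {"guest": "Ben", "amount": 80.0},
       {"guest": "Ana", "amount": 65.0}]
dim, facts = build_star(raw)
print(dim)    # -> {'Ana': 1, 'Ben': 2}
print(facts)  # -> [{'guest_key': 1, 'amount': 120.0}, {'guest_key': 2, 'amount': 80.0}, {'guest_key': 1, 'amount': 65.0}]
```

In Snowflake with DBT this split would be expressed as SQL models rather than Python, but the design decision is identical: descriptive attributes live once in the dimension, and the fact table stays narrow and keyed.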

AI / Data Science Exposure

  • Experience supporting AI/ML data pipelines and data preparation for machine learning models.
  • Understanding of predictive analytics, recommendation engines, and customer behaviour analytics.
  • Ability to design AI-ready data platforms for future analytics use cases.

Preferred Skills

  • Experience with Apache Airflow for pipeline orchestration.
  • Knowledge of CI/CD pipelines, DevOps, and Git-based development workflows.
  • Experience with data governance, metadata management, and enterprise data catalog tools.
  • Experience with BI tools such as Tableau and Microsoft Power BI.
  • Domain experience in hospitality, travel, or hotel systems, including reservation systems, guest analytics, and operational reporting.

Education

Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, Data Science, or a related field.

Senior SQL Server DBA
Prophet Limited
Coventry
Remote or hybrid
Senior
£60,000
RECENTLY POSTED

This is a chance for a well-qualified, and experienced DBA, to use their wider experience to step into a key role in our core data team. In a fast moving and intellectually challenging environment, our goal is to improve overall user satisfaction through product design and delivery.

The system operates on a complex set of SQL Server databases across many different customer environments (including AWS and Azure). Working closely with the Head of Data Services you will be responsible for a wide scope of database support functions including, design, support, performance, and reporting.

The role requires:

  • Established skills in T-SQL programming.
  • A deep understanding of Microsoft SQL Server fundamentals, including indexes, statistics, query plans, tempdb, installation, configuration, DMVs, and troubleshooting.
  • A good understanding of relational table design, including normalisation, constraints, OLAP vs OLTP, and traditional data warehousing.
  • Experience in critical thinking and troubleshooting.
  • Experience with Azure SQL Database and Azure SQL Managed instance would be beneficial.
  • Experience with Power BI would be beneficial but not essential.
  • Ability to communicate confidently with technical, non-technical, internal, and external personnel.

Data Quality Manager
McCabe & Barton
London
Remote or hybrid
Senior - Leader
£95,000
RECENTLY POSTED

A leading Financial Services organisation undergoing a large-scale data transformation is looking to hire an experienced Data Quality Manager on a permanent basis. The role offers a salary of £95,000 plus a strong benefits package and flexible working.

This role will suit a technically credible Data Quality leader with a genuine passion for data quality, accuracy and trust. You will work closely with data engineers and platform teams to embed pragmatic governance and quality controls into delivery, while influencing stakeholders across the business and possessing a commercial mindset.

This is a hands-on technical leadership role, combining data quality and governance ownership with practical engineering input. You will lead a small team and partner with data engineers and operational SMEs to embed best practice across data quality, governance and data management.

Role remit

  • Own and evolve the data governance framework within an engineering-led environment
  • Define governance standards, guardrails, data contracts and SLAs
  • Partner with Risk, Audit, Data Protection and Legal to meet compliance requirements
  • Work with data engineering teams to embed data quality into pipelines and workflows
  • Provide hands-on guidance on data modelling, reconciliation, metadata and best practice
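The reconciliation work this role covers can be sketched as a minimal source-versus-target check of the kind embedded in pipelines. The tolerance and field choices are assumptions, not the employer's controls:

```python
def reconcile(source: list[float], target: list[float],
              tolerance: float = 0.01) -> dict:
    """Compare row counts and summed amounts between a source extract and
    the loaded target, flagging any drift beyond the tolerance."""
    diff = abs(sum(source) - sum(target))
    return {
        "row_count_match": len(source) == len(target),
        "amount_diff": round(diff, 2),
        "within_tolerance": diff <= tolerance,
    }

print(reconcile([10.0, 20.5, 30.0], [10.0, 20.5, 30.0]))
# -> {'row_count_match': True, 'amount_diff': 0.0, 'within_tolerance': True}
```

Running a check like this after every load, and failing the pipeline when it trips, is what "embedding data quality into pipelines" typically means in practice, rather than auditing data after the fact.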

Experience required

  • Strong background in Data Quality, Data Governance and Data Management within a modern data engineering environment
  • Hands-on experience with cloud data platforms, Azure, SQL, Python and orchestration tools
  • Proven experience embedding data quality controls across data pipelines and ETL transformation workflows
  • Good understanding of modern data architectures and quality control patterns
  • Experience with data profiling, lineage analysis, reconciliation and metadata management
  • Strong stakeholder communication skills with the ability to influence engineering teams

If you are an experienced Data Quality Manager with the required background, please respond with an up-to-date CV for review.

Junior Analytics Engineer
Hillarys HR
Nottingham
Remote or hybrid
Junior
Private salary
RECENTLY POSTED

We’re looking for an experienced Analytics Engineer to help shape our data foundations and deliver high quality, trusted insights across the business. You’ll design and maintain scalable analytics models, ensure data quality, and work closely with teams across Finance, Commercial, Operations, and IT to turn complex requirements into intuitive, reliable datasets.

This role sits at the heart of our analytics function, balancing immediate reporting needs with long-term data architecture, and ensuring our organisation can make confident, data driven decisions.

Key Responsibilities

  • Design, build, and maintain analytics models that convert raw data into trusted, business-ready datasets.
  • Own the full modelling lifecycle (staging → core → marts).
  • Define and maintain consistent metrics, dimensions, and business logic.
  • Implement testing, data quality checks, and SLAs to ensure reliability.
  • Translate business requirements into scalable, well-structured data solutions.
  • Deliver self-service datasets and support reporting/dashboards where needed.
  • Continuously improve analytics engineering processes and standards.
  • Provide robust data foundations to support strategic initiatives, including cost, pricing, and operational analytics.
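The staging → core → marts lifecycle named above can be sketched on in-memory rows. The column names are illustrative, not the employer's schema; in Dataform or dbt each step would be a SQL model rather than a function:

```python
def staging(raw):
    """Clean and type raw source rows (trim, normalise case, cast)."""
    return [{"sku": r["sku"].strip().upper(), "qty": int(r["qty"])} for r in raw]

def core(stg):
    """Conformed core model: deduplicate on the business key."""
    seen, out = set(), []
    for r in stg:
        if r["sku"] not in seen:
            seen.add(r["sku"])
            out.append(r)
    return out

def mart(core_rows):
    """Business-ready aggregate consumed by reporting."""
    return {"total_qty": sum(r["qty"] for r in core_rows)}

raw = [{"sku": " ab1 ", "qty": "3"}, {"sku": "AB1", "qty": "3"},
       {"sku": "cd2", "qty": "5"}]
print(mart(core(staging(raw))))  # -> {'total_qty': 8}
```

The layering matters because each stage has one job: staging owns cleaning, core owns conformance and deduplication, and marts own business logic, so a metric definition changes in exactly one place.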

What We Are Looking For

We’re seeking someone with strong analytical and technical capability, who enjoys solving complex data problems and can confidently partner with stakeholders across the business. You’ll bring the ability to design robust data models, communicate clearly, and ensure outputs are accurate, consistent, and actionable.

  • A degree in Computer Science, Engineering, Maths, Economics, or Data Science is desirable.
  • Advanced SQL skills and experience with analytical/dimensional modelling.
  • Practical experience with modern ELT tools (Dataform/dbt) and Git workflows.
  • Knowledge of cloud data warehouses, ideally BigQuery.
  • Familiarity with orchestration tools such as Airflow.
  • Experience designing BI-friendly datasets, particularly for Power BI.
  • Strong communication skills, able to translate business needs into data models.
  • Experience with enterprise systems such as SAP BW/HANA.
  • A track record of building reusable datasets or maintaining a metrics/semantic layer.
  • Basic Python for automation or analytics engineering tooling.
  • Experience collaborating in cross-functional analytics or data product teams.

Why Join Us

  • Help shape the future of analytics at a global, design-led market leader.
  • Work collaboratively across Finance, Commercial, Operations, IT, and more.
  • Have real influence on how data is defined, trusted, and used across the organisation.
  • Join a supportive, innovative culture known for empowering teams and encouraging growth.
  • Develop your skills with modern cloud, modelling, and automation technologies.

We understand there’s no one size fits all approach. We’re proud to offer an inclusive workplace where every colleague feels valued, supported, and empowered to be their true self. If you require any reasonable adjustments throughout the recruitment process, please let us know and we’ll be happy to accommodate.

Everyone who applies will receive a response.

Incorta Developer/Data Engineer (Oracle EBS)
Skillsbay
Not Specified
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED

Contract Details

  • Location: Remote
  • Duration: 5 months
  • OUTSIDE IR35

Role Title

Incorta Developer/Data Engineer

Overview

We are looking for an experienced Incorta Developer/Data Engineer to support reporting and analytics delivery for an organisation using Oracle E-Business Suite.

The role will focus on extracting, modelling, and visualising data within Incorta, enabling real-time insights across finance, HR, and operational datasets.

Key Responsibilities

  • Develop and maintain data models within Incorta
  • Extract and ingest data directly from Oracle EBS tables
  • Build dashboards, reports, and visualisations for business users
  • Optimise performance of large datasets within Incorta
  • Work closely with Finance/HR/Supply Chain stakeholders
  • Translate business requirements into technical reporting solutions
  • Ensure data accuracy, integrity, and governance
  • Support ad-hoc reporting and analytics requests

Core Skills & Experience

Essential

  • Strong hands-on experience with Incorta (development + modelling)
  • Proven experience working with Oracle EBS data structures
  • Strong SQL skills (querying large datasets)
  • Experience building dashboards and analytics solutions
  • Understanding of ERP data (Finance, Procurement, Projects etc.)
  • Experience with data ingestion and transformation concepts

Desirable

  • Experience with Oracle Cloud (Fusion)
  • Knowledge of BI tools (eg Power BI, Tableau)
  • Data warehousing background (nice to have, not essential)
  • Experience in Higher Education/Public Sector

Tech Environment

  • Oracle EBS (source system)
  • Incorta (analytics layer)
  • SQL
  • Potential integration with other BI tools

Salesforce Data Engineer - 12 month FTC
Simplyhealth
Multiple locations
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED

We're not just your average health company; we're aiming to revolutionise access to healthcare in the UK by offering innovative health and wellbeing solutions that are affordable, accessible, and effective. From preventive care to comprehensive medical support, we aim to empower individuals to take charge of their health, inspiring them to make the most of their wellbeing. Added to that, we're the first health insurer in the UK to be awarded B Corp status in recognition of our significant achievements in sustainability, in addition to our ambitious environmental and social responsibility goals.

As a Salesforce Data Engineer, you'll be at the heart of our data-driven transformation. You will help enhance our Salesforce ecosystem by ensuring high-quality data flows into and across the platform, enabling richer insights, personalised customer engagement and more intelligent use of Salesforce capabilities, including emerging AI features.

Reporting to our Data Delivery & Governance Manager, you'll work closely with CRM, Product, Tech, and Data colleagues. You'll design and build data integrations, strengthen governance frameworks, and support the development of audience and segmentation capabilities that underpin personalised communications at scale.

Key responsibilities:

Tasked with the integration of data into the Salesforce platform, including but not limited to Marketing Cloud, in accordance with the broader data strategy of the organisation.

Responsible for the high-quality standard of data imported into the Salesforce platform.

Charged with the implementation of robust governance measures for data within the Salesforce platform.

Committed to collaborating with business stakeholders to ascertain their data needs specific to Salesforce.

Engaged in partnership with the Data Architect to contribute to the development of robust architectural designs and principles.

Utilising insights, behavioural and personal data to create and maintain prospect and customer audiences, to facilitate the delivery of personalised comms and messaging.

Identifying, scoping, and working in partnership with CRM, Product and Tech colleagues to prioritise and deploy MarTech and data enhancements.

Evolving our use and adoption of the Salesforce Platform including new channels and AI capability, together with broader marketing technology and data maturity, to enable personalisation at scale.

Providing support, mentoring and advice to colleagues in other areas of the business on best practice in the Marketing Cloud platform.

Maintain, build and own native Salesforce business processes

Perform simple and complex daily administration tasks such as data manipulation, data loading, merging of duplicate records, managing custom fields, objects, layouts, list views, security configuration, complex workflows/Process/Flows, and overall system configuration

Proactively identifying opportunities where marketing data and/or technology can improve our CRM communications, reduce manual tasks, and improve the customer experience


BI / Datawarehouse Lead / Manager Home / Prestigious Client
Integrity Recruitment Solutions Ltd
Birmingham
Remote or hybrid
Senior
Private salary
RECENTLY POSTED

Prestigious in their sector, my client has an excellent market image and continues to make a significant impact. They are heavily involved in an ambitious business/systems transformation, and we have an excellent opportunity for an established Data Warehouse Lead to join their team. The successful candidate will play a lead role in the technical development and management of their global Data Warehouse and supporting team, and will have a proven background in full-cycle development of a corporate, Microsoft-based Data Warehouse.

Desired skills: Design, Build, Test, Azure, Fabric Lakehouse

We are looking to recruit a high-calibre resource, so the client is happy to consider candidates from both contract and permanent backgrounds. Please forward your most recent CV to be considered for telephone screening.

SQL / BI / ETL / DATAWAREHOUSE / AZURE / FABRIC / SPARK / PYSPARK / LEAD / MANAGER / HOME / REMOTE / MIDLANDS / BIRMINGHAM

AI, Automation and Integration Engineer
SALVATION ARMY HOUSING ASSOCIATION
Bolton
Remote or hybrid
Mid - Senior
£55,000
RECENTLY POSTED

About The Role
This is a brand-new role with a big remit and even bigger opportunity.
As our AI, Automation and Integration Engineer, you'll be at the forefront of how Salvation Army Homes uses technology to work smarter, faster and more effectively. This isn't about maintaining the status quo - it's about designing the future.
You'll lead the charge on automation, system integration and the responsible adoption of AI, helping us move towards streamlined processes, joined-up data, and genuinely intelligent digital services. From low-code automation and cloud integration to exploring practical AI use cases, you'll have the space and backing to experiment, innovate and deliver real impact.
Working within our Digital, Data and ICT team, you'll collaborate closely with colleagues across the organisation to turn ideas into working solutions. You'll help create a single version of the truth across our systems, reduce duplication and manual effort, and enable better decision-making through clean, connected data.
Because this role is new, you'll play a key part in shaping how it operates - setting standards, defining approaches, and influencing how we use emerging technologies across the organisation. If you're excited by greenfield work, modern platforms and meaningful outcomes, this is a rare chance to make a role your own.

About The Candidate
You're a hands-on technologist with a strong track record of delivering automation, integration and modern digital solutions in real-world environments - and you're ready to step into a role where you can shape both the technology and the approach.
You'll bring proven experience of designing and delivering process automation, ideally using the Microsoft Power Platform (Power Automate, Power Apps and Power BI), alongside experience building and supporting integrations between business-critical systems. You're comfortable working across data, workflows and APIs to reduce manual effort and create seamless, joined-up services.
You'll have a strong technical foundation, including:

  • Experience working with cloud platforms, particularly Microsoft Azure
  • Solid SQL Server and database skills
  • Experience developing solutions using modern development languages and tools
  • A good understanding of data integration, data quality and governance principles

You don't just build solutions - you think about how they're used, governed and scaled. You understand the importance of security, compliance and responsible data use, and you can balance innovation with control. Experience working in complex or multi-system environments is important, as is the ability to document, standardise and improve what you deliver.
You're also excited by what's next. You may already have experience applying AI concepts in a business context, or you may be keen to develop this further - but either way, you're motivated to explore how AI and emerging technologies can be applied practically, ethically and at scale to improve services and decision-making.
Just as importantly, you're a strong collaborator and communicator. You can translate complex technical ideas into plain English, influence stakeholders at all levels, and work closely with analysts, data specialists and business teams to turn ideas into delivered outcomes. You're organised, proactive, and comfortable managing multiple priorities in a fast-moving, evolving environment.
Above all, you're motivated by the opportunity to build something new, take ownership of a greenfield role, and play a leading part in an organisation's journey towards automation, integration and AI-enabled services - while staying aligned with strong values and a clear social purpose.
The benefits on offer
In return for helping to transform lives, well give you access to some great benefits. These include:

  • 26 days annual leave rising to 31 days
  • An extra day off on your birthday
  • A High Street discount scheme (great savings both on and off-line)
  • Pension with life assurance
  • Discounted private medical insurance
  • Loans available for financial emergencies
  • Occupational Sick Pay
  • A full Induction package and training relevant to the role
  • Support to learn and develop your career

About The Company
A registered social landlord and one of the leading providers of supported housing in the UK, Salvation Army Homes is dedicated to transforming lives by providing accommodation and support for some of the most vulnerable members of society - mainly people with complex needs and/or experiencing homelessness.
Our aim is to work with individuals to build on their strengths, creating person-centred, individualised strategies and plans that transform lives, support recovery and enable positive behaviour. In order to succeed, however, we need the right people in place. Our workforce is one of our greatest assets, but only by recruiting the very best can we continue to deliver comprehensive, good-quality housing, support and resettlement services to our residents. That's where you come in.
As an equal opportunities employer, Salvation Army Homes is committed to the equal treatment of all current and prospective employees and does not condone discrimination on the basis of age, disability, sex, sexual orientation, pregnancy and maternity, race or ethnicity, religion or belief, gender identity, or marriage and civil partnership. We invite and welcome applications to apply for Salvation Army Homes opportunities without concern of bias or discrimination.
We reserve the right to close this vacancy early if we receive sufficient applications for the role. Therefore, if you are interested, please submit your application as early as possible.

Lead Data Platform Engineer
Digital Skills Ltd
Not Specified
Fully remote
Senior
£100,000 - £110,000
RECENTLY POSTED
+10
  • Remote (UK-based)
  • SC Clearance Required (British Citizen)
  • Permanent - Up to £110,000

We are looking for a hands-on Lead Data Platform Engineer to join a high-impact programme delivering scalable, real-time data solutions. This role offers the opportunity to work on complex, mission-critical systems using modern technologies across streaming, cloud, and distributed data platforms.

The Role
As a Lead Data Platform Engineer, you will design and deliver batch and real-time data pipelines that are highly available, low-latency, and scalable.

You will collaborate with cross-functional teams to shape solutions, influence architecture, and ensure data is transformed into reliable, consumable outputs.

Key Responsibilities

  • Design, build, and deploy data pipelines and back-end services
  • Work with real-time streaming technologies and event-driven architectures
  • Champion engineering best practices including TDD, CI/CD, and clean code
  • Collaborate with architects, analysts, and testers in Agile/BDD environments
  • Take ownership of services in production, ensuring performance and reliability
  • Drive continuous improvement across data platform capabilities
  • Contribute to solution design, estimation, and delivery planning

Required Experience

  • Strong production Java experience
  • Experience with Kafka, Kinesis, or similar streaming technologies
  • Hands-on experience with Spark, Flink, Kafka Streams, or similar
  • Experience with Spring Boot and/or Python
  • Strong experience with AWS (including Lambda and S3)
  • Experience integrating with PostgreSQL, Redis, or similar
  • Solid understanding of data pipelines and data modelling
  • Passion for clean, testable, and maintainable code
  • Experience working in Agile and/or DevOps environments
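
The streaming technologies listed above (Kafka Streams, Flink and similar) all centre on continuous aggregation over event streams. As a rough, framework-free sketch of the core idea - tumbling-window counts over timestamped events, with all names purely illustrative:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping
    windows and count occurrences per key - the kind of aggregation
    a Kafka Streams or Flink job performs continuously."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# → {0: {'click': 2, 'view': 1}, 10: {'click': 1}}
```

A real streaming job does the same grouping continuously and fault-tolerantly over unbounded input; this sketch only shows the windowing arithmetic.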

Desirable Skills

  • Experience with Elasticsearch, OpenSearch, or Solr
  • Familiarity with Terraform, Ansible, or Packer
  • Experience with CI/CD tools such as Jenkins or Drone

If you are interested, please apply or get in touch to discuss further.

Machine Learning Engineer
Sanderson Government & Defence
Surrey
Remote or hybrid
Mid
£35,000 - £80,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

We’re looking for a Machine Learning Engineer to design, build, and deploy data-driven models that solve complex problems and enhance real-world performance. You’ll work across the full ML life cycle, collaborating with data, engineering, and product teams to turn ideas into robust, scalable solutions.

Responsibilities:

  • Develop, train, and optimise machine learning models to support business and technical objectives.
  • Build and maintain data pipelines, ensuring high-quality, reliable data for model development.
  • Deploy ML models into production environments and monitor their ongoing performance.
  • Work closely with engineers and analysts to integrate models into wider systems and applications.
  • Evaluate model accuracy, stability, and scalability, implementing improvements where needed.
  • Contribute to experimentation, research, and continuous improvement across ML workflows.
  • Document processes, methodologies, and model behaviour for technical and non-technical audiences.

Reasonable Adjustments:

Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.

If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

SQL Developer - Contract - £250-300 per day - Outside of IR35
Randstad Technologies Recruitment
London
Fully remote
Mid
£250/day - £255/day
RECENTLY POSTED

SQL Developer - Contract - £250-300 per day - Outside of IR35

I am currently looking for a strong SQL Developer. My London-based client is looking to get someone started immediately.

As a SQL Server/ETL Developer you will have strong SQL scripting skills and the ability to write stored procedures from scratch. The client needs to upload/import over 400 files on a daily basis with no delays.

Location: Remote
Length: 6 months with strong view to extend
Day Rate: £250 per day - dependent on experience
IR35 Status: Outside of IR35.

Required experience will include:

  • You will need experience developing ETL pipelines from various sources.
  • Experience designing, tuning and maintaining existing ETL Processes.
  • The ability to write stored procedures from scratch using SQL.
  • Experience with Azure Data Platform will be beneficial. Including Data Factory, Databricks and Synapse
  • Strong Excel Skills
  • Strong knowledge of building reports for analytics
  • Strong MSBI Skills: SSIS, SSRS, SSAS, SQL, T-SQL.
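
One pattern that keeps a daily multi-file import like this safely re-runnable is a load manifest: record each file once it has loaded, and skip it on subsequent runs, so a failed run can be retried without double-loading. A minimal sketch, assuming the files arrive in a single directory; the actual load step is injected as a function (in a real SQL Server pipeline it would bulk-insert the file and call a stored procedure):

```python
import json
from pathlib import Path

def load_new_files(incoming_dir, manifest_path, load_fn):
    """Process only files not yet recorded in the manifest, then
    update the manifest so repeated runs never re-load a file."""
    mp = Path(manifest_path)
    manifest = set(json.loads(mp.read_text())) if mp.exists() else set()
    loaded = []
    for f in sorted(Path(incoming_dir).glob("*.csv")):
        if f.name in manifest:
            continue  # already loaded on a previous run
        load_fn(f)    # the real bulk-insert / stored-procedure call
        manifest.add(f.name)
        loaded.append(f.name)
    mp.write_text(json.dumps(sorted(manifest)))
    return loaded
```

Running it twice over the same directory loads each file exactly once; the second run returns an empty list.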

If you are interested in this SQL Server/ ETL Developer role please apply with your most recent CV. Alternatively email me on Jordan co . uk.

SQL Developer - Contract - 250 per day - Outside of IR35

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Data Engineer / Consultant
Change-IT Consulting Ltd
London
Remote or hybrid
Mid - Senior
ÂŁ50,000 - ÂŁ100,000
RECENTLY POSTED

We’re hiring a Data Engineer / Consultant Data Engineer to design and deliver scalable data pipelines and big data solutions for clients.

This is a client-facing role, combining hands-on engineering with stakeholder engagement and solution design.

Key Responsibilities

  • Build and optimise data pipelines, ETL processes, and data platforms
  • Develop solutions using Python, Spark, Kafka, Hadoop or similar
  • Work in Agile teams to deliver production-ready systems
  • Translate business requirements into technical data solutions
  • Engage with stakeholders and communicate technical concepts clearly
  • Deploy solutions using cloud (AWS/Azure/GCP), Docker, Kubernetes, CI/CD

Skills & Experience

  • Experience as a Data Engineer / Big Data Engineer
  • Strong coding in Python, Scala or Java
  • Hands-on with Spark, Kafka, ETL / data pipelines
  • Knowledge of cloud platforms (AWS, Azure or GCP)
  • Familiar with Agile and software engineering best practices
  • Strong communication / stakeholder skills

Nice to Have

  • Consulting or client-facing experience
  • Docker, Kubernetes, DevOps, CI/CD
  • Streaming or real-time data experience
Power BI Developer
Vivo Talent
Not Specified
Fully remote
Senior - Leader
£70,000
RECENTLY POSTED

Lead Power BI Developer (12-month Fixed Term Contract)

Location: Remote

Overview

We are seeking a Lead Power BI Developer to deliver high-quality, scalable reporting solutions within a project-focused environment. This role combines hands-on development, leadership, and strong stakeholder engagement, including interaction with senior leadership and C-suite.

A key focus of this role is to establish and embed Power BI best practices across the organisation, coaching and enabling existing staff to improve capability, consistency, and long-term sustainability of BI solutions.

Key Responsibilities

  • Lead the design and delivery of Power BI dashboards, reports, and data models
  • Translate business requirements into scalable, user-friendly BI solutions
  • Drive best practice in data modelling, DAX optimisation, and report design
  • Own Power BI Service (workspaces, apps, deployment, governance)
  • Implement and manage access control (RLS, Azure AD security)
  • Establish governance, standards, and lifecycle management processes
  • Collaborate with data engineers on pipelines and data model design
  • Lead stakeholder engagement, including presenting to senior leadership and C-suite
  • Support testing, deployment, documentation, and user adoption
  • Mentor team members and provide technical leadership
  • Embed Power BI best practices across the organisation through coaching and enablement

Essential Skills & Experience

  • Proven experience delivering end-to-end Power BI solutions in a lead capacity
  • Strong expertise in Power BI Service, governance, and deployment
  • Deep understanding of access control (RLS, Azure AD groups)
  • Advanced DAX and strong SQL skills
  • Strong data modelling experience (star schema, performance optimisation)
  • Experience with Azure-based platforms and Databricks
  • Strong stakeholder management, including C-suite engagement
  • Excellent communication skills across technical and non-technical audiences

Desirable Experience

  • Databricks experience
  • Data warehousing and architecture design exposure
  • Experience building large-scale enterprise semantic models
  • Microsoft Dynamics Business Central or Navision
  • DevOps practices

If you are interested in the role, please apply or reach out.

Database Administrator
Gigaclear
Abingdon
Remote or hybrid
Mid - Senior
£55,000 - £60,000
RECENTLY POSTED
+1

Are you looking for a role that you can make your own, with the autonomy to set out what you do and how you do it?

As we have grown, we have accumulated a diverse database infrastructure, including PostgreSQL, MariaDB, InfluxDB and MongoDB systems.

The time is ripe for an experienced administrator to take ownership of, mature, upgrade and manage our database servers, while supporting development teams and business operations.

Key Accountability & Responsibilities

Work with teams across the Technology department to install, configure, maintain, and upgrade our database servers across development, testing, and production environments.

Monitor database health and perform routine maintenance tasks including index optimisation, table maintenance, and schema modifications.

Work with Development and Data engineering teams to optimise performance & cost of data pipelines

Manage database capacity planning and storage allocation to ensure adequate resources for current and future needs.

Proactive management of databases, ensuring application and operational performance needs are met.

Implement and maintain high availability solutions including replication, clustering, and failover configurations.

Document database architectures, configurations, procedures and policies.

Implement appropriate security controls to ensure databases and data are protected.

Ensure database backup, validation and disaster recovery capabilities are in place and rehearsed.

Provide insight and recommendation on the adoption and consolidation of database related technologies.

Be part of the 24x7 on-call rota to provide out-of-hours support for our critical systems.

Knowledge & Skills

Proven expertise managing PostgreSQL and MariaDB/MySQL databases

Experience with NoSQL databases.

Experience with cloud database services (AWS).

Deep understanding of relational database concepts, normalisation, and SQL optimisation.

Proficiency in SQL and query optimisation across multiple database platforms.

Experience with database replication, clustering, and high availability configurations.

Familiarity with backup and recovery tools specific to each database platform.

Understanding of database security principles and access control mechanisms.

Experience with monitoring tools and performance analysis techniques.

Knowledge of version control systems (Git) for managing database code and scripts.

Experience with database automation, CI/CD pipelines and tooling.

Gigaclear is a growing Fibre Broadband (FTTP / FTTH) company, developing our fibre-to-the-premises broadband infrastructure to some of the most difficult to reach areas of the UK, empowering those communities with broadband to rival any city.

Staff rewards, benefits and opportunities

We foster a collaborative, engaging culture that empowers staff to grow and maximise their skills. We want to challenge our people in a fair environment where hard work is rewarded and a path for progression is open to all.

  • Generous employer pension; up to 8% matched contribution
  • Income protection & life assurance
  • 25 days holiday (plus bank holidays), holiday purchase scheme and Yay Days!
  • Health cash plan, 24/7 remote GP access and Employee Assistance Programme including counselling & legal advice
  • Unlimited access to online training and development content via our Learning Management System
  • Long service benefits and monthly employee recognition
  • Enhanced maternity and paternity provisions
  • Flexible working environment
  • Health & Wellbeing initiatives and company funded social events

Our approach is to work guided by our mission, vision and values.

Our Mission - Empowering communities with brilliant broadband

Our Vision - Connected Communities

Our Values - Own it, Find the Right Way, Work Together, Win Together

SAS Data Engineer - MUST HAVE SC CLEARANCE - Remote and Telford or Hove - 6 months+
Octopus Computer Associates
Shropshire
Fully remote
Mid - Senior
£459/day
RECENTLY POSTED

SAS Data Engineer - MUST HAVE SC CLEARANCE - Remote and Telford or Hove - 6 months+ / Rate: £459 per day inside IR35

One of our Blue Chip Clients is urgently looking for a SAS Data Engineer.

Hybrid role: requires attendance for occasional workshops (typically a couple of days per month) at one of our sites - Telford or Hove

Please find some details below:

Clearance Required: Active SC with a governing body

SAS Data Engineer to support live service within CONNECT ACE. CONNECT is a strategic risking tool that cross-matches one and a half billion internal and third-party data items, enabling the customer to capture up to £25 million in yield per day in recovered tax revenue.
SC clearance is required for this role. Working in a fast-paced environment, you will be an experienced engineer with SAS, Oracle SQL and Unix skills, joining the Blue ACE team to support the live services in resolving incidents and problems, and to support development on projects.

Must have
SAS 9.4 Programming skills
Unix/Linux Skills
Excellent interpersonal skills
Good planning and scheduling capabilities
SC Clearance
Good people management skills
Good understanding of delivery
Team Player
Customer facing skills
Resilience
Agile (Scrum and Kanban)
SAS Viya programming and GitLab knowledge would be advantageous


Additional Requirements:

Please send CV for full details and immediate interviews. We are a preferred supplier to the client.

Data Engineer (Azure)
Hays Technology
London
Remote or hybrid
Mid - Senior
£450/day - £550/day

Your new company

Working for a renowned financial services organisation.

Your new role

Seeking a Data Engineer to help design and maintain scalable batch and near-real-time ingestion pipelines, modernising legacy ETL/ELT processes into Azure and Snowflake, and implementing best-practice patterns such as CDC, incremental loading, schema evolution, and automated ingestion frameworks. They build cloud-native solutions using Azure Data Factory/Synapse, Databricks/Spark, ADLS Gen2, and Snowflake capabilities including stages, file formats, COPY INTO, and Streams/Tasks to support raw-to-curated data modelling. The role involves creating reusable components and Python libraries to accelerate delivery across teams, enforcing data quality through validation, observability, and robust pipeline design, and ensuring strong security, governance, and documentation standards. Collaboration within agile workflows, including CI/CD, code reviews, and iterative planning, is also key to delivering consistent, reliable, and secure data solutions.

What you'll need to succeed

  • Strong hands-on data engineering experience, with a strong focus on data ingestion
  • Experience building production pipelines using Azure Data Factory, Databricks, Synapse
  • Solid SQL skills and experience working with modern cloud data warehouses, ideally Snowflake
  • Proficiency in Python for data processing, automation, and pipeline utilities
  • Good understanding of data lake/lakehouse concepts and ingestion patterns
  • Infrastructure-as-Code exposure (Terraform) and CI/CD (Azure DevOps)
  • Able to prototype quickly while adhering to Group standards and controls
  • Communicates clearly with business stakeholders and technical teams
  • Familiarity with orchestration frameworks (Dagster) - desirable
  • Energy commodity trading experience is a real advantage

What you'll get in return

Flexible working options available.
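
As a rough illustration of the incremental-loading pattern this role works with, here is the high-watermark variant: pull only rows changed since the last successful load, upsert them, and advance the watermark. All column and function names here are illustrative, not from any real schema; in practice the filter would be a SQL predicate or a Snowflake Streams/Tasks consumer rather than in-memory Python:

```python
def incremental_load(source_rows, target, watermark):
    """Load only rows with updated_at newer than the last watermark,
    upsert into the target keyed by id, and return the advanced
    watermark. Re-running with the same watermark and an unchanged
    source is a no-op, which is what makes the pattern retryable."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for r in new_rows:
        target[r["id"]] = r  # upsert by primary key
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return target, new_watermark

source = [
    {"id": 1, "updated_at": 100, "value": "a"},
    {"id": 2, "updated_at": 250, "value": "b"},
]
target, wm = incremental_load(source, {}, watermark=200)
print(wm, sorted(target))  # only id 2 is newer than the watermark
# → 250 [2]
```

This avoids full reloads: each run touches only the delta, and the stored watermark is the sole state the pipeline needs between runs.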
What you need to do now If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)

Frequently asked questions
What types of remote Data Engineer roles does Haystack list?
Haystack features a wide range of remote Data Engineer positions, including roles focused on data pipeline development, ETL processes, data warehousing, cloud data engineering, and real-time data processing across various industries.

How do I apply for a remote Data Engineer job on Haystack?
You can browse remote Data Engineer job listings on Haystack, create a profile, upload your resume, and apply directly through the platform to employers offering remote positions.

Are part-time or contract remote Data Engineer roles available?
Haystack offers a variety of remote Data Engineer roles including full-time, part-time, contract, and freelance opportunities. You can filter job listings by employment type to find the best fit.

What skills are most in demand for remote Data Engineer jobs?
Common skills for remote Data Engineer jobs include proficiency in SQL, Python or Java, experience with cloud platforms like AWS or Azure, knowledge of big data tools such as Hadoop or Spark, and expertise in data modeling and ETL frameworks.

Does Haystack provide resources to help me prepare for interviews?
Yes, Haystack offers interview tips, sample questions, and career advice specifically geared towards Data Engineers seeking remote positions to help you prepare effectively.