
Remote Data Architect Jobs

Overview

Looking for remote Data Architect jobs? Explore top fully remote Data Architect opportunities on Haystack, your trusted IT job board. Find roles that let you design and optimize data systems from anywhere, with leading companies seeking skilled data architects to drive innovation. Start your remote Data Architect career today!
Filters applied
Remote
Data Architect
GCP Data Engineer
Lime Street Recruitment Limited
City of London
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED
tableau
In this role you will design and develop complex data processing modules and reporting using BigQuery and Tableau. You will also work closely with the Infrastructure/Platform Team, who are responsible for architecting and operating the core of the client's Data Analytics platform.
You will:
Work with business teams, data scientists and engineers to design, build, optimise and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery
Work with finance, actuaries, data scientists and engineers to understand how the client can make best use of new internal and external data sources
Work with the client's delivery partners to ensure robustness of the design and engineering of the data model, MI and reporting, so they can support the client's ambitions for growth and scale
Take BAU ownership of data models, reporting and integrations/pipelines
Create frameworks, infrastructure and systems to manage and govern data assets
Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schemas, reporting etc.
Work with the broader engineering community to develop the client's data and MLOps capability infrastructure
Ensure data quality, governance, and compliance with internal and external standards
Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy
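To make the reporting side of this concrete, here is a minimal, hypothetical sketch of the kind of MI aggregation such a pipeline might produce. Python's built-in sqlite3 stands in for BigQuery, and the table and column names are invented for illustration:

```python
import sqlite3

# SQLite used purely as a stand-in warehouse; in the role this would be a
# BigQuery dataset. The "policies" table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (region TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?)",
    [("UK", 1200.0), ("UK", 800.0), ("EU", 500.0)],
)

# A typical reporting aggregation: total premium per region.
report = conn.execute(
    "SELECT region, SUM(premium) FROM policies GROUP BY region ORDER BY region"
).fetchall()
print(report)  # [('EU', 500.0), ('UK', 2000.0)]
```

In practice the same SELECT would run as a scheduled BigQuery job feeding a Tableau data source.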
Microsoft Fabric Architect - SME
IO Associates
UK
Fully remote
Senior - Leader
Private salary
RECENTLY POSTED
fabric
sql
snowflake
dbt
We are supporting a high-profile organisation on a greenfield Microsoft Fabric initiative and are looking for an experienced Fabric Architect to lead the end-to-end platform design. This is a unique opportunity to shape the architecture, governance, and delivery of a brand-new Fabric data platform from the ground up. If you have delivered Microsoft Fabric solutions at an architectural level, defining Lakehouse structures, medallion models, and analytics-ready datasets, this opportunity is for you.
Location: Remote (UK)
Day Rate: Negotiable
Contract: Short-term engagement with potential for extension
Key Responsibilities / Experience Required:
Designing a greenfield Microsoft Fabric platform, including Lakehouse, Bronze, Silver and Gold layers, and reporting structures
Defining data ingestion and transformation architecture across multiple sources such as CRM, CMS, SaaS platforms, and GA4
Designing Fabric Data Factory pipelines and Spark notebooks for scalable ETL/ELT
Establishing data modelling standards for Silver and Gold (Medallion) layers
Preparing Power BI-ready datasets with clear fact and dimension schemas
Providing technical governance guidance, ensuring scalability, maintainability and PII compliance
Advising on best practices for Fabric, Synapse and Databricks integration
Strongly Preferred:
Experience in Azure data services, including ADF, ADLS and Azure SQL
Proven experience integrating web analytics, CRM and CMS data at a platform level
Familiarity with data governance frameworks and enterprise data architecture principles
Nice to Have:
Experience with dbt (Core or Cloud)
Exposure to modern lakehouse ecosystems such as Databricks or Snowflake
Background in nonprofit or charity analytics (optional)
If you have architected Microsoft Fabric solutions from scratch and can define clean, scalable, enterprise-grade platforms, we would love to hear from you. Please reply with your most up-to-date CV.
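The Bronze/Silver/Gold (medallion) layering this role centres on can be sketched very loosely in plain Python; the records, field names, and cleaning rules below are invented for illustration only:

```python
# Hypothetical medallion flow: Bronze = raw as-ingested, Silver = cleaned and
# deduplicated, Gold = analytics-ready aggregate. All data here is made up.
bronze = [
    {"id": 1, "source": "CRM", "amount": "100"},
    {"id": 1, "source": "CRM", "amount": "100"},   # duplicate ingest
    {"id": 2, "source": "GA4", "amount": None},    # bad record
    {"id": 3, "source": "CRM", "amount": "250"},
]

# Silver: drop duplicates and unusable records, fix types.
seen, silver = set(), []
for row in bronze:
    if row["id"] in seen or row["amount"] is None:
        continue
    seen.add(row["id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: aggregate per source, ready for a fact table or Power BI dataset.
gold = {}
for row in silver:
    gold[row["source"]] = gold.get(row["source"], 0.0) + row["amount"]

print(gold)  # {'CRM': 350.0}
```

In Fabric the same shape would typically be Lakehouse tables transformed by Data Factory pipelines or Spark notebooks rather than Python dicts.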
Junior Data Engineer
Pontoon
Bristol
Remote or hybrid
Junior
Private salary
RECENTLY POSTED
airflow
sql
Job Title: Junior Data Engineer
Are you ready to kickstart your career in the exciting world of data engineering? Our client, a leading organisation in the IT & Digital industry, is on the hunt for a passionate Junior Data Engineer to join their dynamic team! If you have a knack for problem-solving and are eager to learn, this is the perfect opportunity for you!
Pay Rate: Competitive (Umbrella)
Duration: 6 months, with a view to extend to 9 months
Working Pattern: Remote (occasional travel to Bristol for training)
Start Date: ASAP
What You Will Do:
As a Junior Data Engineer, you will work closely with a Senior Data Engineer to support a critical application launch. This hands-on role focuses on maintaining data integrity and implementing SQL-based fixes and patches in a fast-paced production environment. Your contributions will be vital in ensuring smooth operations through:
Writing and executing queries for code fixes, patches, and data corrections
Supporting the Senior Data Engineer in maintaining database stability and performance as the application goes live
Troubleshooting and resolving data-related issues promptly
Implementing database patches and updates
Performing data validation and quality checks to ensure accuracy
Documenting technical processes and maintaining clear records of changes made
What We Are Looking For:
To thrive in this role, you should possess a solid foundation in data engineering concepts and technologies. Here is what you need:
Essential Skills:
Proficiency in Microsoft SQL Server (mandatory)
Solid understanding of query writing, optimisation, and debugging
Experience with database maintenance, data fixes, and patch implementation
Ability to work under pressure in a production environment
Good problem-solving skills and attention to detail
Excellent communication skills to work effectively with senior engineers and stakeholders
Desirable Skills:
Experience with Apache Airflow for workflow orchestration
Familiarity with Alteryx for data preparation
Knowledge of REST APIs and integration patterns
Awareness of Agile methodologies
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related technical field
Technology certifications in database administration or data engineering are a plus
Why Join Us?
Our client values creativity, teamwork, and continuous learning. By joining their talented team, you will have the opportunity to:
Work on impactful projects that drive innovation
Collaborate with experienced professionals who are eager to mentor and support you
Enjoy a flexible work environment that promotes work-life balance
Develop your skills in a thriving industry
Ready to Apply?
If you are excited about this opportunity and meet the qualifications, we would love to hear from you! Join our client in shaping the future of data engineering. Apply now!
Please note: if you do not hear back regarding your application within 5 working days, you have unfortunately been unsuccessful on this occasion, but we thank you for your interest. Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
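The fix-and-validate workflow this role describes can be illustrated with a small, self-contained sketch. SQLite stands in for SQL Server here, and the table, the misspelled status, and the check are all invented:

```python
import sqlite3

# Stand-in database; in the role this would be Microsoft SQL Server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "SHIPPED"), (2, "shiped"), (3, "shiped")])

# Data correction: patch a known bad value in production data.
conn.execute("UPDATE orders SET status = 'SHIPPED' WHERE status = 'shiped'")

# Data validation / quality check: confirm no unexpected values remain.
bad = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status NOT IN ('SHIPPED')"
).fetchone()[0]
print(bad)  # 0
```

Documenting each such patch and its validation query is exactly the record-keeping the ad asks for.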
Data Engineer
Datatech
Mansfield
Fully remote
Junior - Mid
£45,000 - £45,000
RECENTLY POSTED
aws
git
python
airflow
sql
snowflake
+1
Data Engineer, Remote, Modern Cloud Data Stack
£45,000 DOE
No sponsorship; post-grad visa not available
A high-visibility opportunity with a values-led organisation modernising its data platform and refreshing its data strategy. You'll be trusted early, work directly with stakeholders across the business, and build the foundations that power better insight, smarter decisions, and real-world impact. This suits someone with 2+ years' experience who wants to step up, take ownership, and grow quickly in a supportive environment. Communication is central here: you'll succeed by translating business questions into robust, trusted data assets, and by bringing people with you on the journey.
What you'll do
• Help shape and deliver a refreshed data strategy and modern intelligence platform
• Build reliable, scalable ELT/ETL pipelines into a cloud data warehouse (Snowflake, Databricks, or similar)
• Develop and optimise core data models and transformations (dimensional, analytics-ready, built to last)
• Create trusted data products that enable self-service analytics across the organisation
• Improve data quality, monitoring, performance, and cost efficiency
• Partner with analysts, BI, and non-technical stakeholders to turn questions into production-grade data assets
• Contribute to standards, best practice, and reusable engineering frameworks
• Support responsible AI tooling, including programmatic LLM workflows where relevant
What you'll bring
• 2+ years' experience in data engineering within a modern cloud stack
• Strong SQL, plus a solid data modelling foundation
• Python preferred (or similar) for pipeline development and automation
• Cloud exposure (AWS, Azure, or GCP)
• Familiarity with orchestration and analytics engineering tools (dbt, Airflow, or similar)
• Strong habits around governance, security, documentation, Git, and CI/CD
What will make you stand out in this business
• Clear, confident communication: you can explain technical choices in plain English
• Strong stakeholder mindset: you ask the right questions and align on outcomes early
• Ownership, curiosity, and a bias for building things properly
Excited? Apply now.
Lead Data Engineer
Cathcart Technology
Edinburgh
Fully remote
Senior
£100,000
RECENTLY POSTED
python
snowflake
I’m partnering with a fast-growing, highly respected analytics and consultancy organisation working at the forefront of the energy, chemicals and low-carbon sectors to hire a Lead Snowflake Data Engineer. Backed by major investors and trusted by global clients, it’s a great time to be joining their team (fully remote; MUST be UK based). This is a rare greenfield leadership opportunity where you’ll define the data strategy, architecture and tooling from day one, while building and mentoring a high-performing data engineering team. You’ll deliver scalable cloud-native pipelines on Snowflake, help unify data across multiple acquired businesses, and develop machine learning solutions that drive real commercial insight. Working closely with senior stakeholders, you’ll turn complex requirements into robust technical solutions and set standards for data quality and governance. With a modern Python-first stack, no legacy constraints and strong business backing, the role offers genuine technical ownership and strategic impact in a fully remote position.
You’ll ideally have most of the following:
Hands-on experience with Snowflake (essential)
Proven experience leading data engineering projects end-to-end
Strong Python background and modern data engineering practices
Deep understanding of ETL/ELT, data modelling and transformations
Practical machine learning experience (desirable)
Strong communication with technical and non-technical teams
You’ll be joining at a pivotal time, with the chance to build a modern data function from the ground up and make a visible, long-term impact on the business. The role offers significant technical autonomy, real influence at senior stakeholder level, and the opportunity to work on meaningful problems connected to the global energy transition.
In return they offer a very competitive salary and strong benefits package, flexible remote working across the UK with occasional travel, and a clear long-term progression path into senior data leadership. It’s genuinely an exciting opportunity to combine hands-on engineering, strategic thinking and team leadership in an environment that actively supports innovation, learning and ambitious technical ownership. If you’re keen to learn more, please apply or drop Matthew MacAlpine at Cathcart Technology a message. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
Lead Data Engineer
Cathcart Technology
Glasgow
Fully remote
Senior
£85,000 - £100,000
RECENTLY POSTED
python
snowflake
I’m partnering with a fast-growing, highly respected analytics and consultancy organisation working at the forefront of the energy, chemicals and low-carbon sectors to hire a Lead Snowflake Data Engineer. Backed by major investors and trusted by global clients, it’s a great time to be joining their team (fully remote; MUST be UK based). This is a rare greenfield leadership opportunity where you’ll define the data strategy, architecture and tooling from day one, while building and mentoring a high-performing data engineering team. You’ll deliver scalable cloud-native pipelines on Snowflake, help unify data across multiple acquired businesses, and develop machine learning solutions that drive real commercial insight. Working closely with senior stakeholders, you’ll turn complex requirements into robust technical solutions and set standards for data quality and governance. With a modern Python-first stack, no legacy constraints and strong business backing, the role offers genuine technical ownership and strategic impact in a fully remote position.
You’ll ideally have most of the following:
Hands-on experience with Snowflake (essential)
Proven experience leading data engineering projects end-to-end
Strong Python background and modern data engineering practices
Deep understanding of ETL/ELT, data modelling and transformations
Practical machine learning experience (desirable)
Strong communication with technical and non-technical teams
You’ll be joining at a pivotal time, with the chance to build a modern data function from the ground up and make a visible, long-term impact on the business. The role offers significant technical autonomy, real influence at senior stakeholder level, and the opportunity to work on meaningful problems connected to the global energy transition.
In return they offer a very competitive salary and strong benefits package, flexible remote working across the UK with occasional travel, and a clear long-term progression path into senior data leadership. It’s genuinely an exciting opportunity to combine hands-on engineering, strategic thinking and team leadership in an environment that actively supports innovation, learning and ambitious technical ownership. If you’re keen to learn more, please apply or drop Matthew MacAlpine at Cathcart Technology a message. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
Data Consultant
Hays Technology
Cardiff
Fully remote
Mid - Senior
£400/day
RECENTLY POSTED
powerbi
sql
Job Details
£400 per day
Outside IR35
Remote role for a client based in Wales
6-month contract
Essential
Enhanced DBS check will be undertaken prior to the commencement of the contract.
Available to start within 2 weeks
Ability to work independently on a technical project, and to take initiative.
Skills Our client is looking for a Data Consultant to support them with consolidating and managing their data across various platforms, as well as generating PowerBI reports and dashboards. Our client is looking for a candidate with experience in:
Extensive experience in PowerBI
Strong skills working in Excel and SQL
Understanding of both HR & Finance data systems
Responsibilities
Building PowerBI dashboards and reports for multiple databases, including Excel.
Data cleaning and increasing data quality across different HR and Finance related systems.
Communicating with senior stakeholders and Operations Directors to gather business requirements.
What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)
Databricks Implementation Consultant
Robert Half Limited
London
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED
sql
We are seeking a Databricks Implementation Consultant / Architect for an initial 4-6 week contract. The primary objective of this role is to assess and recommend the optimal cloud platform (Azure or Google Cloud Platform) for implementing Databricks. This will involve a detailed evaluation of both platforms against analytical requirements, cost models, security, compliance, and integration considerations, culminating in a clear recommendation for senior stakeholders.
Role Overview
You will lead a comprehensive Cloud & Data Platform Assessment focused on defining a future-ready architecture and positioning Databricks as the core analytics layer. This role requires deep expertise in Databricks implementations across Azure and Google Cloud Platform (GCP).
Cloud & Data Platform Assessment:
Review the existing data landscape, including CRM data in SQL Server, external datasets, geospatial sources, and BI consumption.
Assess how shortlisted cloud platforms (Azure vs GCP) meet analytical requirements: large-scale data volumes (20M+ rows), mixed data types, NLP/AI workloads, geospatial analytics, and daily refresh needs.
Compare Azure and GCP for architecture, security, Databricks integration, and compliance with data handling standards.
Produce workload-based cost models for storage, compute, pipelines, and AI/NLP workloads.
Define target architectures for each option, including high-level diagrams.
Evaluate implementation complexity, operational overhead, and scalability.
Required Experience:
Proven track record of end-to-end Databricks implementations in Azure and GCP.
Strong expertise in lakehouse architecture, data engineering, and analytics frameworks.
Hands-on experience as a Data Architect or Implementation Consultant.
Delivery of AI and NLP workloads using cloud-native services.
Ability to produce technical diagrams, cost models, and senior-level recommendations.
Background in security, governance, and compliance design.
Conducting architecture comparisons between Azure and GCP, including evaluating Databricks compatibility.
Developing costed platform models aligned to specific workloads and Databricks usage.
Performing security and compliance assessments for Databricks workloads.
Identifying integration dependencies and risks across cloud platforms, Databricks, and existing systems.
Producing clear recommendation papers for senior stakeholders, outlining preferred cloud platform and supporting rationale.
Contract Details:
Fully Remote Engagement
Initial 4-6 week contract
Competitive Day Rate - Outside IR35
Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data: roberthalf.com/gb/en/privacy-notice.
Lead Data Platform Engineer
ARC IT Recruitment
Not Specified
Remote or hybrid
Senior
£80,000 - £90,000
RECENTLY POSTED
python
sql
A fast-growing financial technology business is expanding its data capability and is looking for an experienced data professional to help shape and scale its analytics platform. This role is ideal for someone who enjoys building robust data foundations and wants greater exposure to solution design and platform architecture. You’ll be instrumental in evolving a cloud-based data environment, enabling reliable reporting, analytics, and downstream decision-making across the organisation. Working alongside product teams, engineers, and business users, you’ll translate complex requirements into resilient, high-performing data solutions.
Core responsibilities
Own the reliability and performance of data platforms used for insight and reporting
Partner with technical and non-technical stakeholders to define data requirements and delivery outcomes
Create and manage automated data pipelines to ensure timely, accurate data availability
Develop and maintain database objects, schemas, and data models
Diagnose data quality issues and implement long-term fixes
Apply best practices around security, governance, and deployment
What we’re looking for
Commercial experience in data engineering roles (mid-senior level)
Hands-on expertise with Azure data services such as Data Factory, Synapse and Azure SQL
Experience using Databricks and Python in data processing workflows
Strong SQL skills and practical knowledge of dimensional modelling and warehouse design
Familiarity with CI/CD approaches for data and analytics solutions
Confident communicator with the ability to work across teams
This role is UK-based, offers a high degree of remote working, and comes with a salary of £80,000 to £90,000 depending on experience.
Site Masterdata Expert - FMCG Sector - Oracle
Randstad Digital
UK
Remote or hybrid
Mid - Senior
£450/day - £550/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE
The Site Master Data Expert (MDE) is responsible for the accurate creation, maintenance, and governance of site-level master data to support manufacturing and packaging operations. The role acts as a subject-matter expert for Master Data processes, tools, and standards, ensuring Right-First-Time (RFT) data execution and driving continuous improvement across the site. This is the site-level Master Data specialist who sits between factory operations, packaging, and systems, ensuring product, packaging, and BOM data is created accurately and on time to support manufacturing and project delivery. This is not a pure IT role and not a generic admin role: it is a process-driven operational data role closely tied to manufacturing, packaging, and artwork workflows.
Essential Skills
Master Data Management (MDM)
ERP systems (SAP background highly relevant, even if not named explicitly)
BOMs (Bills of Materials)
Product / Packaging Master Data
Artwork / Packaging workflow
Strong attention to detail, structured ways of working and a governance mindset
Digital systems experience (ERP / Master Data tools)
Strong problem-solving skills
Comfort working with multiple stakeholders (R&D, packaging, projects)
Ability to work independently within structured processes
Fluent English (written and spoken); additional languages beneficial
Degree preferred (University or University of Applied Sciences)
Desirable Skills
Veritas (Oracle-based system)
MDG-M (SAP Master Data Governance)
Atlas
Key Duties
Creates and maintains product & packaging master data
Owns data accuracy (Right-First-Time)
Maintains Bills of Materials (BOMs)
Supports projects by ensuring master data is ready on time
Trains stakeholders on how to use master data correctly
Improves data processes and standards over time
Ideal previous sector / role experience
FMCG / CPG manufacturing environments
Food, Petcare, Pharma, or regulated manufacturing
Roles such as:
Master Data Analyst / Specialist
Product Master Data Coordinator
Packaging Data Specialist
ERP / SAP Master Data roles
Manufacturing Data / Operations Data roles
This is a long-term contract role which can be worked remotely for a global client. I have interview slots ready to be filled, so don’t delay and apply ASAP. Randstad Technologies is acting as an Employment Business in relation to this vacancy.
Senior Data Engineer
Tenth Revolution Group
Multiple locations
Fully remote
Senior
£60,000 - £65,000
RECENTLY POSTED
fabric
python
sql
pyspark
About the Role
We are looking for a Senior Data Engineer to join a leading Microsoft partner that is modernising data platforms and delivering innovative analytics solutions for organisations across the UK. You will work closely with clients to understand their business challenges before designing tailored solutions that improve efficiency, drive self-service reporting and support long-term scalability. This is a hands-on role where you will support clients from a variety of different sectors. You will also be able to supplement this hands-on experience with the opportunity to gain Microsoft-focused certifications and accreditations.
Responsibilities
Build and manage data pipelines using Azure Synapse, Data Factory, Databricks or Microsoft Fabric
Design, implement and maintain data lakes, data warehouses and ETL/ELT processes
Develop scalable data models for reporting in Power BI
Work closely with stakeholders to understand business needs and advise on solutions that best fit the individual needs of the business
Skills and Experience
Hands-on experience with Azure services such as Synapse, Data Factory or Databricks
Strong SQL skills
Proficiency in Python and/or PySpark
Experience with Power BI and data modelling
What’s on offer
Salary up to £65,000
Fully remote working from anywhere in the UK
Performance-related bonus scheme
Pension scheme and private healthcare options
This is just a brief overview of the role. For the full details, simply apply with your CV and we’ll be in touch to discuss it further. Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities nationwide than any other recruitment agency. We are proud sponsors of SQLBits, Power Platform World Tour and the London Fabric User Group.
Data Engineer
DTG Global
Enfield
Remote or hybrid
Junior - Mid
£50,000 - £60,000
RECENTLY POSTED
fabric
python
sql
Data Engineer needed for a UK organisation investing in its data platform to support the development of reliable, well-structured datasets for reporting and analytics. This role sits within a central data team and focuses on building, maintaining and improving data pipelines and data models that support consistent access to data across the business.
The role
Build and maintain data pipelines to ingest, transform and deliver data for analytical use
Ensure data is processed efficiently and securely, in line with agreed standards
Implement monitoring, logging and error handling across data workflows
Create and maintain tests to validate data accuracy and pipeline behaviour
Monitor performance and apply optimisation techniques where required
Design and maintain relational and dimensional data models
Write and optimise SQL and transformation logic
Produce clear technical documentation covering data structures and processes
Work with colleagues across data, analytics and governance to support data quality and compliance
What we’re looking for
Experience in data engineering or data platform development
Strong SQL skills with experience working on production data pipelines
Understanding of data modelling concepts for analytics and reporting
Familiarity with layered or staged data architectures
Knowledge of data quality, validation and governance practices
Strong analytical and problem-solving skills
Desirable experience
Cloud data platforms such as Microsoft Fabric or Azure data services
Python or Spark for data transformation and automation
Experience writing tests for data pipelines
Ability to communicate technical concepts clearly to non-technical stakeholders
Experience working in regulated or data-sensitive environments
If you’re a Data Engineer looking for a role focused on building robust data foundations rather than ad-hoc reporting, we’d be happy to discuss this further in confidence.
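Tests that validate data accuracy and pipeline behaviour, as this role calls for, often amount to plain assertions over a transform's output. A minimal sketch, with the transform and field names invented for illustration:

```python
# Hypothetical pipeline step: normalise a raw feed by stripping whitespace
# from keys, coercing values to floats, and dropping rows with empty keys.
def transform(rows):
    return [
        {"key": r["key"].strip(), "value": float(r["value"])}
        for r in rows
        if r.get("key", "").strip()
    ]

raw = [{"key": " a ", "value": "1.5"}, {"key": "", "value": "9"}]
out = transform(raw)

# The kind of checks a pipeline test suite might run on every load:
assert all(row["key"] == row["key"].strip() for row in out)  # keys are clean
assert len(out) == 1                                         # bad row dropped
print(out)  # [{'key': 'a', 'value': 1.5}]
```

The same assertions can run in CI against fixture data and in production as post-load data-quality checks.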
Analytics Engineer (Telecoms) x2
Hays Technology
London
Remote or hybrid
Mid - Senior
£544/day - £725/day
RECENTLY POSTED
terraform
sql
microsoft-azure
tableau
dbt
Your new company
Working for a renowned British telecoms organisation.
Your new role
We are seeking 2x Analytics Engineers to join our team at a leading telecoms organisation. This role focuses on transforming raw data into clean, analytics-ready datasets and bridging the gap between engineering and analytics, owning the data transformation processes that feed end-user outputs. You will work on optimising those transformation processes, ensuring accuracy and scalability, and improving performance and resource usage for large-scale data processing.
What you’ll need to succeed
Experience working as an Analytics Engineer, or in a similar role that sits between a hands-on Data Analyst and Data Engineer.
Proven experience in complex data process migration projects.
Hands-on experience working with large-scale data environments.
Spark optimization experience
Strong experience with Microsoft Azure cloud-based platform.
Expertise in setting up and managing data pipelines.
SQL and data modeling expertise.
Familiarity with dbt or similar data transformation tools.
Knowledge of orchestration and optimization techniques for data workflows.
Experience with Infrastructure as Code (Terraform) for cloud deployments.
Familiarity with Tableau, including setting up and maintaining Tableau Cloud solutions.
Demonstrated ability in developing, testing, and deploying complex data models and methodologies.
What you’ll get in return
Flexible working options available.
What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)
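A dbt model, like those this role works with, is ultimately a SQL SELECT materialised as a view or table. A rough sketch of that idea using Python's sqlite3 and made-up table names (in dbt this SELECT would live in its own `.sql` model file):

```python
import sqlite3

# Stand-in warehouse; in the role this would be an Azure-hosted platform.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, event_ts TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("u1", "2024-01-01"), ("u1", "2024-01-02"), ("u2", "2024-01-01")],
)

# A dbt-style model: daily active users, materialised here as a view.
conn.execute("""
    CREATE VIEW stg_daily_active_users AS
    SELECT event_ts AS day, COUNT(DISTINCT user_id) AS active_users
    FROM raw_events GROUP BY event_ts
""")
dau = conn.execute(
    "SELECT * FROM stg_daily_active_users ORDER BY day"
).fetchall()
print(dau)  # [('2024-01-01', 2), ('2024-01-02', 1)]
```

Downstream tools like Tableau would then read from the materialised model rather than from raw_events.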
Data Engineer (Snowflake and Matillion) - £425PD - Remote
Tenth Revolution Group
City of London
Fully remote
Mid - Senior
£350/day - £425/day
RECENTLY POSTED
snowflake
fabric
aws
git
python
airflow
+4
Data Engineer (Snowflake and Matillion) - £425PD - Remote
About the Role
We are looking for a Data Engineer with strong experience in Snowflake and Matillion to design, build, and maintain scalable data pipelines and analytics-ready data models. You’ll work closely with analytics, product, and business teams to turn raw data into reliable, high-quality datasets that power reporting, dashboards, and advanced analytics. This role is ideal for someone who enjoys working in a modern cloud data stack and takes pride in building clean, performant, and well-documented data solutions.
Key Responsibilities
Design, build, and maintain ELT pipelines using Matillion to ingest data from multiple sources into Snowflake
Develop and optimize data models in Snowflake for analytics and reporting use cases
Ensure data quality, reliability, and performance across pipelines and warehouse workloads
Collaborate with analytics engineers, data analysts, and stakeholders to understand data requirements
Implement best practices for Snowflake (clustering, scaling, cost optimization, security)
Monitor and troubleshoot data pipelines, resolving failures and performance issues
Manage and evolve data transformations using SQL and version control
Document data pipelines, models, and business logic for long-term maintainability
Support CI/CD processes and promote automation across the data platform
Required Qualifications
3+ years of experience as a Data Engineer or in a similar role
Strong hands-on experience with Snowflake (data modeling, performance tuning, security)
Proven experience building pipelines with Matillion
Advanced SQL skills and solid understanding of ELT best practices
Experience working with cloud data architectures (AWS, Azure, or GCP)
Familiarity with version control systems (e.g., Git)
Strong problem-solving skills and attention to detail
Ability to communicate clearly with technical and non-technical stakeholders
Nice to Have
Experience with dbt or other transformation frameworks
Exposure to data orchestration tools (Airflow, etc.)
Understanding of data governance, lineage, and metadata management
Experience supporting BI tools (Power BI, Tableau, Looker, etc.)
Python experience for data tooling or automation
Experience working in an agile or product-driven environment
To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
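The "ELT pipelines into Snowflake" responsibility above centres on idempotent loads: staged rows are merged into a target table so that re-running a batch does not duplicate data. The sketch below illustrates that merge (upsert) semantics in plain Python rather than Snowflake SQL; all table and column names are invented for illustration, and in practice this would be a single MERGE statement generated by Matillion.

```python
# Minimal sketch of the "merge" (upsert) step an ELT load performs when
# promoting staged rows into a warehouse table. Names are hypothetical.

def merge_rows(target, staged, key="id"):
    """Upsert staged rows into target, matching on `key`.

    Re-running the same staged batch leaves the result unchanged,
    which is what makes the load idempotent.
    """
    by_key = {row[key]: row for row in target}
    for row in staged:
        by_key[row[key]] = row  # update if matched, insert if not
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
staged = [{"id": 2, "amount": 25}, {"id": 3, "amount": 30}]

merged = merge_rows(target, staged)
# id 2 is updated, id 3 is inserted, id 1 is untouched
```

Idempotency is the property worth testing in a real pipeline: applying the same staged batch twice should be a no-op the second time.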
Azure Data Architect
Morgan Hunt Group Limited
Inverness
Fully remote
Senior - Leader
£450/day
RECENTLY POSTED
fabric
microsoft-azure
Morgan Hunt are working with a public sector organisation to recruit an Azure Data Architect on a 6-month contract. The role is fully remote, with no requirement to travel to the organisation's offices. They are looking for someone to lead the design and delivery of modern, scalable data platforms in Microsoft Azure.
You will be responsible for defining end-to-end data architectures and guiding teams through successful implementations across cloud data platforms. You will work closely with stakeholders, engineers, and analysts to ensure data solutions are aligned with the organisation's goals and best practices.
Key Responsibilities:
Design and implement enterprise-grade data architectures using Microsoft Fabric
Architect and optimise data lakes for analytics, reporting, and advanced workloads
Lead data governance, cataloguing, and compliance initiatives using Microsoft Purview
Own and oversee data platform implementations from concept through delivery
Define standards, patterns, and best practices for Azure data solutions
Collaborate with engineering and business teams to translate requirements into scalable designs
Skills & Experience
Proven experience as a Data Architect in Azure environments
Strong hands-on knowledge of Microsoft Fabric and modern data architectures
Experience designing and implementing data lakes (e.g. medallion architectures)
Practical experience with Microsoft Purview for data governance and lineage
Track record of delivering successful data platform implementations
Excellent communication and stakeholder management skills
£450 per day, outside IR35
Fully remote
6-month contract
Please get in touch for further information.
Morgan Hunt is a multi-award-winning recruitment business for interim, contract and temporary recruitment and acts as an Employment Agency in relation to permanent vacancies. Morgan Hunt is an equal opportunities employer. Job suitability is assessed on merit in accordance with the individual’s skills, qualifications and abilities to perform the relevant duties required in a particular role.
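The "medallion architecture" named in the data-lake requirement above organises a lake into bronze (raw), silver (cleaned), and gold (analytics-ready) layers. The sketch below shows that layering in plain Python rather than a lakehouse engine such as Fabric or Databricks; every field name and record is invented for illustration.

```python
# Hedged sketch of medallion (bronze/silver/gold) layering.
# Real implementations run these steps as lakehouse transformations;
# here plain Python stands in, with hypothetical field names.

def to_silver(bronze):
    """Clean raw bronze records: drop rows missing an id, normalise types."""
    return [
        {"id": r["id"], "region": r["region"].strip().lower(), "amount": float(r["amount"])}
        for r in bronze
        if r.get("id") is not None
    ]

def to_gold(silver):
    """Aggregate silver into an analytics-ready total per region."""
    totals = {}
    for r in silver:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

bronze = [
    {"id": 1, "region": " North ", "amount": "10.5"},
    {"id": 2, "region": "south", "amount": "4.5"},
    {"id": None, "region": "north", "amount": "99"},  # bad row, filtered out in silver
]
gold = to_gold(to_silver(bronze))
```

The design point the pattern enforces is that raw data is never mutated: bronze stays as landed, and each downstream layer is reproducible from the one before it.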
Databricks Implementation Consultant
Robert Half
London
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED
sql
We are seeking a Databricks Implementation Consultant/Architect for an initial 4-6 week contract. The primary objective of this role is to assess and recommend the optimal cloud platform (Azure or Google Cloud Platform) for implementing Databricks. This will involve a detailed evaluation of both platforms against analytical requirements, cost models, security, compliance, and integration considerations, culminating in a clear recommendation for senior stakeholders.
Role Overview
You will lead a comprehensive Cloud & Data Platform Assessment focused on defining a future-ready architecture and positioning Databricks as the core analytics layer. This role requires deep expertise in Databricks implementations across Azure and Google Cloud Platform (GCP).
Cloud & Data Platform Assessment:
Review the existing data landscape, including CRM data in SQL Server, external datasets, geospatial sources, and BI consumption.
Assess how shortlisted cloud platforms (Azure vs GCP) meet analytical requirements: large-scale data volumes (20M+ rows), mixed data types, NLP/AI workloads, geospatial analytics, and daily refresh needs.
Compare Azure and GCP for architecture, security, Databricks integration, and compliance with data handling standards.
Produce workload-based cost models for storage, compute, pipelines, and AI/NLP workloads.
Define target architectures for each option, including high-level diagrams.
Evaluate implementation complexity, operational overhead, and scalability.
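The "workload-based cost models" deliverable above usually boils down to multiplying each workload's storage and compute footprint by per-platform unit rates and comparing the totals. The sketch below shows that shape of calculation; the rates are placeholders, not real Azure or GCP pricing, and a real model would also cover pipelines, egress, and AI/NLP workloads.

```python
# Illustrative workload-based cost comparison of the kind the assessment
# produces. RATES are invented placeholder figures, not actual pricing.

RATES = {
    "azure": {"storage_gb_month": 0.020, "compute_hour": 0.55},
    "gcp":   {"storage_gb_month": 0.023, "compute_hour": 0.50},
}

def monthly_cost(platform, storage_gb, compute_hours):
    """Estimated monthly cost for one workload on one platform."""
    r = RATES[platform]
    return round(storage_gb * r["storage_gb_month"] + compute_hours * r["compute_hour"], 2)

# Hypothetical workload: 5 TB of storage, 400 cluster-hours per month.
for platform in RATES:
    print(platform, monthly_cost(platform, 5_000, 400))
```

Keeping the rates in one table makes the recommendation paper easy to re-run when pricing assumptions change during stakeholder review.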
Required Experience:
Proven track record of end-to-end Databricks implementations in Azure and GCP.
Strong expertise in lakehouse architecture, data engineering, and analytics frameworks.
Hands-on experience as a Data Architect or Implementation Consultant.
Delivery of AI and NLP workloads using cloud-native services.
Ability to produce technical diagrams, cost models, and senior-level recommendations.
Background in security, governance, and compliance design.
Conducting architecture comparisons between Azure and GCP, including evaluating Databricks compatibility.
Developing costed platform models aligned to specific workloads and Databricks usage.
Performing security and compliance assessments for Databricks workloads.
Identifying integration dependencies and risks across cloud platforms, Databricks, and existing systems.
Producing clear recommendation papers for senior stakeholders, outlining preferred cloud platform and supporting rationale.
Contract Details:
Fully Remote Engagement
Initial 4-6 week contract
Competitive Day Rate - Outside IR35
Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data:
Data Engineer
Tenth Revolution Group
London
Fully remote
Mid
£50,000 - £65,000
fabric
python
sql
Data Engineer - Azure | Databricks / Fabric - Remote - £50k-£65k
I’m currently supporting one of the UK’s fastest-growing Microsoft data consultancies as they continue to scale their team. If you’re a Data Engineer who wants to work on modern Azure projects, next‑gen lakehouse architectures, and large-scale transformation programmes, this one is worth a look. This consultancy is known for investing heavily in their people, offering real progression pathways, and giving engineers the chance to take ownership of meaningful, enterprise-grade work. If you’re looking for a role where you can grow quickly and make a real impact, this could be a great fit.
What You’ll Be Doing
In this role, you’ll be central to designing and delivering modern data solutions for a range of clients:
Architect and deliver scalable data solutions using Databricks, Synapse, and Microsoft Fabric
Build and optimise ETL/ELT pipelines and high-quality data models using SQL & Python
Develop Power BI dashboards that support insight-led decision-making
Implement data lakes and medallion lakehouse architectures
Apply strong standards around data quality, governance, and security
Work collaboratively in Agile, cross-functional teams
Support major cloud migration and modernisation initiatives
What’s In It for You
High-Growth Environment: You’ll work with cutting-edge Microsoft technologies on impactful projects across multiple industries.
Career Development: The business actively funds certifications, structured training programmes, and clear progression opportunities.
Fully Remote: Work from anywhere in the UK; travel is only required occasionally and is covered by the company.
You’ll be a great fit if you have:
Strong experience with Azure Synapse, Databricks, or Microsoft Fabric
Solid SQL & Python skills for ETL/ELT development
Experience working with data lakes and large datasets
A good understanding of BI, data warehousing, and modern data architectures
Interested?
This team is one of the most in-demand in the Microsoft data space, and roles with them don’t stay open for long. If you’d like to explore the opportunity, apply now.
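The "strong standards around data quality" responsibility in the role above is commonly implemented as a gate of automated checks that a pipeline runs before publishing a dataset. The sketch below shows a minimal version in plain Python; the rule names are hypothetical, and real stacks often use a framework for this instead.

```python
# Minimal sketch of a data-quality gate a pipeline might run before
# publishing a table. Rule names and thresholds are invented.

def quality_checks(rows):
    """Return the names of failed checks; an empty list means safe to publish."""
    failures = []
    if not rows:
        failures.append("non_empty")
    ids = [r.get("id") for r in rows]
    if any(i is None for i in ids):
        failures.append("no_null_ids")
    if len(ids) != len(set(ids)):
        failures.append("unique_ids")
    return failures

good = [{"id": 1}, {"id": 2}]
bad = [{"id": 1}, {"id": 1}, {"id": None}]
```

Wiring the gate into orchestration (e.g. failing an Airflow task when the list is non-empty) is what turns these checks into the "monitor and troubleshoot" safety net the ads describe.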

Frequently asked questions

What types of remote Data Architect jobs are available on Haystack?
Haystack features a wide range of remote Data Architect positions across various industries including finance, healthcare, technology, and e-commerce. These roles vary from designing data frameworks to managing big data solutions and cloud architectures.
How do I apply for a remote Data Architect position on Haystack?
To apply, simply create an account on Haystack, upload your updated resume, and submit your application through the job listing. Some employers may request additional assessments or portfolio examples, which will be specified in the job description.
Are remote Data Architect jobs on Haystack open to international candidates?
Many remote Data Architect roles on Haystack are open to international applicants, but eligibility depends on the employer’s specific requirements, such as working hours alignment and legal work authorization. Each job posting clearly states these details.
What skills and qualifications are typically required for remote Data Architect jobs?
Commonly required skills include expertise in data modeling, database management, cloud platforms (AWS, Azure, GCP), ETL processes, and programming languages like SQL and Python. Professional experience and certifications in data architecture or related fields are often preferred.
Can I set up job alerts for remote Data Architect positions on Haystack?
Yes, you can create personalized job alerts by specifying your job preferences, such as role, location (remote), and experience level. You will receive email notifications when new remote Data Architect positions matching your criteria are posted.