
Snowflake Jobs in London

Overview

Looking for Snowflake jobs in London? Discover the latest Snowflake developer, engineer, and architect roles across top tech companies in the UK capital. Whether you’re an experienced Snowflake professional or just starting out, our London-based Snowflake job listings connect you with exciting opportunities to advance your data career. Start your Snowflake job search in London today and land your next role in the heart of the tech industry!
Filters applied: London, Snowflake
Snowflake Data Engineer
Tenth Revolution Group
London
Hybrid
Senior
£85,000 - £100,000
RECENTLY POSTED
snowflake
processing-js
aws
git
kafka
python
+4
Senior Snowflake Data Engineer - Hybrid - £85k-£100k

About the Role
I am looking for an experienced Senior Snowflake Data Engineer to join a dynamic team working on cutting-edge data solutions. This is an exciting opportunity to design, build, and optimise high-performance data pipelines using Snowflake, dbt, and modern engineering practices. If you are passionate about data engineering, test-driven development, and cloud technologies, we’d love to hear from you.

Key Responsibilities
Design, develop, and optimise scalable data pipelines in Snowflake.
Build and maintain dbt models with robust testing and documentation.
Apply test-driven development principles for data quality and schema validation.
Optimise pipelines to reduce processing time and compute costs.
Develop modular, reusable transformations using SQL and Python.
Implement CI/CD pipelines and manage deployments via Git.
Automate workflows using orchestration tools such as Airflow or dbt Cloud.
Configure and optimise Snowflake warehouses for performance and cost efficiency.

Required Skills & Experience
7+ years in data engineering roles.
3+ years hands-on experience with Snowflake.
2+ years production experience with dbt (mandatory).
Advanced SQL and strong Python programming skills.
Experience with Git, CI/CD, and DevOps practices.
Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure).
Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation.

Preferred Qualifications
Snowflake certifications (SnowPro Core or Advanced).
Experience with dbt Cloud and custom macros.
Exposure to real-time streaming (Kafka, Kinesis).
Familiarity with data observability tools and BI integrations (Tableau, Power BI).

On offer
Opportunity to work with modern data technologies and large-scale architectures.
Professional development and certification support.
Collaborative, engineering-focused culture.
Competitive salary and benefits package.

Interested? Apply now with your CV highlighting your Snowflake, dbt, and DevOps experience.
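As a rough illustration of the Airflow and dbt orchestration skills this listing calls for, the sketch below shows a minimal Airflow DAG that runs `dbt build` (models plus tests) against a Snowflake target on a daily schedule. The project path, profiles directory, and target name are placeholders, not details from the employer.

```python
# Hypothetical sketch: a daily Airflow DAG that runs `dbt build`
# (models + tests) against a Snowflake target. The paths and target
# name are placeholders, not real project details.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_build = BashOperator(
        task_id="dbt_build",
        # `dbt build` compiles, runs, and tests models in dependency order.
        bash_command=(
            "cd /opt/airflow/dbt_project && "
            "dbt build --profiles-dir /opt/airflow/dbt_profiles --target prod"
        ),
    )
```

In practice the same DAG would usually carry alerting callbacks and separate run/test tasks, but the shape above is the core of a scheduled dbt deployment.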
Data Engineer (Snowflake and Matillion) - £425PD - Remote
Tenth Revolution Group
City of London
Fully remote
Mid - Senior
£350/day - £425/day
RECENTLY POSTED
snowflake
fabric
aws
git
python
airflow
+4
Data Engineer (Snowflake and Matillion) - £425PD - Remote

About the Role
We are looking for a Data Engineer with strong experience in Snowflake and Matillion to design, build, and maintain scalable data pipelines and analytics-ready data models. You’ll work closely with analytics, product, and business teams to turn raw data into reliable, high-quality datasets that power reporting, dashboards, and advanced analytics. This role is ideal for someone who enjoys working in a modern cloud data stack and takes pride in building clean, performant, and well-documented data solutions.

Key Responsibilities
Design, build, and maintain ELT pipelines using Matillion to ingest data from multiple sources into Snowflake
Develop and optimize data models in Snowflake for analytics and reporting use cases
Ensure data quality, reliability, and performance across pipelines and warehouse workloads
Collaborate with analytics engineers, data analysts, and stakeholders to understand data requirements
Implement best practices for Snowflake (clustering, scaling, cost optimization, security)
Monitor and troubleshoot data pipelines, resolving failures and performance issues
Manage and evolve data transformations using SQL and version control
Document data pipelines, models, and business logic for long-term maintainability
Support CI/CD processes and promote automation across the data platform

Required Qualifications
3+ years of experience as a Data Engineer or in a similar role
Strong hands-on experience with Snowflake (data modeling, performance tuning, security)
Proven experience building pipelines with Matillion
Advanced SQL skills and solid understanding of ELT best practices
Experience working with cloud data architectures (AWS, Azure, or GCP)
Familiarity with version control systems (e.g., Git)
Strong problem-solving skills and attention to detail
Ability to communicate clearly with technical and non-technical stakeholders

Nice to Have
Experience with dbt or other transformation frameworks
Exposure to data orchestration tools (Airflow, etc.)
Understanding of data governance, lineage, and metadata management
Experience supporting BI tools (Power BI, Tableau, Looker, etc.)
Python experience for data tooling or automation
Experience working in an agile or product-driven environment

To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
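For readers less familiar with the Snowflake side of a Matillion-orchestrated ELT pipeline, here is a minimal sketch of the kind of load-and-merge step such a job typically issues, written with the snowflake-connector-python library. The account details, stage, and table names are invented purely for illustration.

```python
# Hypothetical sketch of an ELT load step in Snowflake, of the kind a
# Matillion job would typically orchestrate. Every identifier and
# credential below is a placeholder.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

with conn.cursor() as cur:
    # Load newly arrived files from an external stage into a raw table.
    cur.execute(
        "COPY INTO RAW.ORDERS FROM @RAW.ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    # Upsert the raw rows into the analytics-ready table.
    cur.execute("""
        MERGE INTO ANALYTICS.DIM_ORDERS AS tgt
        USING RAW.ORDERS AS src
          ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET tgt.status = src.status
        WHEN NOT MATCHED THEN INSERT (order_id, status)
          VALUES (src.order_id, src.status)
    """)
conn.close()
```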
Business Analyst - Tax
HAYS
London
Hybrid
Mid - Senior
£600/day - £700/day
RECENTLY POSTED
snowflake
Leading financial services organisation to appoint an experienced Tax Business Analyst

Your new company
Hays have partnered with a leading financial services organisation to appoint an experienced Business Analyst to a complex financial transformation, with initial focus on FATCA/CRS and VAT regulatory change. If you’re a Senior Tax Business Analyst with demonstrated experience in tax transformations and a strong background in FATCA and VAT, then we want to hear from you.
Length: Initial 6 months, likely extension
Location: London
Hybrid: Yes (3 days per week in-office)
Rate: TBC (likely between £600 - £700/day inside via Umbrella)

Your new role and what you’ll need to succeed
You will be an experienced Business Analyst with a proven background in complex financial change transformations (preferably in tax) within large, multi-entity banks or consultancies. You will bring in-depth knowledge of FATCA/CRS and VAT, alongside the following core skills:
Business Analysis & Project Delivery: Proven experience as a Tax Business Analyst within banking or consultancy; full project lifecycle expertise including scope, BRDs/FRDs, functional specs, UAT plans. Project Management skills desirable.
Tax & Financial Knowledge: Strong understanding of FATCA/CRS regimes, UK VAT compliance, accounting principles (multi-GAAP), and global ledger flows. Familiarity with financial products and their tax implications.
Data & Platform Expertise: Skilled in data analysis, BI tools (Power BI), modern platforms (Snowflake, APIs), and Finance systems (Oracle GL/Subledger, Murex, LoanIQ). Ability to map complex processes and enhance data quality.
Technical Proficiency: Advanced Microsoft Excel, PowerPoint, and Word skills for analysis and reporting.
Stakeholder Engagement: Excellent communication and relationship-building skills; ability to simplify complex tax/finance concepts and collaborate across senior stakeholders and technical teams.
Transformation Mindset: Track record in tax process automation and regulatory reporting programmes; proactive risk-and-control approach to standardisation and operational risk reduction.

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
MLOps Tech Lead
Stackstudio Digital Ltd.
London
Hybrid
Senior
£500/day - £525/day
RECENTLY POSTED
processing-js
aws
mongodb
mysql
tensorflow
git
+13
Job Details
Role / Job Title: MLOps Tech Lead
Work Location: London, UK
Office Requirement (Hybrid): 2 days per week

Key Responsibilities (High-Level)
Data Pipeline Development: Lead the technical direction of projects and ensure the use of Sainsbury’s best practices to the best quality.
Data Integration: Lead and provide expertise on integrating data from various sources, ensuring data consistency, integrity, and quality across the entire data lifecycle.
Infrastructure Management: Provide guidance to junior and mid-level Data Engineers on best practices when building and managing data infrastructure, including data lakes, warehouses, and distributed processing systems (e.g., PySpark, Hadoop).

The Role
As a Tech Lead, you will play a critical role in designing, building, and maintaining data pipelines and infrastructure that enable the development and deployment of machine learning models and drive engineering excellence. You will collaborate closely with data scientists, lead ML engineers, and software engineers to ensure data is clean, accessible, and optimised for large-scale processing and analysis.

Your Responsibilities
Data Pipeline Development: Lead the technical direction of projects and ensure the use of Sainsbury’s best practices to the best quality.
Data Integration: Lead and provide expertise on integrating data from various sources, ensuring data consistency, integrity, and quality across the entire data lifecycle.
Infrastructure Management: Provide guidance to junior and mid-level Data Engineers on best practices when building and managing data infrastructure, including data lakes, warehouses, and distributed processing systems (e.g., PySpark, Hadoop).
Data Preparation: Collaborate with data scientists to prepare and transform raw data into formats suitable for machine learning, including feature engineering and data augmentation.
Automation: Implement automation tools and frameworks (CI/CD) to streamline the deployment and monitoring of machine learning models in production.
Performance Optimisation: Optimise data processing workflows and storage solutions to improve performance and reduce costs.
Collaboration: Work closely with cross-functional teams, including data science, engineering, and product management, to deliver data solutions that meet business needs.
Mentorship: Mentor junior and mid-level data engineers, providing technical guidance on best practices and emerging technologies in data engineering and machine learning, and helping to enhance their skills and career growth.
Knowledge Sharing and Empowerment: Promote a culture of knowledge sharing within the engineering teams by organising regular technical workshops, brown bag sessions, and code reviews.
Innovation and Continuous Improvement: Foster a collaborative and inclusive team environment that encourages continuous learning and improvement.

Your Profile
Essential Skills / Knowledge / Experience
Knowledge of machine learning frameworks (e.g., PySpark, PyTorch) and model deployment tools (e.g., MLflow, TensorFlow Serving).
Strong experience with data processing frameworks (e.g., Apache Spark, Flink).
Expertise in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
Hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and their data services (e.g., Snowflake, S3, BigQuery, Redshift).
Experience with containerisation and orchestration tools (e.g., Docker, Kubernetes).
Familiarity with version control systems (e.g., Git) and CI/CD pipelines.

Desirable Skills / Knowledge / Experience
Certifications: AWS Certified Big Data Specialty, Google Professional Data Engineer, or equivalent.
Soft Skills:
Excellent problem-solving and analytical skills.
Strong communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
Ability to work independently and in a team-oriented, collaborative environment.

Leadership and Communication
Strong leadership skills with the ability to inspire and guide the team.
Lead scrum ceremonies as and when needed (stand-up, planning, and grooming sessions).
Excellent verbal and written communication skills, with the ability to articulate complex technical concepts.
Create a safe and inclusive environment where all team members feel that their input is valued and are never dissuaded from speaking up or asking questions.

Collaborative Attitude
Strong team player with a collaborative approach to working with cross-functional teams within the Media Agency.
Open to feedback and willing to provide constructive criticism to others.
Be available for the team, responding within a reasonable time frame; if that is not possible, clearly signpost alternative contacts who can guide.
Build a community across the Media Agency.
Contribute to a positive and inclusive atmosphere within the team.

Knowledge Sharing and Empowerment
Commitment to fostering a learning culture within the team and ensuring knowledge transfer across all levels.
Support and mentor C3 and C4 engineers by providing them opportunities to lead initiatives and contribute to the technical roadmap.
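As a hedged illustration of the model-deployment tooling named in the essential skills (MLflow), the snippet below trains a small scikit-learn model, then logs and registers it. The experiment and registered model names are hypothetical, and model registration assumes the tracking URI points at a backend with the Model Registry enabled.

```python
# Hypothetical sketch: train a small model, then log and register it
# with MLflow so it can be served or promoted through environments.
# Experiment and model names are placeholders; registration assumes a
# tracking backend that supports the Model Registry.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

mlflow.set_experiment("demo-forecasting")

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=50, random_state=0)

with mlflow.start_run():
    model.fit(X, y)
    mlflow.log_param("n_estimators", 50)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Registering the model makes it visible in the MLflow Model Registry.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="demo_classifier",
    )
```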
Senior Data Engineer x1/ Data Engineer x1 (Financial Services)
Hays Technology
London
Remote or hybrid
Senior
£600/day - £800/day
react
aws
mongodb
spring-boot
kubernetes
kafka
+10
Your new company
Working for a renowned commodity, metals, trades and exchange group. You’ll be a key part of the Enterprise Data team helping to replace legacy ETL tools (Informatica) and deliver modern data engineering capabilities. Your work will include managing data pipelines, supporting analysis and visualisation, and collaborating with ETL developers and wider technology teams to deliver solutions aligned with our strategic roadmap. You’ll work across backend, data, and infrastructure engineering, contributing to solution design, implementation, deployment, testing, and support. This is a hands-on role for someone with strong data engineering skills and experience in regulated environments.

Your new role
Design, build, and maintain scalable data pipelines and infrastructure for analytics and integration across data platforms.
Ensure data quality and reliability through automated validation, monitoring, and testing using Python, Java, or Scala.
Develop and manage database architectures, including data lakes and warehouses.
Clean, transform, and validate data to maintain consistency and accuracy.
Collaborate with technical and non-technical teams, providing clear communication on project progress and requirements.
Create and maintain accurate technical documentation.
Support internal data analysis and reporting for business objectives.
Investigate and resolve data-related issues, implementing improvements for stability and performance.
Evaluate and prototype solutions to ensure optimal architecture, cost, and scalability.
Implement best practices in automation, CI/CD, and test-driven development.

What you’ll need to succeed
Strong experience in Data Engineering, with demonstrable lead involvement in at least one production-grade data system within financial services or a similarly regulated industry.
Strong coding skills in Python or Java (Spring Boot); React experience is a plus.
Proficiency with modern data tools: Airflow, Spark, Kafka, dbt, Snowflake or similar.
Experience with cloud platforms (AWS, Azure, GCP), containerization (Docker, Kubernetes), and CI/CD.
Data Quality: Proven ability to validate and govern data pipelines, ensuring data integrity, correctness, and compliance.
Experience working within financial services / highly regulated environments.
Bonus Skills: SQL and RDBMS (PostgreSQL, SQL Server). NoSQL/distributed databases (MongoDB). Streaming pipelines experience.

What you’ll get in return
An exciting opportunity to join an international organisation in financial services. Furthermore, a competitive day rate inside IR35 for this role will be offered, in addition to your own dedicated Hays Consultant to guide you through every step of the application process.

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed).
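The role's emphasis on automated validation and data quality can be pictured with a small Python sketch like the one below, which runs simple pandas-based checks before a dataset is published. The column names and rules are assumptions chosen purely for illustration.

```python
# Hypothetical sketch of lightweight data-quality gates that a pipeline
# stage might run before publishing a dataset. Column names and rules
# are invented for illustration only.
import pandas as pd


def validate_trades(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable validation failures (empty = pass)."""
    failures = []
    if df["trade_id"].duplicated().any():
        failures.append("duplicate trade_id values found")
    if df["notional"].lt(0).any():
        failures.append("negative notional amounts found")
    if df["trade_date"].isna().any():
        failures.append("missing trade_date values found")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "trade_id": [1, 2, 2],
            "notional": [100.0, -5.0, 250.0],
            "trade_date": ["2025-01-02", None, "2025-01-03"],
        }
    )
    problems = validate_trades(sample)
    if problems:
        raise SystemExit("data quality check failed: " + "; ".join(problems))
```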
Azure & Snowflake DevOps Engineer
Tenth Revolution Group
London
Fully remote
Mid - Senior
£400/day
snowflake
fabric
Azure DevOps/Platform Engineer (Contract)
Location: Remote (UK or Nearshore)
Start Date: End of January 2026 (Interviews early January)
Duration: Minimum 6 months
Day Rate: £400/day Outside IR35

About the Role
My client is seeking experienced Azure DevOps/Platform Engineers to join a major transformation programme. The focus is on enabling modern cloud infrastructure and self-service capabilities for BI platforms. Candidates would ideally have worked on Azure DevOps projects with a heavy Snowflake component.

Key Responsibilities
Design and implement Infrastructure as Code (IaC) solutions using Azure DevOps.
Configure and manage networking, security, and containerisation within Azure environments.
Drive observability and monitoring across platforms.
Support creation of self-service capabilities for BI and analytics teams.
Collaborate with stakeholders to ensure robust, scalable, and secure cloud solutions.
Essential Skills
Strong experience with Azure DevOps and related tooling.
Expertise in IaC, networking, security, and containerisation.
Exposure to Snowflake or other modern data platforms.
Solid understanding of monitoring and observability frameworks.
To discuss this role further, please submit your CV or contact Brandon Forbes.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Snowflake DevOps Engineer - Fully Remote - £450/pd
Tenth Revolution Group
City of London
Fully remote
Mid - Senior
£400/day - £450/day
snowflake
fabric
kubernetes
docker
Snowflake DevOps Engineer - Fully Remote - £450/pd (Outside IR35)

Please note - this role is only open to applicants who are based in the UK with the unrestricted right to work in the UK. This organisation is not able to offer sponsorship.

About the Role
We are seeking an experienced Snowflake DevOps Engineer to join our team on a 6-month contract. This is a fully remote position, but candidates must be based in the UK and have unrestricted right to work in the UK. You will play a key role in designing, implementing, and maintaining robust DevOps practices for Snowflake environments, leveraging Azure DevOps for Infrastructure as Code (IaC), networking, security, and containerisation.

Key Responsibilities
Build and maintain automated deployment pipelines for Snowflake using Azure DevOps.
Implement Infrastructure as Code (IaC) for scalable and secure environments.
Ensure best practices in networking and security within cloud-based data platforms.
Support containerisation strategies and integration with Snowflake.
Collaborate with cross-functional teams to deliver high-quality solutions.

Essential Skills & Experience
Proven experience as a DevOps Engineer with Snowflake.
Strong proficiency in Azure DevOps tools and practices.
Expertise in IaC, networking, and security within cloud environments.
Hands-on experience with containerisation technologies (e.g., Docker, Kubernetes).
Excellent problem-solving and communication skills.

Contract Details
Duration: 6 months
Rate: £450/day (Outside IR35)
Location: Fully remote (UK-based only)
Start Date: End of January 2026
Interview Process: Two stages

To apply for this role please submit your CV or contact David Airey on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
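To give a flavour of what automated Snowflake deployment pipelines might involve, here is a hypothetical migration-runner script that a CI/CD step (for example, an Azure DevOps job) could invoke to apply versioned SQL files in order. The tracking table, role, warehouse, and file-naming convention are assumptions, and each migration file is assumed to contain a single statement.

```python
# Hypothetical sketch of a migration runner that a CI/CD pipeline step
# could invoke to apply versioned SQL files to Snowflake in order.
# All object names, roles, and the file layout are placeholders; each
# migration file is assumed to hold exactly one SQL statement.
import os
from pathlib import Path

import snowflake.connector

MIGRATIONS_DIR = Path("migrations")  # e.g. V001__create_schema.sql, V002__add_table.sql

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="DEPLOY_ROLE",
    warehouse="DEPLOY_WH",
    database="ANALYTICS",
)

with conn.cursor() as cur:
    # Track which versions have already been applied in earlier runs.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS ADMIN.MIGRATION_HISTORY "
        "(version STRING, applied_at TIMESTAMP_NTZ)"
    )
    cur.execute("SELECT version FROM ADMIN.MIGRATION_HISTORY")
    applied = {row[0] for row in cur.fetchall()}

    for script in sorted(MIGRATIONS_DIR.glob("V*.sql")):
        version = script.name.split("__")[0]
        if version in applied:
            continue  # already deployed by a previous pipeline run
        cur.execute(script.read_text())
        cur.execute(
            "INSERT INTO ADMIN.MIGRATION_HISTORY VALUES (%s, CURRENT_TIMESTAMP())",
            (version,),
        )
conn.close()
```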
Data Engineer – SC Cleared – Databricks
SR2
London
Remote or hybrid
Mid - Senior
£450/day - £500/day
processing-js
aws
terraform
python
java
azure-databricks
+3
We are seeking a hands-on Data Engineer with deep expertise in building and managing streaming and batch data pipelines. The ideal candidate will have strong experience working with large-scale data systems operating on cloud-based platforms such as AWS and Databricks. This role also involves close collaboration with hyperscalers and data platform vendors to evaluate and document Proofs of Concept (PoCs) for modern data platforms, while effectively engaging with senior stakeholders across the organisation.

Key Responsibilities:
Design, develop, and maintain streaming and batch data pipelines using modern data engineering tools and frameworks.
Work with large volumes of structured and unstructured data, ensuring high performance and scalability.
Collaborate with cloud providers and data platform vendors (e.g., AWS, Microsoft Azure, Databricks) to conduct PoCs for data platform solutions.
Evaluate PoC outcomes and provide comprehensive documentation including architecture, performance benchmarks, and recommendations.

Required Experience & Skills:
Proven experience as a Data Engineer with a strong focus on streaming and batch processing.
Hands-on experience with cloud-based data platforms such as AWS/Databricks.
Strong programming skills in Python, Scala, or Java.
Experience with data modeling, ETL/ELT processes, and data warehousing.
Experience conducting and documenting PoCs with hyperscalers or data platform vendors.

Preferred Qualifications:
Certifications in AWS, Azure, or Databricks.
Experience with Snowflake, IBM DataStage, or other enterprise data tools.
Knowledge of CI/CD pipelines and infrastructure as code (e.g., Terraform, CloudFormation).
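For a sense of the streaming pipeline work described above, below is a minimal, hypothetical PySpark Structured Streaming job of the sort that might run on Databricks, reading events from Kafka and appending them to a Delta table. The broker, topic, schema, and storage paths are placeholders, not details from the employer.

```python
# Hypothetical sketch: a PySpark Structured Streaming job that reads
# events from Kafka, parses the JSON payload, and appends rows to a
# Delta table. Broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    # Kafka delivers the payload as bytes in the `value` column.
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .outputMode("append")
    .start("/mnt/delta/events")
)
query.awaitTermination()
```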

Frequently asked questions

What types of Snowflake jobs are available in London?
London offers a variety of Snowflake roles including Data Engineers, Snowflake Architects, Data Analysts, and Cloud Data Specialists across industries such as finance, retail, and technology.
Do I need specific certifications to apply for Snowflake jobs in London?
While not always mandatory, certifications like SnowPro Core Certification can significantly improve your chances of landing a job. Employers value proven expertise with Snowflake's platform.
Can I find remote or hybrid Snowflake job opportunities in London?
Yes, many companies in London offer remote or hybrid working options for Snowflake-related roles, reflecting the flexible work trends in the IT sector.
What skills should I highlight on my resume for a Snowflake job in London?
Key skills include expertise in Snowflake data warehousing, SQL, cloud platforms (AWS, Azure, Google Cloud), ETL processes, and experience with data modeling and pipeline development.
How can Haystack help me find the right Snowflake job in London?
Haystack specializes in IT job placements and provides tailored job listings, application tips, and market insights to help you connect with leading employers seeking Snowflake talent in London.