
DBT Jobs in London

Overview

Looking for DBT jobs in London? Discover the latest opportunities for Data Build Tool (DBT) professionals on Haystack. Whether you're a DBT developer, analyst, or data engineer, find your next role in London’s thriving tech scene. Start your DBT career search with us today!
Filters applied
London
DBT
Snowflake Data Engineer
Tenth Revolution Group
London
Hybrid
Senior
£85,000 - £100,000
RECENTLY POSTED
snowflake
processing-js
aws
git
kafka
python
+4
Senior Snowflake Data Engineer - Hybrid - £85k-£100k

About the Role
I am looking for an experienced Senior Snowflake Data Engineer to join a dynamic team working on cutting-edge data solutions. This is an exciting opportunity to design, build, and optimise high-performance data pipelines using Snowflake, dbt, and modern engineering practices. If you are passionate about data engineering, test-driven development, and cloud technologies, we’d love to hear from you.

Key Responsibilities
Design, develop, and optimise scalable data pipelines in Snowflake.
Build and maintain dbt models with robust testing and documentation.
Apply test-driven development principles for data quality and schema validation.
Optimise pipelines to reduce processing time and compute costs.
Develop modular, reusable transformations using SQL and Python.
Implement CI/CD pipelines and manage deployments via Git.
Automate workflows using orchestration tools such as Airflow or dbt Cloud.
Configure and optimise Snowflake warehouses for performance and cost efficiency.

Required Skills & Experience
7+ years in data engineering roles.
3+ years hands-on experience with Snowflake.
2+ years production experience with dbt (mandatory).
Advanced SQL and strong Python programming skills.
Experience with Git, CI/CD, and DevOps practices.
Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure).
Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation.

Preferred Qualifications
Snowflake certifications (SnowPro Core or Advanced).
Experience with dbt Cloud and custom macros.
Exposure to real-time streaming (Kafka, Kinesis).
Familiarity with data observability tools and BI integrations (Tableau, Power BI).

On offer
Opportunity to work with modern data technologies and large-scale architectures.
Professional development and certification support.
Collaborative, engineering-focused culture.
Competitive salary and benefits package.

Interested?
Apply now with your CV highlighting your Snowflake, DBT and DevOps experience
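The dbt testing workflow this listing emphasises can be sketched outside dbt itself. Below is a hedged Python illustration of the two most common dbt schema tests (not_null and unique) applied to a hypothetical orders table; nothing here is dbt's actual implementation, and all table and column names are invented for illustration.

```python
# Illustrative sketch (not dbt itself): the kind of schema tests dbt
# applies to a model, here as plain functions over rows-as-dicts.

def check_not_null(rows, column):
    """Return rows where `column` is missing; a not_null test fails on any hit."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical model output with two deliberate quality problems.
orders = [
    {"order_id": 1, "customer_id": "a"},
    {"order_id": 2, "customer_id": "b"},
    {"order_id": 2, "customer_id": None},  # duplicate id and null customer
]

null_failures = check_not_null(orders, "customer_id")
dupe_failures = check_unique(orders, "order_id")
```

In dbt proper these checks would be declared in a model's YAML file and run with `dbt test`, but the pass/fail logic is essentially the above.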
Analytics Engineer (Telecoms) x2
Hays Technology
London
Remote or hybrid
Mid - Senior
£544/day - £725/day
RECENTLY POSTED
processing-js
terraform
sql
microsoft-azure
tableau
dbt
Your new company
Working for a renowned British telecoms organisation.

Your new role
We are seeking 2x Analytics Engineers to join our team at a leading telecoms organisation. This role focuses on transforming raw data into clean, analytics-ready datasets, bridging the gap between engineering and analytics. You will work on data transformation processes feeding end-user outputs, optimisation, and ensuring accuracy, scalability, and efficient resource usage for large-scale data processing.

What you’ll need to succeed
Experience as an Analytics Engineer or in a similar role that sits between hands-on Data Analyst and Data Engineer work.
Proven experience in complex data process migration projects.
Hands-on experience working with large-scale data environments.
Spark optimisation experience.
Strong experience with Microsoft Azure cloud-based platform.
Expertise in setting up and managing data pipelines.
SQL and data modeling expertise.
Familiarity with dbt or similar data transformation tools.
Knowledge of orchestration and optimization techniques for data workflows.
Experience with Infrastructure as Code (Terraform) for cloud deployments.
Familiarity with Tableau, including setting up and maintaining Tableau Cloud solutions.
Demonstrated ability in developing, testing, and deploying complex data models and methodologies.
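The "raw data into clean, analytics-ready datasets" work the requirements above describe can be sketched in plain Python. This is an illustrative assumption, not the organisation's actual pipeline; the field names (msisdn, started_at, duration_s) are hypothetical stand-ins for telecoms call records.

```python
from datetime import datetime

def transform(raw_records):
    """Normalise raw call records into a clean, typed, analytics-ready table."""
    clean = []
    for rec in raw_records:
        if rec.get("duration_s") is None:
            continue  # drop incomplete rows rather than propagate nulls downstream
        clean.append({
            "msisdn": rec["msisdn"].strip(),          # trim stray whitespace
            "day": datetime.fromisoformat(rec["started_at"]).date().isoformat(),
            "duration_s": int(rec["duration_s"]),     # enforce a numeric type
        })
    return clean

raw = [
    {"msisdn": " 447700900001 ", "started_at": "2024-05-01T10:15:00", "duration_s": "120"},
    {"msisdn": "447700900002", "started_at": "2024-05-01T11:00:00", "duration_s": None},
]
rows = transform(raw)
```

In practice this logic would live in dbt models or Spark jobs over far larger volumes, but the shape of the work (filter, trim, cast, derive) is the same.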
What you’ll get in return
Flexible working options available.

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed)
Data Engineer (Snowflake and Matillion) - £425PD - Remote
Tenth Revolution Group
City of London
Fully remote
Mid - Senior
£350/day - £425/day
RECENTLY POSTED
snowflake
fabric
aws
git
python
airflow
+4
Data Engineer (Snowflake and Matillion) - £425PD - Remote

About the Role
We are looking for a Data Engineer with strong experience in Snowflake and Matillion to design, build, and maintain scalable data pipelines and analytics-ready data models. You’ll work closely with analytics, product, and business teams to turn raw data into reliable, high-quality datasets that power reporting, dashboards, and advanced analytics. This role is ideal for someone who enjoys working in a modern cloud data stack and takes pride in building clean, performant, and well-documented data solutions.

Key Responsibilities
Design, build, and maintain ELT pipelines using Matillion to ingest data from multiple sources into Snowflake.
Develop and optimise data models in Snowflake for analytics and reporting use cases.
Ensure data quality, reliability, and performance across pipelines and warehouse workloads.
Collaborate with analytics engineers, data analysts, and stakeholders to understand data requirements.
Implement best practices for Snowflake (clustering, scaling, cost optimisation, security).
Monitor and troubleshoot data pipelines, resolving failures and performance issues.
Manage and evolve data transformations using SQL and version control.
Document data pipelines, models, and business logic for long-term maintainability.
Support CI/CD processes and promote automation across the data platform.

Required Qualifications
3+ years of experience as a Data Engineer or in a similar role.
Strong hands-on experience with Snowflake (data modelling, performance tuning, security).
Proven experience building pipelines with Matillion.
Advanced SQL skills and solid understanding of ELT best practices.
Experience working with cloud data architectures (AWS, Azure, or GCP).
Familiarity with version control systems (e.g., Git).
Strong problem-solving skills and attention to detail.
Ability to communicate clearly with technical and non-technical stakeholders.

Nice to Have
Experience with dbt or other transformation frameworks.
Exposure to data orchestration tools (Airflow, etc.).
Understanding of data governance, lineage, and metadata management.
Experience supporting BI tools (Power BI, Tableau, Looker, etc.).
Python experience for data tooling or automation.
Experience working in an agile or product-driven environment.

To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
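The ELT pattern this listing centres on (load raw data first, transform inside the warehouse afterwards) can be illustrated with a toy sketch. Plain Python dictionaries stand in for Matillion and Snowflake here, and every table and column name is invented; this shows the pattern, not either tool's API.

```python
# Toy ELT sketch: extract -> load raw -> transform in place.

def extract():
    # Stand-in for pulling rows from a source system (in Matillion, a connector).
    return [{"sku": "A1", "qty": 3, "price": 2.5},
            {"sku": "A1", "qty": 1, "price": 2.5}]

def load(rows, warehouse):
    # ELT loads the raw rows first, untransformed.
    warehouse.setdefault("raw_sales", []).extend(rows)

def transform(warehouse):
    # ...then transforms inside the warehouse (in Snowflake this would be SQL,
    # typically managed by dbt or Matillion transformation jobs).
    totals = {}
    for r in warehouse["raw_sales"]:
        totals[r["sku"]] = totals.get(r["sku"], 0.0) + r["qty"] * r["price"]
    warehouse["sales_by_sku"] = totals

warehouse = {}
load(extract(), warehouse)
transform(warehouse)
```

The point of ordering it this way, rather than classic ETL, is that the raw layer is preserved: transformations can be re-run, audited, or rewritten without re-extracting from the source.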
Data Engineer Manager
Young's Employment Services Ltd
Brent
Hybrid
Senior - Leader
£90,000
RECENTLY POSTED
fabric
aws
kafka
python
java
apache-spark
+4
Hybrid - London with 2/3 days WFH
Circa £85,000 - £95,000 + Attractive Bonus & Benefits

A hands-on Data Engineer Manager is required for this exciting newly created position with a prestigious and rapidly expanding business in West London. It would suit someone with formal management experience, or potentially a Lead / Senior Engineer looking to take on more managerial responsibility. The Data Engineer Manager will play a pivotal role at the heart of our client’s data & analytics operation. Having implemented a new MS Fabric based data platform, the need now is to scale up and meet the demand to deliver data-driven insights and strategies right across the business globally. There’ll be a hands-on element to the role as you’ll be troubleshooting, reviewing code, steering the team through deployments and acting as the escalation point for data engineering. Our client can offer an excellent career development opportunity and a vibrant, creative and collaborative work environment. This is a hybrid role based in Central / West London with the flexibility to work from home 2 or 3 days per week.

Key Responsibilities include:
Define and take ownership of the roadmap for the ongoing development and enhancement of the data platform.
Design, implement, and oversee scalable data pipelines and ETL/ELT processes within MS Fabric, leveraging expertise in Azure Data Factory, Databricks, and other Azure services.
Advocate for engineering best practices and ensure long-term sustainability of systems.
Integrate principles of data quality, observability, and governance throughout all processes.
Participate in recruiting, mentoring, and developing a high-performing data organisation.
Demonstrate pragmatic leadership by aligning multiple product workstreams to achieve a unified, robust, and trustworthy data platform that supports production services such as dashboards, new product launches, analytics, and data science initiatives.
Develop and maintain comprehensive data models, data lakes, and data warehouses (e.g., utilising Azure Synapse).
Collaborate with data analysts, analytics engineers, and various stakeholders to fulfil business requirements.

Key Experience, Skills and Knowledge:
Experience leading data or platform teams in a production environment as a Senior Data Engineer, Tech Lead, Data Engineering Manager etc.
Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines.
Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar.
Experience building, defining, and owning data models, data lakes, and data warehouses.
Programming proficiency in the likes of Python, PySpark, SQL, Scala or Java.
Experience operating in a cloud-native environment such as Azure, AWS, GCP etc (Fabric experience would be beneficial but is not essential).
Excellent stakeholder management and communication skills.
A strategic mindset, with a practical approach to delivery and prioritisation.
Exposure to data science concepts and techniques is highly desirable.
Strong problem-solving skills and attention to detail.

Salary is dependent on experience and expected to be in the region of £85,000 - £95,000 + an attractive bonus scheme and benefits package. For further information, please send your CV to Wayne Young at Young’s Employment Services Ltd. YES are operating as both a Recruitment Agency and a Recruitment Business.
Senior Data Engineer x1/ Data Engineer x1 (Financial Services)
Hays Technology
London
Remote or hybrid
Senior
£600/day - £800/day
react
aws
mongodb
spring-boot
kubernetes
kafka
+10
Your new company
Working for a renowned commodity, metals, trades and exchange group. You’ll be a key part of the Enterprise Data team helping to replace legacy ETL tools (Informatica) and deliver modern data engineering capabilities. Your work will include managing data pipelines, supporting analysis and visualisation, and collaborating with ETL developers and wider technology teams to deliver solutions aligned with our strategic roadmap. You’ll work across backend, data, and infrastructure engineering, contributing to solution design, implementation, deployment, testing, and support. This is a hands-on role for someone with strong data engineering skills and experience in regulated environments.

Your new role
Design, build, and maintain scalable data pipelines and infrastructure for analytics and integration across data platforms.
Ensure data quality and reliability through automated validation, monitoring, and testing using Python, Java, or Scala.
Develop and manage database architectures, including data lakes and warehouses.
Clean, transform, and validate data to maintain consistency and accuracy.
Collaborate with technical and non-technical teams, providing clear communication on project progress and requirements.
Create and maintain accurate technical documentation.
Support internal data analysis and reporting for business objectives.
Investigate and resolve data-related issues, implementing improvements for stability and performance.
Evaluate and prototype solutions to ensure optimal architecture, cost, and scalability.
Implement best practices in automation, CI/CD, and test-driven development.

What you’ll need to succeed
Strong experience in Data Engineering, with demonstrable lead involvement in at least one production-grade data system within financial services or a similarly regulated industry.
Strong coding skills in Python or Java (Spring Boot); React experience is a plus.
Proficiency with modern data tools: Airflow, Spark, Kafka, dbt, Snowflake or similar.
Experience with cloud platforms (AWS, Azure, GCP), containerisation (Docker, Kubernetes), and CI/CD.
Data quality: proven ability to validate and govern data pipelines, ensuring data integrity, correctness, and compliance.
Experience working within financial services or similarly highly regulated environments.

Bonus Skills:
SQL and RDBMS (PostgreSQL, SQL Server).
NoSQL/distributed databases (MongoDB).
Streaming pipelines experience.

What you’ll get in return
An exciting opportunity to join an international organisation in financial services. Furthermore, a competitive day rate inside IR35 for this role will be offered, in addition to your own dedicated Hays Consultant to guide you through every step of the application process.

What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed)
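The automated validation and data-quality responsibilities this listing calls for might look something like the following in miniature: a pipeline step that rejects a batch failing simple integrity rules before it reaches the warehouse. The rules, field names, and batch shape are illustrative assumptions, not the client's actual checks.

```python
# Hedged sketch of a data-quality gate in a pipeline: collect every rule
# violation in a batch, so a scheduler (Airflow, etc.) can fail the run
# with a complete error report rather than stopping at the first problem.

def validate_batch(rows):
    """Return a list of human-readable rule violations; empty means the batch passes."""
    errors = []
    for i, r in enumerate(rows):
        if r.get("trade_id") is None:
            errors.append(f"row {i}: missing trade_id")
        if r.get("notional", 0) < 0:
            errors.append(f"row {i}: negative notional")
    return errors

batch = [
    {"trade_id": "T1", "notional": 1_000_000},
    {"trade_id": None, "notional": -50},   # violates both rules
]
errors = validate_batch(batch)
```

In a regulated environment the same idea is usually expressed declaratively (dbt tests, Great Expectations suites, or warehouse constraints) so the rules are versioned and auditable alongside the pipeline code.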
Data Engineer
Eligo Recruitment Ltd
Hoddesdon
Fully remote
Senior
£80,000 - £95,000
processing-js
aws
airflow
sql
dbt
Are you a Senior Data Engineer with iGaming or Gambling experience, looking to build and scale modern data platforms?

BENEFITS: £80,000–£95,000 depending on experience, fully remote, excellent benefits package

You’ll be joining a fast-growing iGaming and online casino company operating a custom-built platform that supports millions of player interactions. The business is a recognised leader across sports betting and online casino, with a strong focus on performance, reliability and data-driven decision-making. As a Senior Data Engineer, you’ll be responsible for designing, building and maintaining scalable data pipelines and infrastructure that underpin analytics, reporting and product insight across the organisation.

Core Responsibilities
Design, build and maintain robust data pipelines to support analytics, product and reporting needs.
Develop and optimise ETL/ELT processes for large volumes of player, game and transaction data.
Work closely with data analysts and stakeholders to ensure data is reliable, accessible and well-structured.
Improve data quality, monitoring and observability across the platform.
Support real-time and batch data processing use cases.
Collaborate with engineering teams to integrate data solutions with the wider platform.
Ensure data architecture aligns with security, compliance and regulatory requirements.
Contribute to data platform strategy, tooling decisions and best practice.

Required Experience & Expertise
Proven experience as a Data Engineer, ideally within iGaming, gambling or another regulated environment.
Strong experience with SQL and modern data warehousing solutions.
Experience building pipelines using tools such as Airflow, dbt or similar.
Solid understanding of cloud platforms, ideally AWS.
Experience working with event-driven or streaming data architectures is a plus.
Strong grasp of data modelling, performance optimisation and scalability.
Comfortable collaborating with analytics, product and engineering teams.

Eligo Recruitment is acting as an Employment Business in relation to this vacancy. Eligo is proud to be an equal opportunity employer dedicated to fostering diversity and creating an inclusive and equitable environment for employees and applicants. We actively celebrate and embrace differences, including but not limited to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran status, and disability. We encourage applications from individuals of all backgrounds and experiences and all will be considered for employment without discrimination. At Eligo Recruitment, diversity, equity and inclusion are integral to achieving our mission to ensure every workplace reflects the richness of human diversity.
Data Engineer – GCP/DSS
DCV Technologies
London
Hybrid
Mid - Senior
£65,000 - £75,000
processing-js
terraform
python
kanban
bash
sql
+2
Job Title: Data Engineer – GCP/DSS
Department: Enabling Functions
Location: Hybrid, London
Type: Both Contract (Inside IR35) & Permanent available
Salary: Competitive; depends on experience and open to discussion

Purpose of Job

What you will be working on
While our broker platform is the core technology crucial to success, this role will focus on supporting the middle/back-office operations that will lay the foundations for further and sustained success. We’re a multi-disciplined team, bringing together expertise in software and data engineering, full-stack development, platform operations, algorithm research, and data science. Our squads focus on delivering high-impact solutions, and we favour a highly iterative, analytical approach. You will be designing and developing complex data processing modules and reporting using BigQuery and Tableau. In addition, you will work closely with the Infrastructure/Platform Team, responsible for architecting and operating the core of the Data Analytics platform.

Principal Accountabilities
Work with the business teams (finance and actuary initially), data scientists and engineers to design, build, optimise and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery.
Work with finance, actuaries, data scientists and engineers to understand how we can make best use of new internal and external data sources.
Work with our delivery partners at EY/IBM to ensure robustness of design and engineering of the data model/MI and reporting which can support our ambitions for growth and scale.
BAU ownership of data models, reporting and integrations/pipelines.
Create frameworks, infrastructure and systems to manage and govern data assets.
Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schema, reporting etc.
Work with the broader Engineering community to develop our data and MLOps capability infrastructure.
Ensure data quality, governance, and compliance with internal and external standards.
Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy.

Regulatory Conduct and Rules
Act with integrity
Act with due skill, care and diligence
Be open and co-operative with Lloyd’s, the FCA, the PRA, and other regulators
Pay due regard to the interests of customers and treat them fairly
Observe proper standards of market conduct

Education, Qualifications, Knowledge, Skills and Experience
* Experience designing data models and developing industrialised data pipelines.
* Strong knowledge of database and data lake systems.
* Hands-on experience in BigQuery, dbt, and GCP cloud storage.
* Proficient in Python, SQL and Terraform.
* Knowledge of Cloud SQL, Airbyte, Dagster.
* Comfortable with shell scripting with Bash or similar.
* Experience provisioning new infrastructure in a leading cloud provider, preferably GCP.
* Proficient with Tableau Cloud for data visualisation and reporting.
* Experience creating DataOps pipelines.
* Comfortable working in an Agile environment, actively participating in approaches such as Scrum or Kanban.

Desirable Skills
Experience of streaming data systems and frameworks would be a plus.
Experience working in a regulated industry, especially financial services, would be a plus.
Experience creating MLOps pipelines is a plus.

The applicant must also demonstrate the following skills and abilities:
Excellent communication skills (both oral and written).
Pro-active, self-motivated and able to use own initiative.
Excellent analytical and technical skills.
Ability to quickly comprehend the functions and capabilities of new technologies.
Ability to offer a balanced opinion regarding existing and future technologies.

How to Apply
If you are interested in the Data Engineer – GCP/DSS position, please apply here.
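Several of the listings above name orchestration tools (Airflow, Dagster, dbt Cloud). At their core, these tools run dependent tasks in topological order. Here is a minimal, hedged sketch of that idea in plain Python, with made-up task names; real orchestrators add scheduling, retries, and observability on top.

```python
# Toy orchestrator: run each task only after its dependencies have run.

def run_pipeline(tasks, deps):
    """tasks: name -> callable; deps: name -> list of prerequisite names.
    Returns the order in which tasks actually executed."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for d in deps.get(name, []):
            run(d)                 # recurse into prerequisites first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
tasks = {
    "ingest":    lambda: log.append("ingest"),
    "transform": lambda: log.append("transform"),
    "report":    lambda: log.append("report"),
}
deps = {"transform": ["ingest"], "report": ["transform"]}
order = run_pipeline(tasks, deps)
```

The dependency dict is the same information an Airflow DAG or a set of dbt `ref()` calls encodes; the value of a dedicated orchestrator is everything around this core loop, not the loop itself.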

Frequently asked questions

What types of DBT jobs are available in London?
London offers a variety of DBT jobs including roles such as DBT Developers, Data Engineers specializing in DBT, Analytics Engineers, and Data Transformation Specialists working with DBT frameworks.
What skills are employers looking for in DBT professionals in London?
Employers typically seek experience with DBT (Data Build Tool), SQL proficiency, knowledge of data warehousing concepts (Snowflake, BigQuery, Redshift), version control systems like Git, and familiarity with cloud platforms such as AWS or GCP.
What is the average salary for a DBT job in London?
Salaries for DBT roles in London vary depending on experience and seniority but typically range from £45,000 to £90,000 per year, with senior positions or specialized roles potentially offering higher compensation.
How can I apply for DBT jobs listed on Haystack?
You can apply directly through the Haystack platform by creating a profile, uploading your CV, and submitting applications to your chosen DBT job listings. Some listings may also provide direct company application links.
Are there remote or flexible DBT job opportunities available in London?
Yes, many companies in London offer remote or flexible working arrangements for DBT roles to accommodate different working styles, especially post-pandemic. Job listings on Haystack will specify if remote or hybrid options are available.