
DBT Jobs

Overview

Find the best DBT jobs tailored for data professionals on Haystack. Explore top opportunities in data engineering, analytics engineering, and more, all seeking expertise in dbt (data build tool). Whether you're a seasoned DBT developer or looking to grow your skills, our curated DBT job listings connect you with leading companies ready to hire. Start your next data career move with Haystack today!
Microsoft Fabric Architect - SME
IO Associates
UK
Fully remote
Senior - Leader
Private salary
RECENTLY POSTED
fabric
sql
snowflake
dbt
We are supporting a high-profile organisation on a greenfield Microsoft Fabric initiative and are looking for an experienced Fabric Architect to lead the end-to-end platform design. This is a unique opportunity to shape the architecture, governance, and delivery of a brand-new Fabric data platform from the ground up. If you have delivered Microsoft Fabric solutions at an architectural level, defining Lakehouse structures, medallion models, and analytics-ready datasets, this opportunity is for you.
Location: Remote (UK)
Day Rate: Negotiable
Contract: Short-term engagement with potential for extension
Key Responsibilities / Experience Required:
Designing a greenfield Microsoft Fabric platform, including Lakehouse, Bronze, Silver and Gold layers, and reporting structures (see the sketch after this listing)
Defining data ingestion and transformation architecture across multiple sources such as CRM, CMS, SaaS platforms, and GA4
Designing Fabric Data Factory pipelines and Spark notebooks for scalable ETL/ELT
Establishing data modelling standards for Silver and Gold (Medallion) layers
Preparing Power BI-ready datasets with clear fact and dimension schemas
Providing technical governance guidance, ensuring scalability, maintainability and PII compliance
Advising on best practices for Fabric, Synapse and Databricks integration
Strongly Preferred:
Experience in Azure data services, including ADF, ADLS and Azure SQL
Proven experience integrating web analytics, CRM and CMS data at a platform level
Familiarity with data governance frameworks and enterprise data architecture principles
Nice to Have:
Experience with dbt (Core or Cloud)
Exposure to modern lakehouse ecosystems such as Databricks or Snowflake
Background in nonprofit or charity analytics (optional)
If you have architected Microsoft Fabric solutions from scratch and can define clean, scalable, enterprise-grade platforms, we would love to hear from you. Please reply with your most up-to-date CV.
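As a rough illustration of the medallion layering this listing describes, here is a minimal sketch of a Bronze-to-Silver promotion step in Spark SQL. All names (bronze_crm, silver, contacts_raw) are hypothetical, and the pattern is an assumption for illustration, not this employer's actual implementation:

```sql
-- Hypothetical schemas and tables (bronze_crm, silver): illustrative only.
-- Bronze holds raw ingested records; the Silver layer applies typing,
-- cleansing and deduplication so Gold models can build analytics-ready facts.
CREATE OR REPLACE TABLE silver.contacts AS
SELECT
    CAST(c.contact_id AS BIGINT)    AS contact_id,
    TRIM(LOWER(c.email))            AS email,
    COALESCE(c.country, 'UNKNOWN')  AS country,
    CAST(c.created_at AS TIMESTAMP) AS created_at
FROM (
    -- keep only the latest Bronze record per business key
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY contact_id
               ORDER BY ingested_at DESC
           ) AS rn
    FROM bronze_crm.contacts_raw
) AS c
WHERE c.rn = 1
  AND c.email IS NOT NULL; -- basic quality gate before promotion
```

The same shape repeats at the Silver-to-Gold boundary, where cleansed entities are joined into the fact and dimension schemas the listing mentions for Power BI.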
Senior Data Engineer
Fruition Group
Leeds
Hybrid
Senior
£80,000
RECENTLY POSTED
aws
python
java
powerbi
apache-spark
airflow
+3
Job Title: Senior Data Engineer
Location: Leeds, 2x per week
Salary: Up to £80,000 per annum
Why Apply? This is an exciting opportunity to work as a Senior Data Engineer delivering scalable, high-quality data solutions for a leading client in the technology sector. This position offers professional growth, challenging projects, and access to cutting-edge cloud data technologies.
Senior Data Engineer Responsibilities:
Design, develop, and optimise robust, scalable data pipelines and architectures to support Business Intelligence and analytics initiatives.
Manage and maintain cloud-based data platforms (AWS, Azure, or Google Cloud) including data lakes, warehouses, and lakehouse solutions.
Transform and process structured and unstructured data using modern ETL/ELT frameworks (Apache Spark, Airflow, dbt).
Collaborate closely with product managers, analysts, and software developers to ensure seamless integration and high-quality data availability.
Develop, maintain, and enhance reporting and analytics capabilities through tools such as PowerBI, Tableau, or QuickSight.
Apply best practices in data governance, data quality, and performance optimisation.
Operate in an agile environment, contributing to technical discussions and problem-solving initiatives.
Senior Data Engineer Requirements:
Proven experience in building and managing cloud-based data platforms (AWS Redshift/Glue, Azure Data Factory/Synapse, Google BigQuery/Dataflow).
Strong programming skills in Python, SQL, and Java for data engineering tasks.
Experience designing reliable, maintainable, and high-performance data pipelines and architectures.
Broad understanding of data warehousing, data lakes, and lakehouse architectures.
Familiarity with Business Intelligence and data visualisation tools.
Excellent analytical thinking, attention to detail, and problem-solving skills.
Strong collaboration and communication skills, able to work with both technical and non-technical stakeholders.
Comfortable with complexity, ambiguity, and working independently or as part of a team in a fast-paced environment.
We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
Data Engineer
Datatech
Mansfield
Fully remote
Junior - Mid
£45,000
RECENTLY POSTED
aws
git
python
airflow
sql
snowflake
+1
Data Engineer, Remote | Modern Cloud Data Stack | £45,000 DOE
No sponsorship; post-graduate visa not available.
A high-visibility opportunity with a values-led organisation modernising its data platform and refreshing its data strategy. You'll be trusted early, work directly with stakeholders across the business, and build the foundations that power better insight, smarter decisions, and real-world impact. This suits someone with 2+ years' experience who wants to step up, take ownership, and grow quickly in a supportive environment. Communication is central here: you'll succeed by translating business questions into robust, trusted data assets, and by bringing people with you on the journey.
What you'll do
• Help shape and deliver a refreshed data strategy and modern intelligence platform
• Build reliable, scalable ELT/ETL pipelines into a cloud data warehouse (Snowflake, Databricks, or similar)
• Develop and optimise core data models and transformations (dimensional, analytics-ready, built to last)
• Create trusted data products that enable self-service analytics across the organisation
• Improve data quality, monitoring, performance, and cost efficiency
• Partner with analysts, BI, and non-technical stakeholders to turn questions into production-grade data assets
• Contribute to standards, best practice, and reusable engineering frameworks
• Support responsible AI tooling, including programmatic LLM workflows where relevant
What you'll bring
• 2+ years' experience in data engineering within a modern cloud stack
• Strong SQL, plus a solid data modelling foundation
• Python preferred (or similar) for pipeline development and automation
• Cloud exposure (AWS, Azure, or GCP)
• Familiarity with orchestration and analytics engineering tools (dbt, Airflow, or similar)
• Strong habits around governance, security, documentation, Git, and CI/CD
What will make you stand out in this business
• Clear, confident communication: you can explain technical choices in plain English
• Strong stakeholder mindset: you ask the right questions and align on outcomes early
• Ownership, curiosity, and a bias for building things properly
Excited? Apply now.
Snowflake Data Engineer
Tenth Revolution Group
London
Hybrid
Senior
£85,000 - £100,000
RECENTLY POSTED
snowflake
aws
git
kafka
python
+4
Senior Snowflake Data Engineer - Hybrid - £85k-£100k
About the Role
I am looking for an experienced Senior Snowflake Data Engineer to join a dynamic team working on cutting-edge data solutions. This is an exciting opportunity to design, build, and optimise high-performance data pipelines using Snowflake, dbt, and modern engineering practices. If you are passionate about data engineering, test-driven development, and cloud technologies, we'd love to hear from you.
Key Responsibilities
Design, develop, and optimise scalable data pipelines in Snowflake.
Build and maintain dbt models with robust testing and documentation.
Apply test-driven development principles for data quality and schema validation.
Optimise pipelines to reduce processing time and compute costs.
Develop modular, reusable transformations using SQL and Python.
Implement CI/CD pipelines and manage deployments via Git.
Automate workflows using orchestration tools such as Airflow or dbt Cloud.
Configure and optimise Snowflake warehouses for performance and cost efficiency.
Required Skills & Experience
7+ years in data engineering roles.
3+ years' hands-on experience with Snowflake.
2+ years' production experience with dbt (mandatory).
Advanced SQL and strong Python programming skills.
Experience with Git, CI/CD, and DevOps practices.
Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure).
Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation.
Preferred Qualifications
Snowflake certifications (SnowPro Core or Advanced).
Experience with dbt Cloud and custom macros.
Exposure to real-time streaming (Kafka, Kinesis).
Familiarity with data observability tools and BI integrations (Tableau, Power BI).
On Offer
Opportunity to work with modern data technologies and large-scale architectures.
Professional development and certification support.
Collaborative, engineering-focused culture.
Competitive salary and benefits package.
Interested? Apply now with your CV highlighting your Snowflake, dbt, and DevOps experience.
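Since this listing highlights Snowflake streams and tasks alongside cost optimisation, here is a minimal sketch of that incremental pattern. Object names (raw_orders, orders_stream, merge_orders_task, transform_wh) are hypothetical; this is an assumed illustration, not the team's actual pipeline:

```sql
-- Hypothetical objects: illustrative only.
-- A stream captures row-level changes on the raw table; a scheduled task
-- consumes those changes and merges them into the curated table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw.raw_orders;

CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO analytics.orders AS tgt
  USING orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE
    SET tgt.status = src.status, tgt.updated_at = src.updated_at
  WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
    VALUES (src.order_id, src.status, src.updated_at);

ALTER TASK merge_orders_task RESUME; -- tasks are created suspended
```

Gating the task on SYSTEM$STREAM_HAS_DATA keeps the warehouse from spinning up when there is nothing to process, which speaks directly to the compute-cost point in the responsibilities above.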
Snowflake Data Architect - £550 Inside IR35 - Hybrid
Tenth Revolution Group
Warwick
Hybrid
Mid - Senior
£400/day - £550/day
RECENTLY POSTED
snowflake
fabric
prometheus
python
microsoft-azure
+1
Snowflake Data Architect - £550 Inside IR35 - Hybrid
We are seeking an experienced Data Architect to design, build, and maintain scalable, secure, and high-performing data platforms. The ideal candidate will have strong expertise in Azure-based data solutions, Snowflake, and modern data engineering tools, and will play a key role in shaping our enterprise data architecture to support analytics, reporting, and advanced data use cases.
Key Responsibilities
Design and implement end-to-end data architectures using Azure cloud services
Architect and optimise data solutions on Snowflake for performance, scalability, and cost efficiency
Build and maintain data pipelines using Azure Data Factory (ADF)
Develop and manage transformation workflows using dbt
Design and support ETL/ELT processes for structured and semi-structured data
Develop data engineering solutions using Python for data processing, automation, and orchestration
Implement monitoring and observability for data systems using Prometheus
Define data models, schemas, and standards to ensure data consistency and quality
Collaborate with data engineers, analysts, and business stakeholders to translate requirements into technical solutions
Ensure data security, governance, and compliance with organisational and regulatory standards
Troubleshoot and optimise data pipelines and architectures for reliability and performance
Required Qualifications
Proven experience as a Data Architect or Senior Data Engineer
Strong hands-on experience with Microsoft Azure data services
Extensive experience with Snowflake data warehousing
Proficiency in Azure Data Factory (ADF) for data orchestration
Strong Python programming skills
Hands-on experience with dbt for data transformation and modelling
Solid understanding of ETL/ELT architecture and best practices
Experience with monitoring and observability tools such as Prometheus
Strong knowledge of data modelling, data warehousing concepts, and cloud architecture
Excellent problem-solving and communication skills
Preferred Qualifications
Experience with CI/CD for data pipelines
Familiarity with infrastructure-as-code tools
Experience working in Agile or DevOps environments
Knowledge of data governance, metadata management, and data quality frameworks
To apply for this role please submit your CV or contact Dillon Blackburn (see below).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Engineering Manager
Stepstone UK
Portsmouth
Hybrid
Senior - Leader
Private salary
RECENTLY POSTED
aws
javascript
dot-net
terraform
kafka
python
+5
At The Stepstone Group, we have a simple yet very important mission: the right job for everyone. Using our data, platform, and technology, we create opportunities for jobseekers and companies around the world to find a perfect match in a fair and equitable way. With over 20 brands across 30+ countries, we strive for fair and unbiased hiring. Join our team of 4,000+ employees and be part of reshaping the labour market and becoming the world's leading job platform.
Job Description
Join our team and you'll be responsible for providing technical leadership to all engineering and development areas across your domain, developing and executing an engineering strategy aligned to the portfolio and global tech strategies. Working in the Design and Platform Performance domain, you will lead the technology side of web and mobile analytics for a platform with over 50 million visits each month, delivering the best user experiences and personalisation for talent and talent seekers. We built an in-house, large-scale tracking platform, powered by Tealium, Kafka, and Adobe Analytics, and successfully rolled it out to 10 of our brands. Alongside tracking, we own our A/B testing tool Optimizely, our in-house design system Genesys, and our frontend framework, which together power product development at scale using a data-driven approach. You will play a vital role as we reimagine the labour market to make it work for everybody.
Your responsibilities:
Managing and mentoring engineering professionals in your teams and domain, setting objectives for your teams and domain, owning the delivery of the outcomes, monitoring ongoing progress and performance within the OKR framework
Partnering with your product manager, other engineering managers, staff engineers, architect, agile coach and engineering director to formulate a domain level technical strategy
Building and maintaining relationships with both internal and external stakeholders
Facilitating communication between teams under your leadership and enabling the free exchange of information within your sphere of influence, developing an appropriate organisational structure together with the Engineering Director
Resourcing plans to support the business objectives and working with the Engineering Director on budgeting within the domain
Qualifications
Experience of providing technical leadership to an engineering team, with excellent analytical, problem-solving and influencing skills; resilient, open to change and results-oriented
Experience of coding and testing of analytics tracking code using Tealium iQ, Tealium Event Stream, and Adobe Analytics
Experience of integrating data sources via web and hypermedia APIs
Proficient in Web Technologies like HTML, CSS, JavaScript, React.js, Next.js and Node.js, and in event-driven and eventual consistency systems using Kafka, .Net, Java, REST APIs, AWS, Terraform and DevOps
Nice to have: experience in data pipelining and modelling using SQL, dbt, ETL, data warehousing, Redshift and Python, and an ecommerce and mobile applications background
Additional Information
We're a community here that cares as much about your life outside work as how you feel when you're with us. Because your job shouldn't take over your life, it should enrich it. Here are some of the benefits we offer:
29 days holiday allowance + bank holidays
Private medical and dental healthcare
Matching pension contribution of 4 or 5% (after 3 years of service up to 10%)
24/7 Employee Assistance Programme
Life Assurance Cover
Cycle to work scheme
Hybrid working model (3 days working from the office)
Volunteering days
And you can bring your dog to the office on Mondays and Fridays!
Our commitment
Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to our success as a global company, so we want to recruit, develop, and keep the best talent. We encourage applications from everyone, regardless of background, gender identity, sexual orientation, disability status, ethnicity, belief, age, family or parental status, and any other characteristic. As a global business, we further our DEI and sustainability progress by working with national and international bodies, and are proud to have been recognised for our work, both locally and internationally, including:
Armed Forces Covenant: Silver Award, Employer Recognition Scheme
EcoVadis: Bronze Award
Fertility Friendly Employer, accredited by Fertility Matters at Work
RIDI (Recruitment Industry Disability) Awards: Inclusive Technology Award 2024
Stonewall: Gold Award
Stonewall: Top 100 Workplace Equality Index (85)
Snowflake Senior Developer
Resourgenix Ltd
Not Specified
Hybrid
Senior
£375 - £450
RECENTLY POSTED
snowflake
kafka
python
airflow
sql
+4
Location: London (Hybrid)
Employment Type: Full-time
Seniority: Senior Individual Contributor
Reports to: Data Engineering Lead / Manager
Summary of role
We are seeking a Snowflake Senior Developer to design, develop, and optimise data solutions on our cloud data platform. You will work closely with data engineers, analysts, and architects to deliver high-quality, scalable data pipelines and models. Strong expertise in Snowflake, ETL/ELT, data modelling, and data warehousing is essential.
Responsibilities
Snowflake Development: Build and optimise Snowflake objects (databases, schemas, tables, views, tasks, streams, resource monitors).
ETL/ELT Pipelines: Develop and maintain robust data pipelines using tools like dbt, Airflow, Azure Data Factory, or similar.
Data Modelling: Implement dimensional models (star/snowflake schemas), handle SCDs, and design efficient structures for analytics.
Performance Tuning: Optimise queries, manage clustering, caching, and warehouse sizing for cost and speed.
Data Quality: Implement testing frameworks (dbt tests, Great Expectations) and ensure data accuracy and freshness.
Security & Governance: Apply RBAC and masking policies, and comply with data governance standards (see the sketch after this listing).
Collaboration: Work with BI teams to ensure semantic alignment and support self-service analytics.
Documentation: Maintain clear technical documentation for pipelines, models, and processes.
Qualifications
Matric and a Degree in IT
Strong SQL skills (complex queries, performance tuning) and proficiency in Python for data processing.
Experience with ETL/ELT tools (dbt, Airflow, ADF, Informatica, Matillion).
Solid understanding of data warehousing concepts (Kimball, Data Vault, normalization).
Familiarity with cloud platforms (Azure preferred; AWS/GCP acceptable).
Knowledge of data governance, security, and compliance (GDPR).
Excellent problem-solving and communication skills.
Skills
Experience with Snowpark, UDFs, dynamic tables, and external tables.
Exposure to streaming/CDC (Kafka, Fivetran, Debezium).
BI tool integration (Power BI, Tableau, Looker).
Certifications: SnowPro Core or Advanced.
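To make the security and governance bullet above concrete, here is a minimal Snowflake sketch of a masking policy plus least-privilege grants. All names (pii_mask_email, analyst_role, analytics_db) are hypothetical, an assumed pattern rather than this client's configuration:

```sql
-- Hypothetical names: illustrative only.
-- The masking policy redacts a column for everyone except privileged roles;
-- RBAC grants then control who can query the schema at all.
CREATE OR REPLACE MASKING POLICY pii_mask_email AS (val STRING)
  RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE analytics_db.crm.customers
  MODIFY COLUMN email SET MASKING POLICY pii_mask_email;

-- Least-privilege access for analysts: usage plus read-only selects
GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics_db.crm TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.crm TO ROLE analyst_role;
```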
Analytics Engineer (Telecoms) x2
Hays Technology
London
Remote or hybrid
Mid - Senior
£544/day - £725/day
RECENTLY POSTED
terraform
sql
microsoft-azure
tableau
dbt
Your new company
Working for a renowned British telecoms organisation.
Your new role
We are seeking 2x Analytics Engineers to join our team at a leading telecoms organisation. This role focuses on transforming raw data into clean, analytics-ready datasets, bridging the gap between engineering and analytics by owning the data transformation processes that feed end-user outputs. You will work on optimising these processes, ensuring accuracy and scalability, and improving performance and resource usage for large-scale data processing.
What you'll need to succeed
Experience working as an Analytics Engineer, or in a similar role that sits between a hands-on Data Analyst and Data Engineer.
Proven experience in complex data process migration projects.
Hands-on experience working with large-scale data environments.
Spark optimization experience
Strong experience with Microsoft Azure cloud-based platform.
Expertise in setting up and managing data pipelines.
SQL and data modeling expertise.
Familiarity with dbt or similar data transformation tools.
Knowledge of orchestration and optimization techniques for data workflows.
Experience with Infrastructure as Code (Terraform) for cloud deployments.
Familiarity with Tableau, including setting up and maintaining Tableau Cloud solutions.
Demonstrated ability in developing, testing, and deploying complex data models and methodologies.
What you'll get in return
Flexible working options available.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed).
Data Engineer (Snowflake and Matillion) - £425PD - Remote
Tenth Revolution Group
City of London
Fully remote
Mid - Senior
£350/day - £425/day
RECENTLY POSTED
snowflake
fabric
aws
git
python
airflow
+4
Data Engineer (Snowflake and Matillion) - £425PD - Remote
About the Role
We are looking for a Data Engineer with strong experience in Snowflake and Matillion to design, build, and maintain scalable data pipelines and analytics-ready data models. You'll work closely with analytics, product, and business teams to turn raw data into reliable, high-quality datasets that power reporting, dashboards, and advanced analytics. This role is ideal for someone who enjoys working in a modern cloud data stack and takes pride in building clean, performant, and well-documented data solutions.
Key Responsibilities
Design, build, and maintain ELT pipelines using Matillion to ingest data from multiple sources into Snowflake
Develop and optimise data models in Snowflake for analytics and reporting use cases
Ensure data quality, reliability, and performance across pipelines and warehouse workloads
Collaborate with analytics engineers, data analysts, and stakeholders to understand data requirements
Implement best practices for Snowflake (clustering, scaling, cost optimisation, security)
Monitor and troubleshoot data pipelines, resolving failures and performance issues
Manage and evolve data transformations using SQL and version control
Document data pipelines, models, and business logic for long-term maintainability
Support CI/CD processes and promote automation across the data platform
Required Qualifications
3+ years of experience as a Data Engineer or in a similar role
Strong hands-on experience with Snowflake (data modelling, performance tuning, security)
Proven experience building pipelines with Matillion
Advanced SQL skills and a solid understanding of ELT best practices
Experience working with cloud data architectures (AWS, Azure, or GCP)
Familiarity with version control systems (e.g., Git)
Strong problem-solving skills and attention to detail
Ability to communicate clearly with technical and non-technical stakeholders
Nice to Have
Experience with dbt or other transformation frameworks
Exposure to data orchestration tools (Airflow, etc.)
Understanding of data governance, lineage, and metadata management
Experience supporting BI tools (Power BI, Tableau, Looker, etc.)
Python experience for data tooling or automation
Experience working in an agile or product-driven environment
To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Data Engineer Manager
Young's Employment Services Ltd
Brent
Hybrid
Senior - Leader
£90,000
RECENTLY POSTED
fabric
aws
kafka
python
java
apache-spark
+4
Hybrid - London with 2/3 days WFH
Circa £85,000 - £95,000 + Attractive Bonus & Benefits
A hands-on Data Engineer Manager is required for this exciting, newly created position with a prestigious and rapidly expanding business in West London. It would suit someone with formal management experience, or potentially a Lead/Senior Engineer looking to take on more managerial responsibility. The Data Engineer Manager will play a pivotal role at the heart of our client's data & analytics operation. Having implemented a new MS Fabric-based data platform, the need now is to scale up and meet the demand to deliver data-driven insights and strategies across the business globally. There'll be a hands-on element to the role, as you'll be troubleshooting, reviewing code, steering the team through deployments, and acting as the escalation point for data engineering. Our client can offer an excellent career development opportunity and a vibrant, creative, and collaborative work environment. This is a hybrid role based in Central/West London with the flexibility to work from home 2 or 3 days per week.
Key Responsibilities include:
Define and take ownership of the roadmap for the ongoing development and enhancement of the data platform.
Design, implement, and oversee scalable data pipelines and ETL/ELT processes within MS Fabric, leveraging expertise in Azure Data Factory, Databricks, and other Azure services.
Advocate for engineering best practices and ensure long-term sustainability of systems.
Integrate principles of data quality, observability, and governance throughout all processes.
Participate in recruiting, mentoring, and developing a high-performing data organisation.
Demonstrate pragmatic leadership by aligning multiple product workstreams to achieve a unified, robust, and trustworthy data platform that supports production services such as dashboards, new product launches, analytics, and data science initiatives.
Develop and maintain comprehensive data models, data lakes, and data warehouses (e.g., utilising Azure Synapse).
Collaborate with data analysts, analytics engineers, and various stakeholders to fulfil business requirements.
Key Experience, Skills and Knowledge:
Experience leading data or platform teams in a production environment as a Senior Data Engineer, Tech Lead, Data Engineering Manager, etc.
Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines.
Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt, or similar.
Experience building, defining, and owning data models, data lakes, and data warehouses.
Programming proficiency in the likes of Python, PySpark, SQL, Scala, or Java.
Experience operating in a cloud-native environment such as Azure, AWS, or GCP (Fabric experience would be beneficial but is not essential).
Excellent stakeholder management and communication skills.
A strategic mindset, with a practical approach to delivery and prioritisation.
Exposure to data science concepts and techniques is highly desirable.
Strong problem-solving skills and attention to detail.
Salary is dependent on experience and expected to be in the region of £85,000 - £95,000 plus an attractive bonus scheme and benefits package. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd.
YES is operating as both a Recruitment Agency and a Recruitment Business.
D2C Data & Business Intelligence Manager
AJ Bell Business Solutions Limited
Salford
Hybrid
Senior - Leader
Private salary
RECENTLY POSTED
google-analytics
confluence
python
jira
sql
snowflake
+2
We're now recruiting a D2C Data & Business Intelligence Manager to join our D2C Insights team: the team that drives customer-focused, data-led decision-making across our direct-to-consumer strategy, product, PR & content, and business development teams. The team is responsible for insights across customer research, customer feedback, market intelligence, competitor analysis, and data-driven insights.
The D2C Data and BI Manager leads a small but dynamic and experienced group of BI developers and engineers who work across customer, financial, and commercial data held in our Enterprise Data Warehouse (Snowflake) alongside our CRM and Customer Data Platform (Bloomreach). We're passionate about delivering an increasingly personalised and segmented approach to every customer touchpoint to empower our customers to reach their financial goals.
This role is a hands-on leader for the team, who can ensure we continue to have strong technical processes and governance whilst working with technical and non-technical colleagues to gather requirements, develop and deliver elegant and robust solutions, and embed insights.
What does the job involve?
Leadership of a growing and high-performing team of data engineers and BI developers, coaching individuals and the team to deliver their best for the business.
Ensuring strong technical and business processes including documentation so that our data pipelines and data assets are robust, and that change is delivered in a governed and controlled manner and at the pace our stakeholders require
Working closely with our stakeholders in D2C (particularly Marketing and Product) to understand their plans and strategies to develop an operating model and plans that deliver robust data products and dashboards when required.
Continually assess our ways of working and roadmaps, proactively identifying opportunities for efficiency and innovation
Ensuring change is delivered in line with agreed plans and QA’ing before delivery to the business.
Supporting the team to develop engaging and accurate dashboards and reports. You'll take the lead on rolling these out to the business, ensuring colleagues understand how to navigate and interpret the data, including leading on our approach to self-service and data literacy across the division.
Working closely with our central data and technology colleagues to collaborate on projects, share best practice and agree on common ways of working across the AJ Bell data community
Proactively identify risks or opportunities highlighted through data and reporting, taking the lead on surfacing these to senior management along with context and what practical options we have to address them.
Ensuring compliance with regulatory requirements and industry standards related to data privacy, security, and confidentiality.
Supporting and enabling the business to achieve its regulatory requirements, including consumer duty.
About you:
We're looking for someone with a broad range of competence, knowledge and skills which would help you succeed in this role, but it is not critical to meet every single criterion listed below. We encourage a wide range of applicants.
Proven experience in leading data and reporting teams in financial services or highly regulated industries
Demonstrable experience of delivering improved outcomes from data and BI particularly governance, efficiency and scalability
Previous responsibility for business-critical data assets and platforms
Experience of working with diverse data sources across customer, transactional and external data and different cadences including real time. Experience of working with third party data enhancement and marketing data sources such as Google Analytics is highly desirable
Outstanding stakeholder engagement and management skills
Excellent long-term planning and short-term prioritisation skills, including effective communication and ensuring delivery of priorities
Experience in analysis and development within a BI environment
Agile/scrum experience would be a benefit
Good levels of data and market awareness
Highly effective communication skills and comfortable working with both technical and non-technical teams
Excellent analytical and problem-solving skills, with the ability to interpret complex datasets and generate actionable insights.
In-depth knowledge of cloud-based data architecture, database development and management. Previous experience with Snowflake in particular is desirable
Strong understanding of data and systems integration and data pipelines including development approaches.
Broad knowledge of business intelligence principles, approaches and tools/platforms. Power BI is preferred
Understanding of marketing and communication technologies, particularly CRM platforms and orchestration tools. Experience with Bloomreach is desirable.
Best practice for systems and data governance including development processes and documentation
Reporting architecture and approaches including self-service enablement
Understanding of marketing and product performance metrics and approaches. Exposure to Google Analytics and to user engagement tools and testing platforms such as Contentsquare, AppsFlyer, and Optimizely is highly desirable
Proven understanding of GDPR and data governance, preferably in financial services
Knowledge of AJ Bell and our product offerings, and interest in the investment platform marketwould be a plus
Excellent proficiency in data manipulation languages, particularly SQL, with the ability to write complex queries and repeatable stored procedures. Experience with transformation tools like dbt is highly desirable
Experience of developing data pipeline and automations, for example via Python
Data integration using APIs
Ownership of system-critical data and strong experience of its maintenance, development and improvement.
Dashboard design, development and delivery. Experience with Power BI, including semantic model development and DAX, is preferred.
Self-serve approaches and improvements to data literacy and adoption.
Documentation and workflow management using tools such as JIRA and Confluence
About Us
AJ Bell is one of the fastest-growing investment platform businesses in the UK, offering an award-winning range of solutions that caters for everyone, from professional financial advisers to DIY investors with little to no experience. We have over 644,000 customers using our award-winning platform propositions to manage assets totalling more than £103.3 billion. Our customers trust us with their investments, and by continuously striving to make investing easier, we aim to help even more people take control of their financial futures. Having listed on the Main Market of the London Stock Exchange in December 2018, AJ Bell is now a FTSE 250 company. Headquartered in Manchester with offices in central London and Bristol, we now have over 1,500 employees and have been named one of the UK's 'Best 100 Companies to Work For' for six consecutive years, and in 2024 were named a Great Place to Work. At AJ Bell you can expect a friendly working environment with a strong sense of teamwork; we have a great sense of pride in what we do, and this is reflected in our guiding principles.
What we offer:
Competitive starting salary
Starting holiday entitlement of 27 days, increasing up to 31 days with length of service and a holiday buy and sell scheme
A choice of pension schemes with matched contributions up to 8%
Discretionary bonus scheme
Annual free share awards scheme
Buy As You Earn (BAYE) Scheme
Health Cash Plan provided by Simply Health
Discounted private healthcare scheme and dental plan
Free gym
Employee Assistance Programme
Bike loan scheme
Sick pay+ pledge
Enhanced maternity, paternity, and shared parental leave
Loans for travel season tickets
Death in service scheme
Paid time off for volunteer work
Charitable giving opportunities through salary sacrifice
Calendar of social events, including monthly payday drinks, annual Christmas party, summer party and much more
Personal development programmes built around you and your career goals, including access to personal skills workshops
Monthly leadership breakfasts and lunches
Casual dress code
Access to a range of benefits from our sponsorship deals
Hybrid working:
At AJ Bell, our people are the heart of our culture. We believe in building strong connections by working together. That's why we offer a hybrid working model, where you'll spend a minimum of 50% of your working time per month in the office. For new team members, an initial period will be full-time in the office to help you immerse yourself in our business and build valuable relationships with your colleagues.
AJ Bell is committed to providing an environment of mutual respect where equal employment opportunities are available to all applicants, and all employees are empowered to bring their whole self to work. We do not discriminate on the basis of race, sex, gender identity, sexual orientation, age, pregnancy, religion, physical and mental disability, marital status, and any other characteristics protected by the Equality Act 2010. All decisions to hire are based on qualifications, merit, and business need. If you like the sound of the above, or just want to know more about the company and the role, we'd love to speak to you.
Snowflake Data Engineer
Nicholas Associates Graduate Placements
Brighton
Hybrid
Mid
£50,000 - £60,000
snowflake
aws
python
airflow
sql
dbt
Location: Brighton, East Sussex (Hybrid / Remote possible)
Salary: £50,000 - £60,000 per annum (DOE)
Job Type: Permanent, Full-time
About the Role
We are looking for a talented Snowflake Data Engineer to join our growing data team based in Brighton. You'll play a key role in enhancing our data platform, automating data workflows, and enabling data-driven decision-making across the business. This role is ideal for someone who enjoys working with modern cloud data platforms, has excellent SQL skills, and is keen to build scalable data pipelines and data models.
Key Responsibilities
Design, build, and maintain scalable ETL/ELT data pipelines using Snowflake.
Administer, optimise, and support the Snowflake data platform for performance and cost efficiency.
Ingest, transform, and integrate data from multiple sources (e.g., GA4, internal systems).
Develop and maintain data models to support analytics, reporting, and business use cases.
Ensure high data quality, monitoring, testing, and documentation of pipelines and models.
Collaborate with BI, analytics, and engineering teams to ensure data meets business needs.
Support data governance, security, compliance, and best practices in data engineering.
Required Skills & Experience
Hands-on experience in Snowflake data warehouse development and optimisation.
Strong SQL skills for querying, transformation, and performance tuning.
Experience building and managing ETL/ELT pipelines.
Proficiency with at least one scripting/programming language (e.g., Python).
Familiarity with modern data engineering tools like dbt, Airflow, Prefect, or similar is a plus.
Knowledge of cloud platforms (AWS / Azure / GCP).
Understanding of data modelling, quality controls, and best practices.
Qualifications
Degree in Computer Science, Data Engineering, IT, or a related field (or equivalent experience).
Snowflake certifications or relevant cloud/data engineering certifications are advantageous.
About Us
We are dedicated to fostering a diverse and inclusive community. In line with our Diversity and Inclusion policy, we welcome applications from all qualified individuals, regardless of age, gender, ethnicity, sexual orientation, or disability. As a Disability Confident Employer, and part of the Nicholas Associates Group, we are committed to supporting candidates with disabilities, and we're happy to discuss flexible working options. We are committed to protecting the privacy of all our candidates and clients. If you choose to apply, your information will be processed in accordance with the Nicholas Associates Group of companies.
Senior Data Engineer
Opus Recruitment Solutions
Milton Keynes
Hybrid
Senior
£450/day - £525/day
python
sql
pyspark
dbt
Outside IR35 | £500–£525 per day | Milton Keynes | Hybrid Working | 6-Month Initial Term
We are currently looking for an experienced Senior Data Engineer to support a major data modernisation programme. You'll be instrumental in reshaping and enhancing data pipelines as the business moves towards a Databricks Lakehouse setup. The work centres on creating scalable, high-quality data flows that underpin analytics, reporting, and strategic insight across the organisation. This contract is outside IR35, requires one on-site day each week, and offers an immediate start with strong extension prospects.
What You'll Be Doing
Developing, refining, and maintaining robust ELT/ETL data pipelines
Supporting the migration of data assets into a Databricks Lakehouse framework
Ensuring data is accurate, reliable, and optimised for analytical consumption
Partnering with stakeholders to deliver well-engineered, business-aligned solutions
Monitoring production systems and resolving performance or reliability issues
What They're Looking For
7+ years of data engineering experience, ideally within cloud-native environments
Strong background in building and optimising large-scale data pipelines
Practical expertise with Databricks and Azure services
Confident communicator with strong problem-solving ability
Core Technologies
Databricks, dbt, Python, PySpark, SQL, Azure
If you are interested in this role then please apply via this platform, or email a copy of your most up-to-date CV to (url removed) and I will be in touch.
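Given the Databricks Lakehouse and dbt focus here, a minimal sketch of an incremental dbt model may help illustrate the day-to-day work. Model and column names (fct_orders, stg_orders, ordered_at) are hypothetical; this is an assumed pattern, not the programme's actual code:

```sql
-- models/marts/fct_orders.sql (hypothetical): illustrative only.
-- An incremental dbt model processes only new rows on each run, which keeps
-- large Lakehouse tables cheap to refresh compared with full rebuilds.
{{
    config(
        materialized='incremental',
        unique_key='order_id'
    )
}}

select
    order_id,
    customer_id,
    order_total,
    ordered_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- on incremental runs, restrict to rows newer than what is already loaded
  where ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}
```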
Data Engineer
Hays Technology
Basingstoke
Hybrid
Junior - Mid
£50,000 - £55,000
python
yaml
postgresql
sql
snowflake
dbt
Your new company
Join a pioneering leader in the niche energy sector, driving the transition to a greener future. This fast-growing organisation is renowned for its cutting-edge technology and commitment to sustainability. With a focus on innovation and customer experience, they are shaping the future of clean transport and creating a positive environmental impact.
Your new role
As a Data Engineer, you will play a key role in maintaining and expanding the data warehouse and data pipeline. Reporting to the existing Data Engineer, you'll collaborate closely with data analysts to integrate new data sources, enhance functionality, and optimise performance. Your responsibilities will include monitoring the data stack, designing and modifying data models using dbt Core and VS Code, and ensuring seamless integration of new data sources. This is an exciting opportunity for someone who enjoys problem-solving and wants to make a tangible impact on the organisation's data capabilities.
What you'll need to succeed
To thrive in this role, you'll need:
Strong SQL skills (PostgreSQL or Snowflake SQL within dbt preferred)
Understanding of cloud data warehouse concepts and design
Knowledge of SOAP and REST APIs, JSON, and YAML
Basic Python skills
A logical approach to problem-solving and a collaborative mindset
What you'll get in return
You'll receive a salary of up to £55,000, a generous holiday allowance, a pension scheme, and the chance to work in a supportive environment that values learning and growth. This is a fantastic opportunity to develop your technical skills while contributing to exciting projects in a rapidly evolving business.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed).
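Because this role centres on dbt Core, here is a minimal sketch of a staging model plus a singular test, the bread-and-butter pattern a new data source would go through. Source and column names (energy_raw, meter_readings) are hypothetical, assumed purely for illustration:

```sql
-- models/staging/stg_meter_readings.sql (hypothetical): illustrative only.
-- A staging model renames and types raw columns once, so downstream models
-- select from the staged relation instead of the raw source.
select
    cast(reading_id as bigint)     as reading_id,
    cast(device_id as varchar)     as device_id,
    cast(reading_value as numeric) as reading_kwh,
    cast(read_at as timestamp)     as read_at
from {{ source('energy_raw', 'meter_readings') }}

-- tests/assert_no_negative_readings.sql would then hold a singular test,
-- which fails if the query returns any rows, e.g.:
--   select * from {{ ref('stg_meter_readings') }} where reading_kwh < 0
```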
Senior Data Engineer x1 / Data Engineer x1 (Financial Services)
Hays Technology
London
Remote or hybrid
Senior
£600/day - £800/day
react
aws
mongodb
spring-boot
kubernetes
kafka
+10
Your new company
Working for a renowned commodity, metals, trades and exchange group. You'll be a key part of the Enterprise Data team, helping to replace legacy ETL tools (Informatica) and deliver modern data engineering capabilities. Your work will include managing data pipelines, supporting analysis and visualisation, and collaborating with ETL developers and wider technology teams to deliver solutions aligned with our strategic roadmap. You'll work across backend, data, and infrastructure engineering, contributing to solution design, implementation, deployment, testing, and support. This is a hands-on role for someone with strong data engineering skills and experience in regulated environments.
Your new role
Design, build, and maintain scalable data pipelines and infrastructure for analytics and integration across data platforms.
Ensure data quality and reliability through automated validation, monitoring, and testing using Python, Java, or Scala.
Develop and manage database architectures, including data lakes and warehouses.
Clean, transform, and validate data to maintain consistency and accuracy.
Collaborate with technical and non-technical teams, providing clear communication on project progress and requirements.
Create and maintain accurate technical documentation.
Support internal data analysis and reporting for business objectives.
Investigate and resolve data-related issues, implementing improvements for stability and performance.
Evaluate and prototype solutions to ensure optimal architecture, cost, and scalability.
Implement best practices in automation, CI/CD, and test-driven development.
What you'll need to succeed
Strong experience in data engineering, with demonstrable lead involvement in at least one production-grade data system within financial services or a similarly regulated industry.
Strong coding skills in Python or Java (Spring Boot); React experience is a plus.
Proficiency with modern data tools: Airflow, Spark, Kafka, dbt, Snowflake, or similar.
Experience with cloud platforms (AWS, Azure, GCP), containerisation (Docker, Kubernetes), and CI/CD.
Data quality: proven ability to validate and govern data pipelines, ensuring data integrity, correctness, and compliance.
Experience working within financial services or other highly regulated environments.
Bonus skills: SQL and RDBMS (PostgreSQL, SQL Server); NoSQL/distributed databases (MongoDB); streaming pipelines experience.
What you'll get in return
An exciting opportunity to join an international organisation in financial services. Furthermore, a competitive day rate inside IR35 will be offered, in addition to your own dedicated Hays Consultant to guide you through every step of the application process.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed).
Data Engineer (Snowflake)
Adria Solutions
Brighton
Remote or hybrid
Mid - Senior
£75,000
snowflake
google-analytics
aws
airflow
sql
dbt
Data Engineer (Snowflake)
We are seeking an experienced Data Engineer (Snowflake) to join our client's team on a permanent basis. This role will focus on administering and developing our Snowflake data platform, building robust data pipelines, and transforming data to support analytics and marketing activation use cases. The successful candidate will initially work on projects involving the ingestion of multiple data sources, including Google Analytics 4 (GA4), and transforming data to surface insights within Google Ads.
Key Responsibilities
Administer, maintain, and optimise the Snowflake data platform
Design, build, and manage scalable ETL/ELT data pipelines
Ingest and integrate 3-4 data sources, including GA4
Transform and model data to support reporting and activation in Google Ads
Ensure data quality, performance, and cost efficiency
Collaborate with analytics, marketing, and engineering teams
Document data solutions and provide ongoing platform support
Required Skills & Experience
Strong hands-on experience with Snowflake
Proven experience building data pipelines in a cloud environment
Advanced SQL skills and experience with data modelling
Experience working with GA4 or digital analytics data
Experience integrating data with Google Ads or similar platforms
Familiarity with cloud platforms (GCP, AWS, or Azure)
Strong communication and problem-solving skills
Desirable Experience
Experience with tools such as dbt, Airflow, or similar orchestration frameworks
Background in marketing, analytics, or advertising data environments
Understanding of data governance, privacy, and consent frameworks
What We Offer
Competitive salary and benefits package
Flexible working arrangements
Opportunity to work on high-impact data and marketing initiatives
Supportive, collaborative team environment
How to Apply
If you are a skilled Data Engineer (Snowflake) looking for your next permanent opportunity, we would love to hear from you. Please apply with your CV or contact us for further information.
Lead Data Engineer
Cathcart Technology
Multiple locations
Hybrid
Senior
Private salary
aws
kafka
python
java
airflow
scala
+1
I'm working with a world-class technology company in Edinburgh to help them find a Lead Data Engineer to join their team (hybrid working, but there is flex on this for the right person). This is your chance to take the technical lead on complex, large-scale data projects that power real-world products used by millions of people. The organisation has been steadily growing for a number of years and has become a market leader in its field, so it's genuinely a really exciting time to join!
You'll be joining a forward-thinking team that's passionate about doing things properly, using a modern tech stack, a cloud-first approach, and a genuine commitment to engineering excellence. As Lead Data Engineer, you'll be hands-on in designing and building scalable data platforms and pipelines that enable advanced analytics, machine learning, and business-critical insights. You'll shape the technical vision, set best practices, and make key architectural decisions that define how data flows across the organisation.
You won't be working in isolation either, as collaboration is at the heart of this role. You'll work closely with engineers, product managers, and data scientists to turn ideas into high-performing, production-ready systems. You'll also play a big part in mentoring others, driving standards across the team, and influencing the overall data strategy.
The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand what it takes to design data systems that are scalable, reliable, and built for the long haul.
In return, they are offering a competitive salary (happy to discuss prior to application) and great benefits, which include uncapped holidays and multiple bonuses! Their office in central Edinburgh is only a short walk from Haymarket train station. The role is hybrid (ideally 1 or 2 days in the office); however, they can be flexible on this for the right candidate.
If you're ready to step into a role where your technical leadership will have a visible impact, and where you can build data systems that continue to scale, then please apply or contact Matthew MacAlpine at Cathcart Technology.
Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
Data Engineer
Eligo Recruitment Ltd
Hoddesdon
Fully remote
Senior
£80,000 - £95,000
aws
airflow
sql
dbt
Are you a Senior Data Engineer with iGaming or gambling experience, looking to build and scale modern data platforms?
Benefits: £80,000-£95,000 depending on experience, fully remote, excellent benefits package.
You'll be joining a fast-growing iGaming and online casino company operating a custom-built platform that supports millions of player interactions. The business is a recognised leader across sports betting and online casino, with a strong focus on performance, reliability, and data-driven decision-making. As a Senior Data Engineer, you'll be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that underpin analytics, reporting, and product insight across the organisation.
Core Responsibilities
Design, build and maintain robust data pipelines to support analytics, product and reporting needs
Develop and optimise ETL/ELT processes for large volumes of player, game and transaction data
Work closely with data analysts and stakeholders to ensure data is reliable, accessible and well-structured
Improve data quality, monitoring and observability across the platform
Support real-time and batch data processing use cases
Collaborate with engineering teams to integrate data solutions with the wider platform
Ensure data architecture aligns with security, compliance and regulatory requirements
Contribute to data platform strategy, tooling decisions and best practice
Required Experience & Expertise
Proven experience as a Data Engineer, ideally within iGaming, gambling or another regulated environment
Strong experience with SQL and modern data warehousing solutions
Experience building pipelines using tools such as Airflow, dbt or similar
Solid understanding of cloud platforms, ideally AWS
Experience working with event-driven or streaming data architectures is a plus
Strong grasp of data modelling, performance optimisation and scalability
Comfortable collaborating with analytics, product and engineering teams
Eligo Recruitment is acting as an Employment Business in relation to this vacancy. Eligo is proud to be an equal opportunity employer dedicated to fostering diversity and creating an inclusive and equitable environment for employees and applicants. We actively celebrate and embrace differences, including but not limited to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran status, and disability. We encourage applications from individuals of all backgrounds and experiences, and all will be considered for employment without discrimination. At Eligo Recruitment, diversity, equity and inclusion are integral to achieving our mission to ensure every workplace reflects the richness of human diversity.
Azure Data Engineer - £500 - Hybrid
Tenth Revolution Group
Newcastle upon Tyne
Hybrid
Mid - Senior
£450/day - £550/day
fabric
terraform
github
git
kafka
+7
Azure Data Engineer - £500PD - Hybrid
We are seeking an Azure Data Engineer with strong experience in Databricks to design, build, and optimise scalable data pipelines and analytics solutions on the Azure cloud platform. The ideal candidate will have hands-on expertise across Azure data services, data modelling, ETL/ELT development, and collaborative engineering practices.
Key Responsibilities
Design, develop, and maintain scalable data pipelines using Azure Databricks (Python, PySpark, SQL).
Build and optimise ETL/ELT workflows that ingest data from various on-prem and cloud-based sources.
Work with Azure services including Azure Data Lake Storage, Azure Data Factory, Azure Synapse Analytics, Azure SQL, and Event Hub.
Implement data quality validation, monitoring, metadata management, and governance processes.
Collaborate closely with data architects, analysts, and business stakeholders to understand data requirements.
Optimise Databricks clusters, jobs, and runtimes for performance and cost efficiency.
Develop CI/CD workflows for data pipelines using tools such as Azure DevOps or GitHub Actions.
Ensure security best practices for data access, data masking, and role-based access control.
Produce technical documentation and contribute to data engineering standards and best practices.
Required Skills and Experience
Proven experience as a Data Engineer working with Azure cloud services.
Strong proficiency in Databricks, including PySpark, Spark SQL, notebooks, Delta Lake, and job orchestration.
Strong SQL and data modelling skills (e.g., dimensional modelling, Data Vault).
Experience with Azure Data Factory or other orchestration tools.
Understanding of data lakehouse architecture and distributed computing principles.
Experience with CI/CD pipelines and version control (Git).
Knowledge of REST APIs, JSON, and event-driven data processing.
Solid understanding of data governance, data lineage, and security controls.
Ability to solve complex technical problems and communicate solutions clearly.
Preferred Qualifications
Industry certifications (e.g., Databricks Data Engineer Associate/Professional, Azure Data Engineer Associate).
Experience with Azure Synapse SQL or serverless SQL pools.
Familiarity with streaming technologies (e.g., Spark Structured Streaming, Kafka, Event Hub).
Experience with infrastructure-as-code (Terraform or Bicep).
Background in BI or analytics engineering (Power BI, dbt) is a plus.
To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

Frequently asked questions

What types of DBT jobs are available on this platform?
Our platform features a variety of DBT-related roles, including DBT developers, data engineers specializing in DBT, analytics engineers, and roles focused on data transformation using the DBT framework.
Do I need prior experience with DBT to apply for these jobs?
While some positions require hands-on experience with DBT, other entry-level roles or internships may provide opportunities to learn on the job. Job descriptions will specify the experience level required.
Can I find remote DBT job opportunities here?
Yes, we list both onsite and remote DBT jobs. You can filter your job search to focus specifically on remote opportunities based on your preferences.
What skills are employers looking for in DBT professionals?
Employers typically look for skills in SQL, data modeling, version control (git), knowledge of data warehouses like Snowflake or BigQuery, as well as experience with DBT macros and testing frameworks.
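As this answer suggests, a little familiarity with dbt macros and tests goes a long way. Here is a minimal, hypothetical sketch (the macro name, column, and model are invented purely for illustration):

```sql
-- macros/cents_to_pounds.sql (hypothetical): a reusable dbt macro that
-- wraps a small piece of SQL so every model applies it consistently.
{% macro cents_to_pounds(column_name, precision=2) %}
    round({{ column_name }} / 100.0, {{ precision }})
{% endmacro %}

-- Used inside a model:
--   select {{ cents_to_pounds('amount_cents') }} as amount_pounds
--   from {{ ref('stg_payments') }}

-- A schema test would then be declared in the model's YAML file, e.g. a
-- not_null test on amount_cents, and executed with `dbt test`.
```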
How can I make my application stand out for DBT positions?
Highlight your experience with data transformation projects, showcase any public DBT projects or repositories, and emphasize your understanding of data engineering best practices and collaboration using tools like git.