
Data Engineer Jobs in London

Overview

Looking for top Data Engineer jobs in London? Explore the latest opportunities on Haystack, your go-to IT job board connecting skilled data engineers with leading companies in the heart of the UK’s tech hub. Whether you're an experienced data engineer or just starting your career, find roles that match your skills and ambitions in London’s vibrant tech scene. Start your job search today and take the next step in your data engineering career!
Filters applied
London
Data Engineer
Search
Salary
Location
Remote preference
Role type
Seniority
Tech stack
Sectors
Contract type
Company size
Visa sponsorship
Principal Data Engineer
CV Technical
Multiple locations
Hybrid
Senior
£55,000
RECENTLY POSTED
r
python
sql
microsoft-azure
CV Technical is proud to be partnered with a leading UK data consultancy looking to grow its data engineering capability with a Senior Data Engineer on a permanent basis. You’ll have the opportunity to work across a variety of client engagements, create real business impact, and contribute to the growth of a fast-moving, close-knit consultancy.

The Role:
Manchester based with a hybrid working arrangement. You will work across a diverse portfolio of client assignments, delivering technical solutions that address a wide spectrum of data-related challenges and enabling organisations to make informed, data-driven decisions. In this position, you’ll design and implement advanced data systems and pipelines that meet clients’ needs with scalability and reliability in mind. You will also collaborate closely with both technical and business stakeholders, ensuring you fully understand their objectives and can translate their requirements into effective solutions.

Essential Skills & Experience:
UK resident - the position requires full Security Clearance
Bachelor’s or Master’s degree in a STEM subject
Programming experience in Python, R, SQL and SAS
Experience with DevOps processes
Experience working in regulated industries
Experience on the Microsoft Azure stack
Excellent stakeholder and communication skills
4+ years’ experience in data engineering
Experience mentoring or supporting more junior engineers
Experience migrating between platforms
Experience with SAS tools and SAS administration
Agile development and deployment experience

As this role is for a consultancy, travel across the UK to client sites may be required on occasion. Due to the nature of the projects, only candidates living in the UK who have full working rights (no sponsorship offered) and who can obtain full security clearance will be considered.

Benefits:
£40,000 - £60,000 base salary (dependent on experience)
Company + performance bonus
Excellent company pension contribution
Private healthcare
Learning & development budget
UK remote - occasional office travel

Interviews are happening immediately; please apply directly to be considered.
GCP Data Engineer
Lime Street Recruitment Limited
City of London
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED
processing-js
tableau
In this role you will design and develop complex data processing modules and reporting using BigQuery and Tableau. You will also work closely with the Infrastructure/Platform Team, who are responsible for architecting and operating the core of the client’s Data Analytics platform.

You will:
Work with business teams, data scientists and engineers to design, build, optimise and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery
Work with finance, actuaries, data scientists and engineers to understand how the client can make best use of new internal and external data sources
Work with the client’s delivery partners to ensure robustness of the design and engineering of the data model, MI and reporting, supporting their ambitions for growth and scale
Take BAU ownership of data models, reporting and integrations/pipelines
Create frameworks, infrastructure and systems to manage and govern data assets
Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schemas, reporting etc.
Work with the broader engineering community to develop the client’s data and MLOps capability and infrastructure
Ensure data quality, governance, and compliance with internal and external standards
Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy
Business Information Analyst
London and Quadrant Housing Trust
London
Hybrid
Mid - Senior
£41,168 - £53,000
RECENTLY POSTED
mysql
dax
Title: Business Information Analyst
Contract Type: Permanent, full time, 35 hours per week
Salary: £47,135 to £53,000 per annum (London) or £41,168 to £47,033 per annum (Regional), dependent on experience
Grade: 9
Reporting Office: London (Stratford) or Manchester (Trafford)
Persona: Agile Worker - 20% to 40% of contractual hours to be worked from the reporting office (hybrid working)
Closing Date: 2nd February at 5pm
Interview Dates: 10th February 2026 at our office in Stratford

Early applications are encouraged as we reserve the right to close the advertisement and interview earlier than stated. Please click here for the role profile.

Benefits include: excellent pension plan (up to 6% double contribution), 28 days’ annual leave rising to 31 days with length of service plus bank holidays, Westfield Health Cash Plan, non-contributory life assurance, up to 21 hours’ paid volunteering days, lifestyle benefits, Employee Assistance Programme and many more.

Join our Service Insights and Improvement Team at L&Q:
Our department has expanded, and we are increasing and improving our reporting functionality as we incorporate new services, systems and data sources, moving to one version of the truth. As a key analyst of Maintenance Services’ data analytics, this role ensures that KPI reporting is developed and refined. You will also provide operational reporting which constantly reflects and anticipates business needs.

The successful candidate must have experience designing and delivering end-to-end BI solutions using Microsoft Power BI, including data modelling, DAX and Power Query. You can read and write T-SQL and MySQL syntax at an advanced level and perform select and join queries to return data, with the ability to optimise performance for large datasets. Experience configuring and supporting reporting development using visualisation tools, including Power BI paginated reports, is a must. We are looking for someone who works well under pressure, who can fulfil ad-hoc data queries and data requests from around the business, and who will support our drive to enable transparency of performance and support new initiatives with data.

The role reports to the Service Insights and Improvement Manager in Maintenance Services. Our core function is to deliver over 220,000 repairs a year to our residents using a mix of in-house teams and contractors, striving to improve first-time fix and wait times.

Your impact in the role:
The Business Information Analyst ensures that Maintenance Services’ operational reporting supports data-driven decision-making through the creation, development and provision of performance reports, dashboards and insights. You will be:
• Developing robust dashboards, scorecards and reports
• Driving the comprehensive delivery of reporting requests and business problems, from initial conception to the deployment of new dashboards/reports, utilising project management methodologies to meticulously track and report progress
• Maintaining a forward-looking approach to data analysis, identifying areas for investigation and improvement before they become critical

The Business Information Analyst will work with colleagues in the Maintenance Services Service Improvement Team and other teams across the department to provide information to make the case for business change and to monitor the embedding of changes made through the delivery of improvement projects. You will help the department achieve its goals by providing strategic data support, anticipating future challenges, developing advanced reporting and analytics for the Heads of Service and Directors, and ensuring data accuracy and improvement.

What you’ll bring:
To be shortlisted, candidates will need to demonstrate the following -
• Read and write T-SQL syntax at an advanced level (essential)
• Experience with Power BI, including writing queries and measures in ‘M’ and ‘DAX’ (essential)
• Experience configuring and supporting reporting using visualisation tools - Power BI/paginated reports or similar (essential)
• Strong IT skills and advanced knowledge of MS Office 365 and Microsoft Excel (essential)
• Desirable: Automation, PowerApps and SharePoint knowledge, and wide experience of sourcing data into BI (SharePoint, Exchange, API etc.)
• Commitment to providing high levels of customer service
• Experience of producing regular KPI reports
• An understanding of business improvement methods
• First-class communication skills, with an ability to present data and analysis to stakeholders
• Capable of embracing constant change

Agile working can be based either from our office in London or from Manchester. However, candidates based in the North West will be required to attend the London office for monthly meetings in person. If you require any reasonable adjustments at any stage during this process, including application stage, please email.

About L&Q:
We’re one of the UK’s leading housing associations and developers. We were founded on a simple belief: high-quality housing is vital for people’s health, happiness and security. Everyone deserves a quality home that gives them the chance to live a better life. 250,000 people call our properties ‘home’, and we’re proud to serve diverse communities across London, the South East and North West of England.

People are at the heart of our business, and our success depends on employing the best people and getting the best from them. The foundation of everything that we are is built on our corporate values and behavioural framework, which outlines our core expectations and should be demonstrated at all times, and at all levels, when representing L&Q.

At L&Q, we know that diversity and inclusion make us stronger - and they’re at the heart of everything we do. When we recruit, we look at what really matters: your skills, experience, and potential. We’re proud to be recognised for creating an inclusive workplace. We’re a Disability Confident Leader (Level 3) and we’ve introduced our own Recruitment Advocate scheme to make sure every step of our hiring process is fair, transparent, and consistent. It’s all part of our commitment to ending discrimination and making L&Q a place where everyone feels welcome. Find out more here.

Sustainability is also at the heart of what we do. We recognise the responsibility we hold as one of the UK’s largest housing associations. Click here to find out more about L&Q and why you should join us!
Lead Data Engineer
Experis
London
Hybrid
Senior
Private salary
RECENTLY POSTED
aws
python
sql
snowflake
Job Title: Lead Data Engineer
Location: London (Hybrid)
Contract: 6 months (potential extension)
Start Date: ASAP

About the Client
Our client is transforming their industry by replacing cigarettes with innovative, smoke-free alternatives. They are leveraging technology, data, and AI to drive a global shift toward a smoke-free world. This is a fast-paced, high-impact environment, perfect for candidates who are strategic, independent, and excited to work at the forefront of data and AI innovation.

The Role
We are looking for a skilled Data Engineer to design, build, and optimize enterprise-scale data pipelines and cloud platforms. You will translate business and AI/ML requirements into robust, scalable solutions while collaborating across multi-disciplinary teams and external vendors. As a key member of the data architecture team, you will:
Build and orchestrate data pipelines across Snowflake and AWS environments.
Apply data modeling, warehousing, and architecture principles (Kimball/Inmon).
Develop pipeline programming using Python, Spark, and SQL; integrate APIs for seamless workflows.
Support Machine Learning and AI initiatives, including NLP, Computer Vision, Time Series, and LLMs.
Implement MLOps, CI/CD pipelines, data testing, and quality frameworks.
Act as an AI super-user, applying prompt engineering and creating AI artifacts.
Work independently while providing clear justification for technical decisions.
Key Skills & Experience
Strong experience in data pipeline development and orchestration.
Proficient with cloud platforms (Snowflake, AWS fundamentals).
Solid understanding of data architecture, warehousing, and modeling.
Programming expertise: Python, Spark, SQL, API integration.
Knowledge of ML/AI frameworks, MLOps, and advanced analytics concepts.
Experience with CI/CD, data testing frameworks, and versioning strategies.
Ability to work effectively in multi-team, vendor-integrated environments.
Why This Role
Join a global, transformative initiative shaping a smoke-free future.
Work with cutting-edge cloud, AI, and data technologies.
Opportunity to influence technical and strategic decisions across enterprise data delivery.
Dynamic, innovative environment where your work has real business impact.
Postgres Data Architect
Stackstudio Digital Ltd.
London
Hybrid
Mid - Senior
£500/day - £550/day
RECENTLY POSTED
aws
terraform
kafka
postgresql
gitlab
db2
Job Details
Role / Job Title: Postgres Data Architect with CDC skills
Work Location: 250 Bishopsgate, London, UK
Office Presence (Hybrid): 2 days per week

The Role
We are looking for a PostgreSQL Data Architect with strong hands-on experience in Change Data Capture (CDC). The candidate will design and implement robust data migration strategies, ensuring seamless integration between legacy systems and modern cloud-based architectures.

Responsibilities
Architect CDC Pipelines: Design and optimize Change Data Capture workflows (IBM CDC or equivalent), including subscription design, bookmarks, resync, and replay strategies
Cloud Migration & Hosting: Lead PostgreSQL migration from on-premises/mainframe to cloud platforms (AWS Aurora preferred), ensuring performance, security, and scalability
Integration & ETL Pipelines: Build robust pipelines flowing from CDC through Kafka/S3 into Aurora with UPSERT/MERGE patterns; guarantee idempotency, ordering, and reliable delivery
Data Encoding & Validation: Manage EBCDIC-to-UTF-8 conversions and packed decimal/binary numeric fields, and validate transformations with automated test suites
Cutover & Governance: Execute dual-run validations, reconciliation (counts, checksums), rollback strategies, and ensure compliance with masking, encryption, and IAM policies
Performance & Observability: Monitor lag, throughput, and error rates; develop dashboards (CloudWatch/Grafana) and operational runbooks for proactive alerting
Automation & Tooling: Utilize schema conversion tools, IaC (Terraform), CI/CD pipelines (GitLab), and AWS services (Glue, Athena, Redshift) for downstream analytics
Data Modelling & Conversion (Good to have): Transform Db2 schemas to Aurora PostgreSQL; design logical/physical models, enforce referential integrity, and apply best practices for normalization/denormalization
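The idempotent UPSERT/MERGE load pattern named in the responsibilities above can be sketched briefly. This is a minimal illustration, not the client's actual pipeline: SQLite stands in for Aurora PostgreSQL (both support INSERT ... ON CONFLICT ... DO UPDATE), and the table and change record are hypothetical.

```python
# Minimal sketch of an idempotent CDC apply step: replaying the same
# change record leaves the target table in the same state.
# SQLite stands in for Aurora PostgreSQL; table/columns are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

def apply_change(conn, record):
    """Apply one CDC change record via UPSERT (safe to replay)."""
    conn.execute(
        """INSERT INTO accounts (id, balance) VALUES (:id, :balance)
           ON CONFLICT(id) DO UPDATE SET balance = excluded.balance""",
        record,
    )

change = {"id": 1, "balance": 250.0}
apply_change(conn, change)
apply_change(conn, change)  # replay after a failure: same end state

print(conn.execute("SELECT id, balance FROM accounts").fetchall())  # -> [(1, 250.0)]
```

Idempotency is what makes replay and resync (the "bookmarks, resync, and replay strategies" above) safe: a consumer can reprocess from an earlier offset without double-applying rows.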
Your ProfileEssential Skills / Knowledge / Experience
Strong hands-on experience with PostgreSQL (Aurora preferred) and advanced data modelling
Expertise in CDC tools (IBM CDC or similar) and data migration strategies
Proven experience in PostgreSQL cloud hosting and migration
Proficiency in ETL/ELT pipelines, Kafka, and AWS ecosystem
Solid understanding of data encoding, transformation, and validation techniques
Familiarity with IaC, CI/CD, and observability frameworks
Excellent problem-solving and communication skills
Desirable Skills / Knowledge / Experience
Mainframe Modernization
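The EBCDIC and packed-decimal conversion work this role calls for can be illustrated with only the Python standard library; the field contents below are made-up examples, not data from the posting. EBCDIC text decodes with the built-in cp037 codec, and COMP-3 packed decimal stores two BCD digits per byte with the final nibble holding the sign.

```python
def unpack_comp3(raw: bytes) -> int:
    """Decode an IBM packed-decimal (COMP-3) field: two BCD digits
    per byte, with the final nibble holding the sign (0xD = negative)."""
    digits = []
    for byte in raw:
        digits.append(byte >> 4)
        digits.append(byte & 0x0F)
    sign_nibble = digits.pop()  # last nibble is the sign, not a digit
    value = int("".join(str(d) for d in digits))
    return -value if sign_nibble == 0x0D else value

# EBCDIC text converts with the stdlib 'cp037' codec.
ebcdic_name = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])  # "HELLO" in EBCDIC
print(ebcdic_name.decode("cp037"))              # -> HELLO
print(unpack_comp3(bytes([0x12, 0x34, 0x5C])))  # -> 12345 (0xC sign = positive)
print(unpack_comp3(bytes([0x00, 0x12, 0x3D])))  # -> -123 (0xD sign = negative)
```

The automated test suites mentioned under Data Encoding & Validation would assert exactly this kind of round-trip on known fixture records before trusting a bulk conversion.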
Machine Learning Engineer
Anson McCade
London
Hybrid
Mid - Senior
£65,000
RECENTLY POSTED
aws
tensorflow
python
docker
pytorch
£45,000 - £65,000 + £7,000 DV bonus | Hybrid working | Permanent
Title: Machine Learning Engineer
Area: National Security Projects
Location: Central London (Hybrid) - 3 days per week
Security: eligibility for Developed Vetting clearance with the UK Government
Salary: up to £65k + £7k annual DV bonus (once obtained)

About the Role:
We’re seeking Machine Learning Engineers to design and deploy ML models for national security applications. You’ll work on GenAI, LLMOps, and traditional ML solutions using AWS infrastructure and unique datasets, collaborating with data scientists and software engineers to build greenfield solutions.

What You’ll Do:
Build and optimise Machine Learning pipelines
Develop LLM-powered solutions and apply responsible AI practices
Transition experiments into production-ready solutions
Implement experiment tracking and monitoring
Collaborate with data scientists and engineers
What We’re Looking For:
Proficiency in Python and ML frameworks (scikit-learn, PyTorch, TensorFlow)
Hands-on experience with AWS ML services (SageMaker, Lambda)
Familiarity with containerisation (Docker) and orchestration (Kubernetes/ECS)
Knowledge of LLMOps and GenAI tools (LangChain, LangSmith)
Understanding of feature engineering and vector databases
Strong grasp of CI/CD practices for ML deployment
Why Join Us:
Work with niche datasets and cutting-edge tech
Hybrid working and flexible benefits
£7k tax-free DV bonus once clearance completes
Apply today and make a real-world impact. Reference: AMC/JWH/MLEL1
Senior Data Scientist
Datatech
London
Hybrid
Senior
£70,000 - £75,000
RECENTLY POSTED
aws
tensorflow
python
pytorch
pandas
scipy
Senior Data Scientist - Customer Data
Salary: £70,000 - £85,000 (DoE)
Location: Hybrid - 2/3 days per week in a Central London office
Job Reference: J13015
Full UK working rights required - no sponsorship available
Immediate requirement - strong leadership and senior stakeholder skills

We are seeking an experienced, passionate, and highly motivated Senior Data Scientist to play a pivotal role in unlocking the value of customer data and shaping how it is used across the business. This is a senior, highly autonomous position, acting as the number two to the Director of Customer Data, where you will operate at a strategic level while remaining hands-on. This is an excellent opportunity for a senior-level data scientist who wants real ownership, influence, and visibility, and to be part of a business at a transformative point in its data maturity.

The company has recently implemented a new Customer Data Platform (CDP) and is at a genuinely exciting stage of its data journey. You will be instrumental in helping define best practice, drive advanced analytics use cases, and influence how customer data is activated across products, CRM, and marketing. While experience with personalisation and recommender systems would be highly desirable, it is not essential. The role is broader in scope and suited to someone who enjoys owning complex customer data problems end-to-end and shaping the direction of advanced data science initiatives.

The Role
• Act as a senior technical and strategic lead within the Customer Data team, working closely with (and deputising for) the Director.
• Take full ownership of your role, with the autonomy to shape priorities, define approaches, and mould the position to maximise impact.
• Lead the development of advanced machine learning solutions across customer data use cases, including (but not limited to) personalisation, segmentation, propensity modelling, and customer insight.
• Contribute to the evolution and activation of the newly implemented CDP, helping the organisation realise its full value.
• Own the full machine learning lifecycle - from problem definition and model design through to deployment, monitoring, and optimisation.
• Collaborate closely with CRM, marketing, product, engineering, and regional teams to ensure solutions are aligned to business goals.
• Partner with data engineering and platform teams to ensure scalable, robust, and production-ready solutions.
• Act as a senior stakeholder, able to clearly communicate complex concepts and influence decision-making at all levels.

Skills & Experience
• Strong, hands-on experience in machine learning and applied data science within customer or commercial domains.
• Experience with recommender systems, personalisation, or deep learning is desirable but not essential.
• Solid Python skills and experience with ML libraries such as pandas, numpy, scipy, scikit-learn, TensorFlow or PyTorch.
• Experience working across cloud environments (GCP, AWS, or Azure) and analytics platforms such as Dataiku.
• Good understanding of MLOps practices, including deployment, monitoring, and retraining pipelines.
• Proven ability to work cross-functionally with marketing, CRM, product, and engineering teams.
• Excellent communication, leadership, and stakeholder management skills.
• Experience operating in a global or multi-regional environment is a plus.

If you would like to hear more, please do get in touch. Alternatively, you can refer a friend or colleague by taking part in our fantastic referral schemes. For each relevant candidate you introduce to us (there is no limit) and we place, you will be entitled to our general gift/voucher scheme. Datatech is one of the UK’s leading recruitment agencies in analytics and host of the critically acclaimed Women in Data event. For more information, visit (url removed)
Senior Data Engineer - (ML and AI Platform)
Datatech
London
Hybrid
Senior
£65,000 - £80,000
RECENTLY POSTED
aws
python
sql
pyspark
snowflake
Senior Data Engineer (ML and AI Platform)
Location: London with hybrid working, Monday to Wednesday in the office
Salary: £65,000 to £80,000 depending on experience
Reference: J13026

We are partnering with an AI-first SaaS business that turns complex first-party data into trusted, decision-ready insight at scale. You will join a collaborative data and engineering team building a modern, cloud-agnostic data and AI platform. This role is well suited to an experienced data engineer who enjoys working thoughtfully with real-world data, contributing to reliable production systems, and developing clear and well-structured Python and SQL.

Why join:
Supportive and inclusive culture where people are encouraged to contribute and be heard
Clear progression with space to develop your skills at a sustainable pace
An environment where collaboration, learning, and thoughtful engineering are genuinely valued

What you will be doing:
Contributing to the design and delivery of cloud-based data and machine learning pipelines
Working with Python, PySpark and SQL to build clear and maintainable data transformations
Helping shape scalable data models that support analytics, machine learning, and product features
Collaborating closely with Product, Engineering, and Data Science teams to deliver meaningful production outcomes

What we are looking for:
Experience using Python for data transformation, ideally alongside PySpark
Confidence working with SQL and production data models
Experience with at least one modern cloud data platform such as GCP, AWS, Azure, Snowflake, or Databricks
Experience contributing to data pipelines that run reliably in production environments
A collaborative mindset with clear and thoughtful communication

Right to work in the UK is required. Sponsorship is not available now or in the future. Apply to learn more and see if this could be the next step for you.

If you have a friend or colleague who may be interested, referrals are welcome. For each successful placement, you will be eligible for our general gift or voucher scheme. Datatech is one of the UK’s leading recruitment agencies specialising in analytics and is the host of the critically acclaimed Women in Data event. For more information, visit (url removed)
Data Engineer
B3Living
Hertford
Hybrid
Mid - Senior
£54,835 - £60,927
RECENTLY POSTED
fabric
python
sql
Based in Cheshunt, Hertfordshire
Permanent, full-time, 37 hours per week
Salary: £54,835 - £60,927 per annum

Reliable data is at the heart of good decision-making. We’re looking for an experienced Data Engineer to join our newly established data team and work with them to deliver reliable, high-quality data that supports informed decision-making and enables us to deliver better outcomes for our customers.

In this role, you’ll design and maintain scalable data pipelines and robust data models, helping to ensure our data is accurate, accessible, and secure. You’ll also improve troubleshooting by introducing error handling and logging, and optimise efficiency by monitoring data performance and applying fine-tuning techniques. Writing complex queries (SQL, Python and Spark), documenting data structures and working with colleagues to respond to business needs while ensuring alignment with governance standards and GDPR are also key in this role. We’re looking for someone with…
Proven experience in data engineering or data platform development.
Experience with testing frameworks and writing test plans for data pipelines.
Strong analytical and problem-solving skills.
Strong SQL skills and experience with query optimisation.
Knowledge of Microsoft Fabric (Lakehouse, Dataflows, ADF).
An understanding of data modelling concepts (e.g. dimensional, star schema, denormalisation)
Knowledge of performance tuning techniques for ETL and SQL processes
Familiarity with data governance principles and GDPR

Due to the type of data you’ll have access to in this role, you’ll be required to undertake a basic criminal record (DBS) check.

We’re a social business, based in Cheshunt and across southeast Hertfordshire, helping local people by renting or selling affordable homes. We offer services designed to help our customers live comfortably in their homes, and we work to keep our buildings and estates maintained, offering support when money becomes an issue or when people get older. Our mission is to make a sustainable, positive change to the housing crisis for our customers and communities. We enjoy a benefits package that offers something for everyone, including…
27 days’ holiday plus bank holidays (pro rata for part-time colleagues).
Buy and sell holiday scheme.
Cross-organisational bonus scheme.
Up to 12% pension contribution.
Life assurance (three times salary).
Funded health cash plan or subsidised private medical insurance.
Range of special and family leave.
Car loans, cycle to work and electric car lease scheme.
Discount vouchers and more.

The closing date for this vacancy is 2nd February 2026. We are a Disability Confident employer, which means that we offer an interview to a fair and proportionate number of disabled applicants who meet the minimum selection criteria for the job. Other organisations may call this role ETL Engineer, Data Pipeline Engineer, Analytics Engineer, Data Platform Engineer, or BI Developer.

We’re committed to building an inclusive workplace where equity, diversity and inclusion are part of our culture, as we recognise the benefits of a diverse workforce. Our 3-year EDI strategy outlines how we’ll achieve this. We strongly welcome applications from underrepresented groups and groups which are identified as a priority within our strategy, including LGBTQIA+, Black, Asian and Minority Ethnic communities, applicants with disabilities and people under 30. We understand that some candidates, particularly from certain groups, may hesitate to apply unless they meet every requirement. While we’re looking for people with the right skills and experience, we also value diverse backgrounds and transferable skills. If you meet most of the criteria and believe you’d thrive in the role, we encourage you to apply.

All our vacancies are open to flexible working arrangements, something we are really proud of. The extent to which flexible working is possible will vary between jobs according to the needs of the business and our customers. So, if you’re ready to take your next step as a Data Engineer, please apply via the button shown. This vacancy is being advertised by Webrecruit. The services advertised by Webrecruit are those of an Employment Agency
Lead Data Engineer
Tenth Revolution Group
City of London
Hybrid
Senior
£75,000 - £85,000
RECENTLY POSTED
sql
t-sql
dax
Lead Data Engineer - Hybrid - London - Azure - Databricks - £85k + Bonus

I’m working with a global powerhouse that’s been setting the standard for excellence for over 60 years. With more than 1,000 projects delivered worldwide and a combined value exceeding $150 billion, they’ve earned a reputation as a trusted leader in high-value, complex projects. Today, their 2,500-strong team spans three continents, driving innovation and growth at scale.

What truly makes this company stand out is its people-first culture. They champion respect, inclusion, and genuine care for their employees, backed by a flexible hybrid model that gives you control over which three office days you work each week. This is an organisation where world-class projects meet an environment that prioritises your well-being and career development.

I’m looking for a Lead Data Engineer who thrives on innovation and loves tackling complex data challenges. If building scalable, cloud-based solutions excites you, this is your chance to make a real impact. You’ll work with cutting-edge technology and stay at the forefront of the data engineering field.
You’ll Work With:
Lead the architecture, design, and delivery of Azure Data Services solutions (Data Factory, Data Lake, Azure SQL)
Provide technical leadership in Agile delivery teams, mentoring engineers and influencing architectural decisions
Design and implement scalable Azure-based data solutions
Own the design and implementation of scalable, secure, and high-performance Azure-based data platforms
Lead the development strategy for advanced Power BI dashboards and analytics used by global stakeholders
Set best practices for data quality, governance, security, and accessibility

Benefits:
Competitive salary up to £85k + 10% discretionary bonus
8% non-contributory pension, private medical insurance, virtual GP access
25 days annual leave (option to buy more), volunteering day, extra leave with tenure
A high-performance, high-trust environment with global exposure and flexibility

Key experience:
Hands-on experience with Azure & Databricks
Strong data engineering and modelling skills
Proficiency in Power BI, T-SQL, DAX

Interviews are happening now - don’t wait to take the next step in your career. Apply today and secure your opportunity to join a leading team.
Snowflake Data Engineer
Tenth Revolution Group
London
Hybrid
Senior
£85,000 - £100,000
RECENTLY POSTED
snowflake
processing-js
aws
git
kafka
python
Senior Snowflake Data Engineer - Hybrid - £85k-£100k

About the Role
I am looking for an experienced Senior Snowflake Data Engineer to join a dynamic team working on cutting-edge data solutions. This is an exciting opportunity to design, build, and optimise high-performance data pipelines using Snowflake, dbt, and modern engineering practices. If you are passionate about data engineering, test-driven development, and cloud technologies, we’d love to hear from you.

Key Responsibilities
Design, develop, and optimise scalable data pipelines in Snowflake.
Build and maintain dbt models with robust testing and documentation.
Apply test-driven development principles for data quality and schema validation.
Optimise pipelines to reduce processing time and compute costs.
Develop modular, reusable transformations using SQL and Python.
Implement CI/CD pipelines and manage deployments via Git.
Automate workflows using orchestration tools such as Airflow or dbt Cloud.
Configure and optimise Snowflake warehouses for performance and cost efficiency.

Required Skills & Experience
7+ years in data engineering roles.
3+ years hands-on experience with Snowflake.
2+ years production experience with dbt (mandatory).
Advanced SQL and strong Python programming skills.
Experience with Git, CI/CD, and DevOps practices.
Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure).
Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation.

Preferred Qualifications
Snowflake certifications (SnowPro Core or Advanced).
Experience with dbt Cloud and custom macros.
Exposure to real-time streaming (Kafka, Kinesis).
Familiarity with data observability tools and BI integrations (Tableau, Power BI).

On offer
Opportunity to work with modern data technologies and large-scale architectures.
Professional development and certification support.
Collaborative, engineering-focused culture.
Competitive salary and benefits package.

Interested? Apply now with your CV highlighting your Snowflake, dbt and DevOps experience.
KDB Developer
James Joseph Associates Limited
London
In office
Mid - Senior
£100,000
RECENTLY POSTED
linux
processing-js
aws
Our client is a leading, well-established digital asset quantitative trading firm. The business continues to go from strength to strength and is currently in a high-growth phase, delivering record profits. Due to this expansion, an opportunity has arisen for an accomplished KDB Developer to join the team. In this role, you will work closely with front-office quants and traders and will be responsible for designing and developing latency-sensitive data solutions.
THE ROLE:
Build and improve analytics capabilities that give the front office and management clearer visibility across both new and existing areas of the business.
Develop scalable data pipelines to capture, clean, and store high-volume datasets from internal trading platforms and external market data feeds.
Create and refine efficient data-access frameworks to support live dashboards, execution performance analysis, and market microstructure research.
Design and implement complex event processing (CEP) workflows to detect data issues in real time and generate actionable signals.
Diagnose problems and continuously strengthen the platform’s reliability, performance, and cost efficiency.
Contribute across the full software development lifecycle, from initial research and prototyping through delivery, testing, deployment, and ongoing support.
SKILL / EXPERIENCE REQUIRED:
Several years of commercial experience in data engineering or analytics development in a similar high-performance environment.
Strong KDB+/q development skills (core focus).
Solid Linux engineering experience, including awareness of low-level networking and a kernel-level understanding.
Knowledge of large-scale storage systems, data distribution, and resilience/failover strategies.
Familiarity configuring and operating AWS services and Kubernetes-based infrastructure.
Experience working with Level 1, Level 2, and Level 3 market data feeds.
Background in big data performance tuning, analytical workflows, and data visualisation.
Viva Consultant - £500PD Inside IR35 - Hybrid
Tenth Revolution Group
London
Hybrid
Mid - Senior
£400/day - £500/day
RECENTLY POSTED
fabric
dax
Viva Consultant - £500PD Inside IR35 - Hybrid
We’re looking for a data professional who can define, design, and deliver a comprehensive analytics strategy for Microsoft Viva. This role goes beyond building reports from a predefined brief - you’ll take ownership of Viva analytics end-to-end, partnering with stakeholders to shape the right questions, define best-practice metrics, and translate data into meaningful insight and action.
You’ll act as a trusted advisor on how Viva data should be used to measure adoption, engagement, and business impact, while also rolling up your sleeves to implement the technical solution.
Key Responsibilities
Strategy & Stakeholder Leadership
Define and own the analytics strategy for Microsoft Viva, aligning usage data to business outcomes and employee experience goals.
Partner with senior stakeholders across HR, IT, Communications, and the business to understand needs, influence priorities, and gain buy-in.
Advise on best-practice metrics and measurement approaches for engagement, learning, productivity, and goal alignment.
Translate ambiguous or evolving requirements into clear analytical frameworks and delivery plans.
Take full ownership of Viva analytics solutions from concept through to delivery and ongoing optimisation.
Solution Design & Delivery
Design and develop usage analytics solutions across Microsoft Viva modules: Connections, Engage, Insights, Learning, Goals, and Topics.
Define KPIs and success measures for adoption, engagement, and impact across Viva capabilities.
Build scalable and automated data pipelines integrating:
Microsoft Graph
Viva Insights
Microsoft 365 usage reports
Design and deliver interactive Power BI dashboards for executive, functional, and operational audiences.
Ensure analytics solutions are intuitive, actionable, and clearly tell a story - not just report numbers.
Data Engineering & Governance
Automate data extraction, transformation, and refresh using tools such as Power Automate, Azure Data Factory, or similar.
Ensure data quality, accuracy, security, and compliance with organisational policies and privacy standards.
Document data models, definitions, and analytics logic to support transparency and reuse.
Continuously improve performance, reliability, and scalability of analytics solutions.
Skills & Experience
Strong experience designing and delivering analytics solutions end-to-end, including both strategy and implementation.
Hands-on expertise with Power BI (data modelling, DAX, visual design, performance optimisation).
Experience working with Microsoft 365 data, ideally including Microsoft Graph and Viva Insights.
Ability to define KPIs and analytics frameworks in ambiguous or evolving environments.
Proven ability to engage stakeholders, influence decisions, and drive alignment.
Strong analytical thinking with the ability to translate data into insight and recommendations.
To apply for this role, please submit your CV or contact Dillon Blackburn (see below).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, the Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Senior Data Engineer
Tenth Revolution Group
Multiple locations
Fully remote
Senior
£60,000 - £65,000
RECENTLY POSTED
fabric
python
sql
pyspark
About the Role
We are looking for a Senior Data Engineer to join a leading Microsoft partner that is modernising data platforms and delivering innovative analytics solutions for organisations across the UK. You will work closely with clients to understand their business challenges before designing tailored solutions that improve efficiency, drive self‑service reporting, and support long‑term scalability. This is a hands‑on role where you will support clients across a variety of sectors. You will also be able to supplement this hands-on experience with the opportunity to gain Microsoft‑focused certifications and accreditations.
Responsibilities
Build and manage data pipelines using Azure Synapse, Data Factory, Databricks or Microsoft Fabric
Design, implement and maintain data lakes, data warehouses and ETL/ELT processes
Develop scalable data models for reporting in Power BI
Work closely with stakeholders to understand business needs and advise on solutions that best fit each business
Skills and Experience
Hands‑on experience with Azure services such as Synapse, Data Factory or Databricks
Strong SQL skills
Proficiency in Python and/or PySpark
Experience with Power BI and data modelling
What is on offer
Salary up to £65,000
Fully remote working from anywhere in the UK
Performance‑related bonus scheme
Pension scheme and private healthcare options
This is just a brief overview of the role. For the full details, simply apply with your CV and we’ll be in touch to discuss it further.
Tenth Revolution Group are the go‑to recruiter for Data & AI roles in the UK, offering more opportunities nationwide than any other recruitment agency. We are proud sponsors of SQLBits, the Power Platform World Tour and the London Fabric User Group.
Data Engineer
DTG Global
Enfield
Remote or hybrid
Junior - Mid
£50,000 - £60,000
RECENTLY POSTED
fabric
python
sql
Data Engineer needed for a UK organisation investing in its data platform to support the development of reliable, well-structured datasets for reporting and analytics. This role sits within a central data team and focuses on building, maintaining and improving data pipelines and data models that support consistent access to data across the business.
The role
Build and maintain data pipelines to ingest, transform and deliver data for analytical use
Ensure data is processed efficiently and securely, in line with agreed standards
Implement monitoring, logging and error handling across data workflows
Create and maintain tests to validate data accuracy and pipeline behaviour
Monitor performance and apply optimisation techniques where required
Design and maintain relational and dimensional data models
Write and optimise SQL and transformation logic
Produce clear technical documentation covering data structures and processes
Work with colleagues across data, analytics and governance to support data quality and compliance
What we’re looking for
Experience in data engineering or data platform development
Strong SQL skills with experience working on production data pipelines
Understanding of data modelling concepts for analytics and reporting
Familiarity with layered or staged data architectures
Knowledge of data quality, validation and governance practices
Strong analytical and problem-solving skills
Desirable experience
Cloud data platforms such as Microsoft Fabric or Azure data services
Python or Spark for data transformation and automation
Experience writing tests for data pipelines
Ability to communicate technical concepts clearly to non-technical stakeholders
Experience working in regulated or data-sensitive environments
If you’re a Data Engineer looking for a role focused on building robust data foundations rather than ad-hoc reporting, we’d be happy to discuss this further in confidence.
Oracle Analytics & AI Developer - SC Clearance Required
DWI Consulting Ltd
London
Remote or hybrid
Senior
Private salary
RECENTLY POSTED
processing-js
jira
Job Description:
We are seeking an experienced SC Cleared Oracle Analytics & AI Developer to play a key role on a confidential national programme. You will be responsible for designing, developing, and demonstrating innovative proof-of-concepts involving data reporting, artificial intelligence, and machine learning within the client’s environment. The ideal candidate will be a senior, hands-on technologist with a strong foundation in Oracle’s cloud ecosystem and a particular emphasis on analytics and AI solutions.
Key Responsibilities:
Serve as a lead developer, focusing on creating AI, ML, and reporting proof-of-concepts on enterprise systems and data.
Design and implement solutions using advanced analytics platforms, cloud data warehouses, and AI services.
Develop and optimize back-end systems with a focus on scalable AI frameworks, including Retrieval-Augmented Generation (RAG) architectures.
Apply expertise in AI model tuning, optimization, and the fine-tuning of large language models.
Work within structured delivery frameworks (Agile/Waterfall), contributing to the full software development life cycle (SDLC).
Produce comprehensive technical documentation and articulate complex concepts to both technical and non-technical stakeholders.
Collaborate effectively with geographically dispersed teams and external partners.
Ensure all solutions adhere to stringent security best practices for cloud environments.
Required Skills & Experience:
5+ years of hands-on back-end development experience, delivering complex, large-scale systems.
Proven expertise with the Oracle technology stack, including cloud data platforms, advanced analytics tools, and AI/ML functionalities.
Demonstrable project experience with AI technologies such as Generative AI, Natural Language Processing (NLP), AI agents, and machine learning.
Strong understanding of the SDLC and experience with methodologies like Agile and tools such as JIRA.
Excellent communication, interpersonal, and stakeholder management skills.
Proven ability to work proactively in a fast-paced, dynamic programme environment.
Experience collaborating with geographically distributed teams.
Knowledge of security best practices in cloud-based architectures.
Eligibility for UK Security Clearance (a mandatory requirement).
Other Information:
Experience with Oracle Cloud Infrastructure (OCI), application development tools, or integration platforms is highly beneficial.
Knowledge of data migration strategies for large-scale enterprise systems is advantageous.
Occasional travel to client sites in Southern England may be required.

Frequently asked questions

What types of Data Engineer jobs are available in London on Haystack?
Haystack features a wide range of Data Engineer roles in London, including positions in startups, established tech companies, financial institutions, and more, covering junior to senior levels.
Do I need London-based work authorization to apply for these jobs?
Most London-based Data Engineer jobs require valid UK work authorization, though some employers offer visa sponsorship. Job listings typically specify these requirements so you can apply accordingly.
Can I find remote or hybrid Data Engineer jobs based in London?
Yes, many employers offer remote or hybrid working options for Data Engineers in London. You can filter job listings based on work location preferences on Haystack.
What skills are most in demand for Data Engineer roles in London?
Key skills include proficiency in SQL, Python, ETL tools, cloud platforms like AWS or Azure, and experience with big data technologies such as Hadoop or Spark.
How can I apply for Data Engineer jobs on Haystack?
You can browse listings, create a profile, upload your CV, and apply directly through the platform. Many listings also provide contact details for recruiters if you prefer direct communication.