
Data Engineer Jobs

Overview

Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Business Systems & Reporting Specialist
Rise Technical Recruitment
Poole
In office
Junior - Mid
£45,000 - £50,000
RECENTLY POSTED
fabric
sql
qlikview
snowflake
Poole, Dorset | £45,000 - £50,000 + Bonus + Extensive Benefits Package

This is an excellent opportunity for a Business Systems Specialist to join a business in the process of a major digital evolution project. If you are a technical generalist with strong SQL and reporting skills who wants to specialise in advanced analytics and data architecture, this role offers a structured career path from systems support into a dedicated Data, Reporting & BI specialist position.

This company is a leading provider of essential products for businesses across various sectors. They specialise in delivering a comprehensive range of high-quality janitorial, catering, and packaging supplies, helping organisations maintain efficiency and hygiene in their operations.

In this varied role, you will initially manage the day-to-day health of existing business systems (ERP, BI, PIM, and integrations), ensuring operational reports and data integrity remain seamless. Post-ERP go-live, your focus will shift toward a dedicated Data and BI strategy, including building data pipelines, managing data warehouses, and designing Qlik dashboards to support decision-making.

The ideal candidate will be a proactive problem-solver with a strong foundation in SQL and business systems support. You will possess a solid understanding of data flows, API integrations, and automation tools, with a natural ability to adapt to new technologies quickly. While not mandatory, experience with modern data platforms such as Snowflake, Azure Data Services, or Microsoft Fabric, paired with a talent for dashboard design and a firm grasp of data governance and security, would be beneficial.

This is a fantastic opportunity to secure a role that guarantees professional development. You will have the unique chance to learn the business's operational foundations from the ground up before becoming the go-to expert for advanced analytics, data governance, and BI strategy in a modern, cloud-first environment.

The Role:
Maintain ERP, BI, and PIM systems during transition.
Create operational reports and ensure data integrity.
Build data pipelines via Snowflake or Microsoft Fabric.
Design high-quality Qlik dashboards for business insights.
The Person:
Proficient in SQL and business systems support.
Experienced in Qlik (or similar) and dashboard design.
Knowledge of ETL, warehousing, and Azure/Snowflake.
Able to engage both technical and non-technical stakeholders.
Reference Number: BBBH(phone number removed)

Rise Technical Recruitment Ltd acts as an employment agency for permanent roles and an employment business for temporary roles. The salary advertised is the bracket available for this position. The actual salary paid will be dependent on your level of experience, qualifications and skill set, and will be decided by our client, the employer. Rise are not responsible or liable for any hiring decisions made by the end client. We are an equal opportunities company and welcome applications from all suitable candidates.
Senior BI Analyst (Tableau and SQL)
Akkodis
Manchester
Hybrid
Senior
£45,000 - £60,000
RECENTLY POSTED
sql
tableau
hubspot
£45,000 - £60,000 + strong benefits
Full Time / Permanent
Manchester / Hybrid (2-3 days a week in the office)

The Company
My client is a well-established and innovative digital agency delivering strategic consultancy, web development, and digital marketing to an impressive portfolio of high-profile clients.

The Role
This is a growth-related opportunity for an experienced BI Analyst looking for a role where they can genuinely make a big impact in a short space of time. The BI Analyst will take ownership of a cloud-based data infrastructure and reporting ecosystem, playing a critical role in transforming fragmented data into trusted, structured systems that power decision-making across the business. This is a hybrid role working from my client's Manchester head office 2-3 days a week.

Skills and Experience required
Must be a proven BI Analyst with strong technical expertise and a natural curiosity for data.
Must be a self-starter who loves getting stuck in and has a real passion for finding solutions.
It is absolutely essential that you have proven commercial experience using Tableau to build dashboards and reports.
Must also have strong SQL skills and be comfortable writing and editing SQL queries to manipulate and extract data.
Any experience working with cloud-based tools like Google Sheets or HubSpot would be great, but it is not essential and can be learned.
Please apply via the advert or contact (url removed) for more information.

Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
Senior Data Engineer - Contract - Dublin
Adecco
Not Specified
In office
Senior
£517/day - £604/day
RECENTLY POSTED
processing-js
python
csharp
xml
Senior Data Engineer | Contract | Dublin | 6-24 Months

My client, a leading name in their respected industry, is in urgent need of a talented and experienced Senior Data Engineer to join them on a contract basis.

You will design and implement scalable and efficient data pipelines, ETL processes, and data integration solutions to collect, process, and store large volumes of data. You will collaborate with others in the development of data models, schema designs, and data architecture frameworks to support diverse analytical and reporting needs. You will build and optimise data processing workflows using distributed computing frameworks available on Azure, our preferred cloud provider, and integrate data from various internal and external sources, including databases, APIs, and streaming platforms, into centralised data repositories and data warehouses.

Successful candidates will have 8 years of commercial experience working as an analyst developer or data engineer in a data-centric environment. You will have proven experience in designing and implementing end-to-end data solutions, from ingestion to consumption, and strong experience with Azure data PaaS services and data pipeline delivery on the Azure platform with Databricks. You will have experience delivering data platforms with C#, Python, JSON, XML, APIs, and message bus technology, or similar technologies, plus strong knowledge of database systems, data modelling, and data integration technologies.

This is a unique opportunity to work with a team that is at the beginning phase of the project. If this sounds of interest, drop me a CV so we can speak in more detail.
Business Information Analyst
London and Quadrant Housing Trust
London
Hybrid
Mid - Senior
£41,168 - £53,000
RECENTLY POSTED
mysql
dax
Title: Business Information Analyst
Contract Type: Permanent, Full time, 35 hours per week
Salary: £47,135 to £53,000 per annum London based, or £41,168 to £47,033 per annum Regional, dependent on experience.
Grade: 9
Reporting Office: London, Stratford or Manchester, Trafford
Persona: Agile Worker: 20% - 40% of contractual hours to be worked from reporting office (hybrid working)
Closing Date: 2nd February at 5pm
Interview Dates: 10th February 2026 at our office in Stratford

Early applications are encouraged as we reserve the right to close the advertisement and interview earlier than stated. Please click here for the role profile.

Benefits include: Excellent pension plan (up to 6% double contribution), 28 days Annual Leave rising to 31 days with length of service + Bank Holidays, Westfield Health Cash Plan, non-contributory life assurance, up to 21 hours paid volunteering days, lifestyle benefits, Employee Assistance Programme and many more.

Join our Service Insights and Improvement Team at L&Q: Our department has expanded, and we are increasing and improving our reporting functionality as we incorporate new services, systems and data sources, moving to one version of the truth. As a key analyst of Maintenance Services' data analytics, this role ensures that KPI reporting is developed and refined.
You will also provide operational reporting which constantly reflects and anticipates business needs. The successful candidate must have experience designing and delivering end-to-end BI solutions using Microsoft Power BI, including data modelling, DAX and Power Query. You can read and write T-SQL and MySQL syntax at an advanced level and perform select and join queries to return data, with the ability to optimise performance for large datasets. Experience with configuring and supporting reporting development using visualisation tools, including Power BI Paginated reports, is a must. We are looking for someone who can work well under pressure, who can fulfil ad-hoc data queries and data requests from around the business, as well as support our drive to enable transparency of performance and support new initiatives with data.

The role reports to the Service Insights and Improvement Manager in Maintenance Services. Our core function is to deliver over 220,000 repairs a year to our residents using a mix of in-house teams and contractors, striving to improve first-time fix and wait times.

Your impact in the role: The Business Information Analyst ensures that Maintenance Services' operational reporting supports data-driven decision making through the creation, development and provision of performance reports, dashboards and insights.
• Developing robust dashboards, scorecards and reports.
• Driving the comprehensive delivery of reporting requests and business problems, from initial conception to the deployment of new dashboards/reports, utilising project management methodologies to meticulously track and report progress.
• Maintaining a forward-looking approach to data analysis, identifying areas for investigation and improvement before they become critical.

The Business Information Analyst will work with colleagues in the Maintenance Services Service Improvement Team and other teams across the department to provide information to make the case for business change and to monitor the embedding of changes made through the delivery of improvement projects. You will help the department achieve its goals by providing strategic data support, anticipating future challenges, developing advanced reporting and analytics for the Heads of Service and Directors, and ensuring data accuracy and improvement.

What you'll bring: To be shortlisted, candidates will need to be able to demonstrate the following:
• Read and write T-SQL syntax at an advanced level (essential)
• Experience with Power BI, including the writing of queries and measures in 'M' and 'DAX' (essential)
• Experience with configuring and supporting reporting using visualisation tools: Power BI/Power Paginated Reports or similar (essential)
• Strong IT skills and advanced knowledge of MS Office 365 and Microsoft Excel (essential)
• Desirables are Automation, PowerApps, and SharePoint knowledge, and wide experience of sourcing data into BI (SharePoint, Exchange, API etc.)
• Commitment to providing high levels of customer service
• Experience of producing regular KPI reports
• Demonstrates an understanding of business improvement methods
• First-class communication skills, with an ability to present data and analysis to stakeholders
• Capable of embracing constant change

Agile working can be based either from our office in London or from Manchester.
However, candidates based in the North West will be required to attend the London office for monthly meetings in person. If you require any reasonable adjustments at any stage during this process, including application stage, please email.

About L&Q: We're one of the UK's leading housing associations and developers. We were founded on a simple belief: high-quality housing is vital for people's health, happiness and security. Everyone deserves a quality home that gives them the chance to live a better life. 250,000 people call our properties 'home', and we're proud to serve diverse communities across London, the South East and North West of England.

People are at the heart of our business, and our success depends on employing the best people and getting the best from them. The foundation of everything that we are is built on our corporate values and behavioural framework, which outlines our core expectations and should be demonstrated at all times, and at all levels, when representing L&Q.

At L&Q, we know that diversity and inclusion make us stronger, and they're at the heart of everything we do. When we recruit, we look at what really matters: your skills, experience, and potential. We're proud to be recognised for creating an inclusive workplace. We're a Disability Confident Leader (Level 3) and we've introduced our own Recruitment Advocate scheme to make sure every step of our hiring process is fair, transparent, and consistent. It's all part of our commitment to ending discrimination and making L&Q a place where everyone feels welcome. Find out more here.

Sustainability is also at the heart of what we do. We recognise the responsibility we hold as one of the UK's largest housing associations. Click here to find out more about L&Q and why you should join us!
Postgres Data Architect
Stackstudio Digital Ltd.
London
Hybrid
Mid - Senior
£500/day - £550/day
RECENTLY POSTED
aws
terraform
kafka
postgresql
gitlab
db2
Job Details
Role / Job Title: Postgres Data Architect with CDC Skills
Work Location: 250 Bishopsgate, London, UK
Office Presence (Hybrid): 2 days per week

The Role
We are looking for a PostgreSQL Data Architect with strong hands-on experience in Change Data Capture (CDC). The candidate will design and implement robust data migration strategies, ensuring seamless integration between legacy systems and modern cloud-based architectures.

Responsibilities
Architect CDC Pipelines: Design and optimize Change Data Capture workflows (IBM CDC or equivalent), including subscription design, bookmarks, resync, and replay strategies
Cloud Migration & Hosting: Lead PostgreSQL migration from on-premises/mainframe to cloud platforms (AWS Aurora preferred), ensuring performance, security, and scalability
Integration & ETL Pipelines: Build robust pipelines from CDC through Kafka/S3 into Aurora with UPSERT/MERGE patterns; guarantee idempotency, ordering, and reliable delivery
Data Encoding & Validation: Manage EBCDIC-to-UTF-8 conversions and packed decimal/binary numerics, and validate transformations with automated test suites
Cutover & Governance: Execute dual-run validations, reconciliation (counts, checksums), rollback strategies, and ensure compliance with masking, encryption, and IAM policies
Performance & Observability: Monitor lag, throughput, and error rates; develop dashboards (CloudWatch/Grafana) and operational runbooks for proactive alerting
Automation & Tooling: Utilize schema conversion tools, IaC (Terraform), CI/CD pipelines (GitLab), and AWS services (Glue, Athena, Redshift) for downstream analytics
Data Modelling & Conversion (Good to have): Transform Db2 schemas to Aurora PostgreSQL; design logical/physical models, enforce referential integrity, and apply best practices for normalization/denormalization
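The encoding work in the responsibilities above is concrete enough to sketch. Below is a minimal, illustrative Python example (not taken from the advert; the field values and scale are invented) of the two mainframe conversions mentioned: decoding EBCDIC text with the standard-library cp037 codec, and unpacking an IBM packed-decimal (COMP-3) field by hand.

```python
def unpack_comp3(data: bytes, scale: int = 0):
    """Decode an IBM packed-decimal (COMP-3) field: two digits per byte,
    with the final nibble holding the sign (0xD = negative)."""
    nibbles = []
    for b in data:
        nibbles.append(b >> 4)
        nibbles.append(b & 0x0F)
    sign = nibbles.pop()          # last nibble is the sign, not a digit
    value = 0
    for d in nibbles:
        value = value * 10 + d
    if sign == 0x0D:
        value = -value
    return value / 10 ** scale if scale else value

# EBCDIC text fields decode directly with Python's built-in cp037 codec.
record_name = b"\xc8\xc5\xd3\xd3\xd6".decode("cp037")  # EBCDIC "HELLO"
amount = unpack_comp3(b"\x12\x34\x5c", scale=2)        # 123.45
```

In a real migration these conversions would sit inside the automated test suites the advert mentions, reconciling decoded values against source-system counts and checksums.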
Your Profile

Essential Skills / Knowledge / Experience
Strong hands-on experience with PostgreSQL (Aurora preferred) and advanced data modelling
Expertise in CDC tools (IBM CDC or similar) and data migration strategies
Proven experience in PostgreSQL cloud hosting and migration
Proficiency in ETL/ELT pipelines, Kafka, and AWS ecosystem
Solid understanding of data encoding, transformation, and validation techniques
Familiarity with IaC, CI/CD, and observability frameworks
Excellent problem-solving and communication skills
Desirable Skills / Knowledge / Experience
Mainframe Modernization
Data Architect Senior
Stackstudio Digital Ltd.
Glasgow
In office
Senior
£550/day - £600/day
RECENTLY POSTED
react
aws
javascript
python
elasticsearch
pandas
Role Details
Role / Job Title: Data Architect Senior
Job Type: Contracting
Location: Glasgow (5 days onsite)

Role Overview
We are seeking an experienced Senior Data Architect to join our Market Data Services team, focusing on our high-impact 3rd Party Data Project. This is a contract role for an initial 6-month period, based in our Glasgow office, with full-time onsite presence required.

As a key member of the team, you will play a pivotal role in modelling currently available data, defining and structuring new data catalogues of marketplace data products, and validating existing contracts and data usage rights. Your expertise in RDF data and data architecture will be essential as you review and enhance our data management practices, develop new data architecture policies, and strategically structure data flows for the project.

You will be working within an Agile project environment, participating in two-week sprint cycles and collaborating closely with cross-functional teams to deliver high-quality solutions. The engineering team develops predominantly in Python, with Pandas and SPARQL dataframes for backend APIs, and in JavaScript with React for the front end; an understanding of these technologies is beneficial.

Key Responsibilities
Model and structure currently available marketplace data and define new data catalogs.
Validate existing data contracts and usage rights, ensuring compliance and optimal utilization.
Review current data management practices and develop robust, future-proof data architecture policies.
Design and implement strategic data flows for the 3rd Party Data Project.
Work extensively with RDF data, leveraging SPARQL for graph queries and data models.
Utilize SQL for Oracle databases and No-SQL DSL for ElasticSearch to manage and query data.
Collaborate with stakeholders to ensure data solutions align with business and regulatory requirements.
Actively participate in Agile ceremonies and two-week sprint cycles.
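As a rough illustration of the RDF/SPARQL work in the responsibilities above (the data and names below are invented, not from the advert): RDF represents everything as subject-predicate-object triples, and a SPARQL basic graph pattern is essentially triple matching with variables. A minimal pure-Python sketch:

```python
# Invented example data: marketplace data products as RDF-style triples.
triples = {
    ("feed:lse",  "rdf:type",      "md:MarketDataProduct"),
    ("feed:lse",  "md:licensedTo", "team:risk"),
    ("feed:nyse", "rdf:type",      "md:MarketDataProduct"),
}

def match(s=None, p=None, o=None):
    """Match a triple pattern; None plays the role of a SPARQL variable."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

products = match(p="rdf:type", o="md:MarketDataProduct")  # two matching feeds
```

In practice a SPARQL engine (for example over AWS Neptune, as listed below) does this at scale, adding joins across multiple patterns, named graphs, and inference.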
Required Skills & Experience
Proven experience in data architecture and data modeling within professional services or contract environments.
Strong hands-on expertise with RDF data and SPARQL graph query languages.
Deep familiarity with W3C standards for data modeling and interoperability.
Experience with AWS Neptune, AWS Glue, ElasticSearch, and Oracle database products.
Proficient in SQL for Oracle and No-SQL DSL for ElasticSearch.
Demonstrated ability to validate data contracts and usage rights.
Experience working in Agile teams.
Track record of developing and implementing data management policies and best practices.
Excellent communication and stakeholder management skills.
Managing expectations of non-technical stakeholders.
Good to Have (As Applicable)
Familiarity with AWS via hands-on experience or certification
Familiarity with orchestration tools like Airflow
Familiarity with the BASEL regulatory reporting framework
Familiarity with engineering in a regulatory-controlled environment
Machine Learning Engineer
Anson McCade
London
Hybrid
Mid - Senior
£65,000
RECENTLY POSTED
aws
tensorflow
python
docker
pytorch
£45,000 - £65,000 + £7,000 DV Bonus | Hybrid Working | Permanent
Title: Machine Learning Engineer
Area: National Security Projects
Location: Central London (Hybrid), 3 days per week
Security: Eligibility for Developed Vetting clearance with the UK Government
Salary: Up to £65k + £7k annual DV bonus (once obtained)

About the Role: We're seeking Machine Learning Engineers to design and deploy ML models for national security applications. You'll work on GenAI, LLMOps, and traditional ML solutions using AWS infrastructure and unique datasets, collaborating with data scientists and software engineers to build greenfield solutions.

What You'll Do:
Build and optimise Machine Learning pipelines
Develop LLM-powered solutions and apply responsible AI practices
Transition experiments into production-ready solutions
Implement experiment tracking and monitoring
Collaborate with data scientists and engineers
What We’re Looking For:
Proficiency in Python and ML frameworks (scikit-learn, PyTorch, TensorFlow)
Hands-on experience with AWS ML services (SageMaker, Lambda)
Familiarity with containerisation (Docker) and orchestration (Kubernetes/ECS)
Knowledge of LLMOps and GenAI tools (LangChain, LangSmith)
Understanding of feature engineering and vector databases
Strong grasp of CI/CD practices for ML deployment
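To make the experiment-tracking requirement above concrete, here is a deliberately minimal, invented sketch of the core idea behind tools such as MLflow or LangSmith: record each run's parameters and metrics so the best configuration can be retrieved later. All names and numbers are illustrative, not from the advert.

```python
import time

class RunTracker:
    """Toy experiment tracker: records params/metrics per training run."""
    def __init__(self):
        self.runs = []

    def log_run(self, params: dict, metrics: dict) -> None:
        self.runs.append({"ts": time.time(), "params": params, "metrics": metrics})

    def best(self, metric: str) -> dict:
        # Highest value wins; a real tracker also supports minimised metrics.
        return max(self.runs, key=lambda run: run["metrics"][metric])

tracker = RunTracker()
tracker.log_run({"lr": 0.1},  {"f1": 0.71})
tracker.log_run({"lr": 0.01}, {"f1": 0.78})
best = tracker.best("f1")  # the lr=0.01 run
```

Production trackers add persistence, artifact storage, and UI on top, but the log-then-compare loop is the same.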
Why Join Us:
Work with niche datasets and cutting-edge tech
Hybrid working and flexible benefits
£7k tax-free DV bonus once clearance completes
Apply today and make a real-world impact.

Reference: AMC/JWH/MLEL1
Junior Data Engineer
Pontoon
Bristol
Remote or hybrid
Junior
Private salary
RECENTLY POSTED
airflow
sql
Job Title: Junior Data Engineer

Are you ready to kickstart your career in the exciting world of data engineering? Our client, a leading organization in the IT & Digital industry, is on the hunt for a passionate Junior Data Engineer to join their dynamic team! If you have a knack for problem-solving and are eager to learn, this is the perfect opportunity for you!

Pay Rate: Competitive (Umbrella)
Duration: 6 months with a view to extend to 9 months
Working Pattern: Remote (occasional travel to Bristol for training)
Start Date: ASAP

What You Will Do: As a Junior Data Engineer, you will work closely with a Senior Data Engineer to support a critical application launch. This hands-on role focuses on maintaining data integrity and implementing SQL-based fixes and patches in a fast-paced production environment. Your contributions will be vital in ensuring smooth operations through:
* Writing and executing queries for code fixes, patches, and data corrections
* Supporting the Senior Data Engineer in maintaining database stability and performance as the application goes live
* Troubleshooting and resolving data-related issues promptly
* Implementing database patches and updates
* Performing data validation and quality checks to ensure accuracy
* Documenting technical processes and maintaining clear records of changes made

What We Are Looking For: To thrive in this role, you should possess a solid foundation in data engineering concepts and technologies.
Here is what you need:

Essential Skills:
* Proficiency in Microsoft SQL Server (mandatory)
* Solid understanding of query writing, optimization, and debugging
* Experience with database maintenance, data fixes, and patch implementation
* Ability to work under pressure in a production environment
* Good problem-solving skills and attention to detail
* Excellent communication skills to work effectively with senior engineers and stakeholders

Desirable Skills:
* Experience with Apache Airflow for workflow orchestration
* Familiarity with Alteryx for data preparation
* Knowledge of REST APIs and integration patterns
* Awareness of Agile methodologies

Qualifications:
* Bachelor's degree in Computer Science, Engineering, or a related technical field
* Technology certifications in database administration or data engineering are a plus

Why Join Us? Our client values creativity, teamwork, and continuous learning. By joining their talented team, you will have the opportunity to:
* Work on impactful projects that drive innovation
* Collaborate with experienced professionals who are eager to mentor and support you
* Enjoy a flexible work environment that promotes work-life balance
* Develop your skills in a thriving industry

Ready to Apply? If you are excited about this opportunity and meet the qualifications, we would love to hear from you! Join our client in shaping the future of data engineering. Apply now!

Please note: if you do not hear back regarding your application within 5 working days, you have unfortunately been unsuccessful on this occasion, but we thank you for your interest. Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
Azure Data Engineer
Opus Recruitment Solutions
Bristol
Hybrid
Mid - Senior
£350/day - £450/day
RECENTLY POSTED
azure-databricks
delta-lake
sql
Azure Data Engineer | £(Apply online only) per day | Bristol | Hybrid | 6-Month Initial Term

A large-scale data transformation programme is underway, and our client is looking for an experienced Azure Data Engineer to support the rebuild of their cloud data platform. This role is hands-on and delivery-focused: you'll be designing and developing Azure-native data pipelines, working extensively with Databricks, and shaping scalable data models across the Microsoft ecosystem. The role requires you to be on site in Bristol 3 days per week.

What you'll be doing
Build, enhance and maintain data pipelines using Azure Databricks, Data Factory, and Delta Lake
Develop and optimise Lakehouse components and cloud-based data flows
Create robust data models to support analytics, MI and downstream reporting
Assist in migrating legacy warehouse assets into a modern Azure environment
Contribute to cloud architecture decisions, data standards and best-practice engineering patterns

What you'll bring
Strong hands-on experience across Azure Data Services (ADF, ADLS, Synapse, Databricks)
Excellent SQL skills, with experience in performance tuning and optimisation
Solid understanding of data modelling (star schema, medallion, ETL frameworks)
Ability to work with complex, inconsistent or legacy data sources
Experience building scalable, production-ready pipelines in a cloud environment
Data Engineer
Hays Technology
York
Hybrid
Mid - Senior
£500/day - £600/day
RECENTLY POSTED
sql
snowflake
Our client, a leading organisation within the public sector, is seeking an experienced Data Engineer to join their Digital & Data function. The role is focused on transforming and integrating data across the institution, supporting strategic reporting, analytics, and operational needs. With major ongoing development of their Snowflake-based data platform, this is an excellent opportunity to take ownership of high-impact data engineering work in a complex, data-rich environment.

What You Will Be Doing
Designing, developing, and maintaining data pipelines to ingest, clean, transform, and deliver high-quality datasets into Snowflake.
Applying strong SQL skills to build transformation logic, optimise performance, and ensure efficient query execution.
Supporting data integration across core institutional systems, including potential exposure to Workday data imports.
Collaborating with analysts, stakeholders, and cross-functional teams to ensure data is accurate, timely, and fit for purpose.
Participating in troubleshooting, performance tuning, monitoring, and continuous improvement of data workflows.
Ensuring best practices in data modelling, documentation, governance, and platform optimisation.

What You Will Need
Strong, demonstrable expertise in SQL and hands-on experience transforming data.
Experience delivering data pipelines and ETL/ELT processes into Snowflake (or similar cloud data warehousing platforms).
Understanding of modern data engineering concepts: modelling, integration, quality, monitoring, optimisation.
Experience working with large datasets and multiple data sources.
Ability to work collaboratively, communicate effectively, and engage confidently with stakeholders.
Workday inbound data experience is beneficial but not essential.

What You Will Get in Return
This is a great opportunity to work for a highly esteemed organisation on a contract basis, paying around £550.00 per day Inside IR35. Some travel to site in York will be required, with flexibility.

What You Need to Do Now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed)
Senior Data Scientist
Datatech
London
Hybrid
Senior
£70,000 - £75,000
RECENTLY POSTED
aws
tensorflow
python
pytorch
pandas
scipy
Senior Data Scientist - Customer Data
Salary: £70,000 - £85,000 (DoE)
Location: Hybrid, 2/3 days per week in a Central London office
Job Reference: J13015
Full UK working rights required - no sponsorship available
Immediate requirement - strong leadership and senior stakeholder skills

We are seeking an experienced, passionate, and highly motivated Senior Data Scientist to play a pivotal role in unlocking the value of customer data and shaping how it is used across the business. This is a senior, highly autonomous position, acting as the number two to the Director of Customer Data, where you will operate at a strategic level while remaining hands-on. This is an excellent opportunity for a senior-level data scientist who wants real ownership, influence, and visibility, and to be part of a business at a transformative point in its data maturity.

The company has recently implemented a new Customer Data Platform (CDP) and is at a genuinely exciting stage of its data journey. You will be instrumental in helping define best practice, drive advanced analytics use cases, and influence how customer data is activated across products, CRM, and marketing. While experience with personalisation and recommender systems would be highly desirable, it is not essential. The role is broader in scope and suited to someone who enjoys owning complex customer data problems end-to-end and shaping the direction of advanced data science initiatives.

The Role
• Act as a senior technical and strategic lead within the Customer Data team, working closely with (and deputising for) the Director.
• Take full ownership of your role, with the autonomy to shape priorities, define approaches, and mould the position to maximise impact.
• Lead the development of advanced machine learning solutions across customer data use cases, including (but not limited to) personalisation, segmentation, propensity modelling, and customer insight.
• Contribute to the evolution and activation of the newly implemented CDP, helping the organisation realise its full value.
• Own the full machine learning lifecycle, from problem definition and model design through to deployment, monitoring, and optimisation.
• Collaborate closely with CRM, marketing, product, engineering, and regional teams to ensure solutions are aligned to business goals.
• Partner with data engineering and platform teams to ensure scalable, robust, and production-ready solutions.
• Act as a senior stakeholder, able to clearly communicate complex concepts and influence decision-making at all levels.

Skills & Experience
• Strong, hands-on experience in machine learning and applied data science within customer or commercial domains.
• Experience with recommender systems, personalisation, or deep learning is desirable but not essential.
• Solid Python skills and experience with ML libraries such as pandas, numpy, scipy, scikit-learn, TensorFlow or PyTorch.
• Experience working across cloud environments (GCP, AWS, or Azure) and analytics platforms such as Dataiku.
• Good understanding of MLOps practices, including deployment, monitoring, and retraining pipelines.
• Proven ability to work cross-functionally with marketing, CRM, product, and engineering teams.
• Excellent communication, leadership, and stakeholder management skills.
• Experience operating in a global or multi-regional environment is a plus.

If you would like to hear more, please do get in touch. Alternatively, you can refer a friend or colleague by taking part in our fantastic referral schemes. For each relevant candidate you introduce to us (there is no limit) and we place, you will be entitled to our general gift/voucher scheme. Datatech is one of the UK's leading recruitment agencies in analytics and host of the critically acclaimed Women in Data event. For more information, visit (url removed)
Data Engineer
Datatech
Mansfield
Fully remote
Junior - Mid
£45,000 - £45,000
RECENTLY POSTED
aws
git
python
airflow
sql
snowflake
+1
Data Engineer, Remote Modern Cloud Data Stack
£45,000 DOE
No sponsorship; post-grad visa not available
A high-visibility opportunity with a values-led organisation modernising its data platform and refreshing its data strategy. You’ll be trusted early, work directly with stakeholders across the business, and build the foundations that power better insight, smarter decisions, and real-world impact. This suits someone with 2+ years’ experience who wants to step up, take ownership, and grow quickly in a supportive environment. Communication is central here: you’ll succeed by translating business questions into robust, trusted data assets, and by bringing people with you on the journey.
What you’ll do
• Help shape and deliver a refreshed data strategy and modern intelligence platform
• Build reliable, scalable ELT/ETL pipelines into a cloud data warehouse (Snowflake, Databricks, or similar)
• Develop and optimise core data models and transformations (dimensional, analytics-ready, built to last)
• Create trusted data products that enable self-service analytics across the organisation
• Improve data quality, monitoring, performance, and cost efficiency
• Partner with analysts, BI, and non-technical stakeholders to turn questions into production-grade data assets
• Contribute to standards, best practice, and reusable engineering frameworks
• Support responsible AI tooling, including programmatic LLM workflows where relevant
What you’ll bring
• 2+ years’ experience in data engineering within a modern cloud stack
• Strong SQL, plus a solid data modelling foundation
• Python preferred (or similar) for pipeline development and automation
• Cloud exposure (AWS, Azure, or GCP)
• Familiarity with orchestration and analytics engineering tools (dbt, Airflow, or similar)
• Strong habits around governance, security, documentation, Git, and CI/CD
What will make you stand out in this business
• Clear, confident communication: you can explain technical choices in plain English
• Strong stakeholder mindset: you ask the right questions and align on outcomes early
• Ownership, curiosity, and a bias for building things properly
Excited? Apply now
Power BI Specialist
Gleeson Recruitment Group
Birmingham
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED
fabric
Power BI Specialist - Contract OUTSIDE IR35
Remote role - Office based in Birmingham
We’re looking for a Power BI Specialist to turn sales data into real-time, high-impact dashboards that drive decision-making. You’ll build and maintain live Power BI dashboards, working from a newly implemented Microsoft Fabric data warehouse, with scope to shape and enhance the data model as it evolves. You’ll also liaise with third-party partners to ensure smooth integration and delivery.
What you’ll do:
Build real-time Power BI dashboards from sales data
Develop and enhance a new Fabric-based data warehouse
Work with stakeholders and third-party providers
Turn complex data into clear, actionable insights
What we’re looking for:
Strong Power BI experience
Solid understanding of data warehousing (Fabric experience a big plus)
Confident communicator who can work with internal teams and external partners
If you like fast-moving projects, clean data models, and dashboards that actually get used - this could be for you! Please apply asap if interested.
At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a Disability Confident employer. By applying, you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.
Senior Data Engineer - (ML and AI Platform)
Datatech
London
Hybrid
Senior
£65,000 - £80,000
RECENTLY POSTED
aws
python
sql
pyspark
snowflake
Senior Data Engineer (ML and AI Platform)
Location: London with hybrid working, Monday to Wednesday in the office
Salary: £65,000 to £80,000 depending on experience
Reference: J13026
We are partnering with an AI-first SaaS business that turns complex first-party data into trusted, decision-ready insight at scale. You will join a collaborative data and engineering team building a modern, cloud-agnostic data and AI platform. This role is well suited to an experienced data engineer who enjoys working thoughtfully with real-world data, contributing to reliable production systems, and writing clear, well-structured Python and SQL.
Why join:
• Supportive and inclusive culture where people are encouraged to contribute and be heard
• Clear progression with space to develop your skills at a sustainable pace
• An environment where collaboration, learning, and thoughtful engineering are genuinely valued
What you will be doing:
• Contributing to the design and delivery of cloud-based data and machine learning pipelines
• Working with Python, PySpark and SQL to build clear and maintainable data transformations
• Helping shape scalable data models that support analytics, machine learning, and product features
• Collaborating closely with Product, Engineering, and Data Science teams to deliver meaningful production outcomes
What we are looking for:
• Experience using Python for data transformation, ideally alongside PySpark
• Confidence working with SQL and production data models
• Experience working with at least one modern cloud data platform such as GCP, AWS, Azure, Snowflake, or Databricks
• Experience contributing to data pipelines that run reliably in production environments
• A collaborative mindset with clear and thoughtful communication
Right to work in the UK is required. Sponsorship is not available now or in the future. Apply to learn more and see if this could be the next step for you. If you have a friend or colleague who may be interested, referrals are welcome.
For each successful placement, you will be eligible for our general gift or voucher scheme. Datatech is one of the UK’s leading recruitment agencies specialising in analytics and is the host of the critically acclaimed Women in Data event. For more information, visit (url removed)
Data Engineer
B3Living
Hertford
Hybrid
Mid - Senior
£54,835 - £60,927
RECENTLY POSTED
fabric
python
sql
Based in Cheshunt, Hertfordshire
Permanent, full-time, 37 hours per week
Salary: £54,835 - £60,927 per annum
Reliable data is at the heart of good decision-making. We’re looking for an experienced Data Engineer to join our newly established data team and work with them to deliver reliable, high-quality data that supports informed decision-making and enables us to deliver better outcomes for our customers. In this role, you’ll design and maintain scalable data pipelines and robust data models, helping to ensure our data is accurate, accessible, and secure. You’ll also improve troubleshooting by introducing error handling and logging, and optimise efficiency by monitoring data performance and applying fine-tuning techniques. Writing complex queries (SQL, Python and Spark), documenting data structures and working with colleagues to respond to business needs while ensuring alignment with governance standards and GDPR are also key in this role. We’re looking for someone with…
Proven experience in data engineering or data platform development.
Experience with testing frameworks and writing test plans for data pipelines.
Strong analytical and problem-solving skills.
Strong SQL skills and experience with query optimisation.
Knowledge of Microsoft Fabric (Lakehouse, Dataflows, ADF).
An understanding of data modelling concepts (e.g. dimensional, star schema, denormalisation)
Knowledge of performance tuning techniques for ETL and SQL processes
Familiarity with data governance principles and GDPR.
Due to the type of data you’ll have access to in this role, you’ll be required to undertake a basic criminal record (DBS) check. We’re a social business, based in Cheshunt and across southeast Hertfordshire, helping local people by renting or selling affordable homes. We offer services designed to help our customers live comfortably in their homes, and we work to keep our buildings and estates maintained, offering support when money becomes an issue or when people get older. Our mission is to make a sustainable, positive change to the housing crisis for our customers and communities. We enjoy a benefits package that offers something for everyone, including…
27 days’ holiday plus bank holidays (pro rata for part-time colleagues).
Buy and sell holiday scheme.
Cross-organisational bonus scheme.
Up to 12% pension contribution.
Life assurance (three times salary).
Funded health cash plan or subsidised private medical insurance.
Range of special and family leave.
Car loans, cycle to work and electric car lease scheme.
Discount vouchers and more.
The closing date for this vacancy is 2nd February 2026. We are a Disability Confident employer, which means that we offer an interview to a fair and proportionate number of disabled applicants who meet the minimum selection criteria for the job. Other organisations may call this role ETL Engineer, Data Pipeline Engineer, Analytics Engineer, Data Platform Engineer, or BI Developer. We’re committed to building an inclusive workplace where equity, diversity and inclusion are part of our culture, as we recognise the benefits of a diverse workforce. Our 3-year EDI strategy outlines how we’ll achieve this. We strongly welcome applications from underrepresented groups and groups which are identified as a priority within our strategy, including LGBTQIA+, Black, Asian and Minority Ethnic communities, applicants with disabilities and people under 30. We understand that some candidates, particularly from certain groups, may hesitate to apply unless they meet every requirement. While we’re looking for people with the right skills and experience, we also value diverse backgrounds and transferable skills. If you meet most of the criteria and believe you’d thrive in the role, we encourage you to apply. All our vacancies are open to flexible working arrangements, something we are really proud of. The extent to which flexible working is possible will vary between jobs according to the needs of the business and our customers. So, if you’re ready to take your next step as a Data Engineer, please apply via the button shown. This vacancy is being advertised by Webrecruit. The services advertised by Webrecruit are those of an Employment Agency.
Lead Data Engineer
Cathcart Technology
Edinburgh
Fully remote
Senior
£100,000
RECENTLY POSTED
python
snowflake
I’m partnering with a fast-growing, highly respected analytics and consultancy organisation working at the forefront of the energy, chemicals and low-carbon sectors to hire a Lead Snowflake Data Engineer. Backed by major investors and trusted by global clients, it’s a great time to be joining their team (fully remote - MUST be UK based). This is a rare greenfield leadership opportunity where you’ll define the data strategy, architecture and tooling from day one, while building and mentoring a high-performing data engineering team. You’ll deliver scalable cloud-native pipelines on Snowflake, help unify data across multiple acquired businesses, and develop machine learning solutions that drive real commercial insight. Working closely with senior stakeholders, you’ll turn complex requirements into robust technical solutions and set standards for data quality and governance. With a modern Python-first stack, no legacy constraints and strong business backing, the role offers genuine technical ownership and strategic impact in a fully remote position. You’ll ideally have most of the following:
• Hands-on experience with Snowflake (essential)
• Proven experience leading data engineering projects end-to-end
• Strong Python background and modern data engineering practices
• Deep understanding of ETL/ELT, data modelling and transformations
• Practical machine learning experience (desirable)
• Strong communication with technical and non-technical teams
You’ll be joining at a pivotal time, with the chance to build a modern data function from the ground up and make a visible, long-term impact on the business. The role offers significant technical autonomy, real influence at senior stakeholder level, and the opportunity to work on meaningful problems connected to the global energy transition.
In return, they offer a very competitive salary and strong benefits package, flexible remote working across the UK with occasional travel, and a clear long-term progression path into senior data leadership. It’s genuinely a really exciting opportunity to combine hands-on engineering, strategic thinking and team leadership in an environment that actively supports innovation, learning and ambitious technical ownership. If you’re keen to learn more, please apply or drop Matthew MacAlpine at Cathcart Technology a message. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
Performance & Reporting Analyst
Reed Specialist Recruitment
Nottingham
In office
Junior - Mid
£30/hour
RECENTLY POSTED
sql
salesforce
Location: Beeston, Nottingham
Job Type: Temporary
Hourly Rate: £29.97
We are seeking a Performance & Reporting Analyst to join a leading housing provider. This role is critical in ensuring regulatory compliance in social housing, with a focus on data analysis and reporting to support our property services strategy. The ideal candidate will be proficient in SQL, Salesforce, and Power BI, and will play a key role in monitoring and reporting on SLAs/KPIs to ensure the safety and compliance of our housing services.
Day-to-day of the role:
Collate and analyse statistical data to produce weekly and monthly reports for the senior leadership team.
Support the creation and maintenance of a real-time KPI dashboard using Power BI and other visualisation tools to improve performance against key targets and provide assurance on statutory compliance obligations.
Perform ETL actions using SQL and other coding languages on asset and housing databases to produce insightful reports.
Use data asset systems to support forecasting and compliance regulation needs across various workstreams.
Produce ad hoc reports as requested by internal stakeholders, ensuring high standards of data integrity and quality.
Remain actively involved with all service leads to maintain high standards of safety and service for our residents.
Required Skills & Qualifications:
Excellent interpersonal skills with the ability to build and sustain positive working relationships.
Strong planning and organisational skills with meticulous attention to detail.
Educated to Degree level or equivalent.
Proficient in SQL, Salesforce, and Power BI.
Experience with large data sets and creating reporting dashboards.
Knowledge of Asset Management, Housing Management, or other CAFM systems. Experience with Northgate, True Compliance, or Riskbase is desirable.
To apply for this Performance & Reporting Analyst position, please submit your CV detailing your experience with SQL, Salesforce, and Power BI, and why you are suited for this role.
SQL Developer
ITSS Recruitment
Leeds
Hybrid
Mid
£40,000 - £55,000
RECENTLY POSTED
sql
fabric
python
ssis
SQL Developer - Hybrid (2 days in office) - Up to £55K + Bonus + 10% Pension + 26 days Holiday + Bupa Healthcare + Microsoft Certifications - Leeds
We are looking for a highly motivated and skilled SQL Developer to join an established data/BI team based in central Leeds. This exciting opportunity will suit a talented SQL Developer who is well versed in the Microsoft stack. You will be working in an established data team of 16, who contribute to the smooth running of a multi-million pound organisation with over 2,000 employees. You will be working in a team comprised of DBAs, BI and SQL developers working on a range of projects using the latest technologies. They are big believers in sharing thoughts and encouraging and supporting innovation and creativity. You will also be given the chance to be involved in all aspects of the project process, from conception through to completion and launch. The organisation is huge on aiding personal and professional development, including fully subsidised Microsoft certifications (DP-600/700 etc.).
SQL Developer key skills:
• SQL Server
• Cloud platforms, ideally Azure
• Data pipeline tools, SSIS / Spark
• Data warehousing and data modelling principles
• ETL
• MS Fabric
• Python
You will be a motivated SQL Developer with good communication skills and prior experience within a similar position. The successful SQL Developer should have strong problem-solving abilities, organisational skills and the ability to work as part of a team. We are interviewing currently, so apply now for immediate consideration or contact George Harvey at ITSS Recruitment for further information.
Integration Developer
Shaw Daniels Solutions Ltd
Not Specified
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED
fabric
javascript
dot-net
powershell
kanban
csharp
+7
Remote
As a seasoned and flexible Integrations Developer, you will play a key role as part of a skilled and multi-disciplinary development team delivering innovative solutions as part of the DX Programme. You will be responsible for designing, developing and implementing solutions using the MS Integration Services & MS Power Platform suite, including but not limited to Logic Apps, Service Bus, Event Grid, APIM, Functions & Function Apps, SQL Server, Azure SQL, Dataverse, D365 modules, 3rd-party APIs, Azure Synapse, Fabric and Key Vault. You will be helping design end-to-end solutions involving a variety of technologies, endpoints and integration patterns, along with utilising your problem-solving skills to understand client pain points and troubleshoot as challenges arise.
Overall Role Objectives
Design, develop, and implement successful custom integrations solutions using MS Integrations Services and Power Platform components to automate processes, streamline operations and extend functionality of core applications.
Build out an enterprise-class data platform interfacing with a multitude of differing endpoints, technologies and data structures.
Provide expert guidance on Azure integration component implementation, configuration and customisation to meet business requirements.
As part of Service Delivery, provide support to troubleshoot and resolve issues with data and application integrations and related environments.
Collaborate with stakeholders at all levels to gather requirements, analyse processes and recommend optimal solutions.
Stay up to date with the latest MS Azure, Power Platform and D365 features and best practices.
Fully participate in team planning and work with colleagues to continuously improve the team’s performance.
Tasks/Responsibilities
Technical Excellence
Extensive experience in the Microsoft Azure Integration Services technologies and components:
Developing robust, scalable, secure & efficient enterprise class integrations interfacing with a wide variety of in-house and 3rd-party applications and services.
Implement error logging, error trapping, and checkpoints.
Implement secrets and Key Vault stores for optimum security.
Deploy integrations via CI/CD pipelines in DevOps across multiple environments in a fully controlled and traceable process.
Maintain code and object repositories in DevOps.
Provide ongoing support and troubleshooting for deployed integrations.
Work as part of a collaborative, medium-sized multi-disciplinary team, with excellent communication skills.
Able to work on own initiative when called upon.
Proficient in aiding with design and architecture discussions and decision making.
Proficient in reverse engineering existing integrations and processes to aid with transposing them into the new data platform.
Proficient in interpreting designs and patterns provided by other team members and approved 3rd parties into workable, scalable, reliable solutions.
Proficient in creating code, processes, JavaScript, T-SQL, formulas, C#, PowerShell and other scripts as required.
Proficient in manipulating and working with a range of data formats including JSON, XML, delimited text, Excel, etc.
Familiarity with .Net development, Microsoft tools and DevOps.
Responsible for the development of Azure Logic Apps, Azure Function Apps, workflows and other components as required.
Provide architecture, configuration, administration, and functional support to expand capabilities in Microsoft 365 (Dynamics 365 is a plus).
Assist in implementing best practice for information and document management.
Gather requirements, make recommendations and estimate effort to complete work.
Offering mentoring and support to less experienced members of the team.
Interpret and design database models (SQL Server, Azure DB etc).
Good foundational knowledge of Office 365 platforms, including Azure AD, and Azure ecosystem.
Working knowledge of medallion architecture and how it fits within a data platform.
Working knowledge of Log Analytics, KQL and how it fits within monitoring solutions.
Essential Knowledge, Skills & Experience
Experience/Knowledge
Integrations development involving Azure Integrations, SQL Server / Azure SQL, APIs, SaaS & PaaS systems
CI/CD
Experience in an Agile development life cycle (SCRUM, RAD, KANBAN) using Azure DevOps or similar
Working as part of a medium-sized development team (5+)
Desirable Knowledge, Skills & Experience
Experience/Knowledge
Exposure to multi-platform integration (MS tools preferred).
On-premises SQL environments, legacy SSIS, SSRS and other SQL-related technologies employed in complex ETL or ELT patterns
Synapse Link for Dataverse
Dataverse, Data Flows, Cloud Flows, DAX and Power Platform implementations
Synchronisation methods for Synapse and Fabric from D365
FTP / SFTP configurations and services
Web services, SaaS and PaaS development

Frequently asked questions

What qualifications do I need to become a Data Engineer?
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
What types of Data Engineer jobs can I find on Haystack?
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
How can I improve my chances of getting hired as a Data Engineer?
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Are entry-level Data Engineer jobs available on Haystack?
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Can I find remote Data Engineer positions on Haystack?
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.