
Data Engineer Jobs

Overview

Looking for top Data Engineer jobs? Explore the latest data engineering opportunities on Haystack, your go-to IT job board. Whether you're skilled in ETL, data pipelines, or big data technologies, find the perfect role to advance your career today. Start your search for Data Engineer positions now!
Lead / VP AWS Data Engineer
Tenth Revolution Group
London
In office
Mid
£115k - £120k
RECENTLY POSTED
aws
processing-js
python
sql
pyspark
AWS Data Engineer - VP Level
Location: London (Canary Wharf)
Salary: (phone number removed) - (phone number removed)
A leading financial institution is seeking a Senior AWS Data Engineer to join a greenfield initiative focused on transforming market data access and utilization. This VP-level role offers strategic influence, technical leadership, and cross-functional collaboration across business units.
Key Responsibilities:
Design and maintain scalable data pipelines, warehouses, and lakes using AWS technologies.
Implement robust data governance, quality, and security practices.
Collaborate with data scientists to deploy machine learning models.
Lead or contribute to strategic planning, risk management, and stakeholder engagement.
Drive innovation through advanced analytics and research-based problem solving.
To be successful you should have:
10 years' hands-on experience with AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools.
Previous experience in implementing best practices for data engineering, including data governance, data quality, and data security.
Proficiency in data processing and analysis using Python and SQL.
Experience with data governance, data quality, and data security best practices.
Strong knowledge of market data and its applications.
Some other highly valued skills may include:
Experience with other data engineering tools and technologies.
Knowledge of machine learning and data science concepts.
Familiarity with Barclays’ data strategy and practices.
Please send me your CV if you meet these requirements and we can then arrange a call.
Data Manager
Robert Half
Manchester
In office
Mid
£65k
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Data Manager - Manchester (5 days on site) - circa £65k plus benefits
Robert Half has been retained by a leading infrastructure consulting business to recruit a Data Manager. This is an exciting opportunity for an experienced professional to play a critical role in managing data securely, effectively, and in compliance with legal and project requirements.
About the Role
The Data Manager will be responsible for:
Creation, development and management of a Common Data Environment (CDE).
Enforcing compliance with the main contractor's Integrated Management System and the end client's Information Management Requirements.
Overseeing data security protocols, including corrective and preventive actions.
Ensuring compliance with data protection and legislative requirements outlined in contract documents.
Coordinating the management and transfer of data between contractors and the wider supply chain.
Key Responsibilities
Information Systems:
Maintain compliance with the data management criteria defined in Schedule 6 of the MSA.
Data Protection and Security:
Safeguard all confidential information under the guidelines of Clause 62 and Schedule 32 of the Project Agreement.
Develop and maintain an information security programme to ensure data confidentiality, integrity, and availability.
Respond promptly to data breaches or unauthorised access incidents and maintain a detailed log of actions taken.
Records and Information Management:
Maintain up-to-date, accessible, and secure electronic records for the project.
Manage and support the population of the Common Data Environment (CDE) for all stakeholders.
Fulfil compliance requirements set forth in Schedule 24 of the Project Agreement.
Cybersecurity Oversight:
Implement measures to protect against potential security risks, including unauthorised access, malware, and other vulnerabilities.
Ensure all third parties and subcontractors comply with established information security practices.
Personnel Management:
Ensure all employees working with sensitive data have valid DBS certifications.
Maintain an up-to-date log of certifications and renewal dates.
Qualifications and Experience
Degree in Information Management, IT, Engineering, or a related field.
Relevant certification, such as Certified Information Systems Security Professional (CISSP).
At least 5 years of experience in data management within large-scale project environments (preferably in construction, infrastructure, renewables or utilities).
Proficiency in managing Common Data Environments (CDEs) and implementing robust data security protocols.
Experience with maintaining ISO 27001 compliance for data security.
What We’re Looking For
An analytical and detail-oriented individual who thrives in complex project environments. You’ll need a firm understanding of data protection legislation, strong organisational skills, and a collaborative mindset for engaging with stakeholders ranging from contractors to utility companies.
Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data:
Senior Data Engineer
Primus Connect
Edinburgh
Hybrid
Senior
£550/day - £615/day
RECENTLY POSTED
terraform
python
vault
delta-lake
pyspark
Edinburgh - 4 days per week on-site
6 months (likely extension)
£550 - £615 per day outside IR35
Primus is partnering with a leading Financial Services client who are embarking on a greenfield data transformation programme. Their current processes offer limited digital customer interaction, and the vision is to modernise these processes by:
Building a modern data platform in Databricks
Creating a single customer view across the organisation.
Enabling new client-facing digital services through real-time and batch data pipelines.
You will join a growing team of engineers and architects, with strong autonomy and ownership. This is a high-value greenfield initiative for the business, directly impacting customer experience and long-term data strategy.
Key Responsibilities:
Design and build scalable data pipelines and transformation logic in Databricks
Implement and maintain Delta Lake physical models and relational data models.
Contribute to design and coding standards, working closely with architects.
Develop and maintain Python packages and libraries to support engineering work.
Build and run automated testing frameworks (e.g. PyTest).
Support CI/CD pipelines and DevOps best practices.
Collaborate with BAs on source-to-target mapping and build new data model components.
Participate in Agile ceremonies (stand-ups, backlog refinement, etc.).
Essential Skills:
PySpark and SparkSQL.
Strong knowledge of relational database modelling
Experience designing and implementing in Databricks (DBX notebooks, Delta Lakes).
Azure platform experience.
ADF or Synapse pipelines for orchestration.
Python development
Familiarity with CI/CD and DevOps principles.
Desirable Skills
Data Vault 2.0.
Data Governance & Quality tools (e.g. Great Expectations, Collibra).
Terraform and Infrastructure as Code.
Event Hubs, Azure Functions.
Experience with DLT / Lakeflow Declarative Pipelines.
Financial Services background.
Senior Data Engineer
The Bridge IT Recruitment
Leeds
Hybrid
Senior
£55k - £65k
RECENTLY POSTED
github
aws
processing-js
python
docker
pandas
+6
Purpose of the Job:
Design, build, and maintain robust data systems and pipelines that support data storage, processing, and analysis on the Cloud.
Work with large datasets, ensuring data quality, scalability, and performance, while collaborating closely with data scientists, analysts, and other engineering teams to understand their data needs and provide them with high-quality, accessible data.
You will be responsible for ensuring that the underlying data infrastructure supports the organisation's broader data and business goals, enabling more effective data-driven decision-making.
Key Accountabilities:
Design and implement scalable, efficient, and secure data architectures, ensuring optimal data flow across systems in order to achieve high service levels of support, maintenance and development
You will own development and change projects to ensure requirements are met in the most cost-effective manner while minimising associated risk to expected standards.
Responsible for cloud data platform development, data modelling, shaping and technical planning.
You will act as a mentor, owning decision-making and the evaluation of requirement suitability, and facilitating reliable estimates, technical project management and stakeholder management within a project.
Ensure that resource requirements are understood and planned/estimated effectively against demand, including identification of additional temporary resource capability within projects
Maintain appropriate process procedures, compliance and service level monitoring, performance reporting and vendor management.
Implement best practices around data security, privacy and compliance, ensuring the team's compliance with cyber security and data protection requirements, supporting the BI Lead.
Strong stakeholder management will be required for maintaining relationships with our business users to clarify and influence requirements, including liaising with internal business departments and functions to manage the service level expected from the data team.
Collaborating with external organisations and third-party software/service suppliers for ongoing support, maintenance and development of systems.
You will be able to demonstrate that you are quality-focused, ensuring that solutions are built to an appropriate standard while balancing a drive to deliver against tight deadlines. Support in developing and implementing best practices and processes across the team, along with the BI Lead.
Influence the evolution of business and system requirements and contribute to the design of technical solutions to feed a delivery pipeline that increasingly employs Agile methods such as SCRUM and Kanban
You will be required to develop unit-tested code and then support test cycles, including post-implementation validation. You will also contribute to the transition into service and ongoing support of the applications in the area, which provides the opportunity to reduce technical debt and rationalise our technical footprint.
Mentor data engineers, supporting their professional growth and development
Outcome, Results and Key Performance Indicators:
Delivery of projects to expected time, cost and quality standards.
Excellent levels of application availability and resilience as required by business operations.
Necessary governance and control requirements defined - design, code and test standards and guidelines.
Ensure data systems comply with necessary governance and control requirements.
Internally-developed data solutions are fit for purpose and fit correctly within the data architecture. Built and tested to user requirements, performing to defined performance and capacity requirements.
Company data is secure, accurate, maintained and available according to requirements.
Technical risks and issues correctly mitigated and managed on Projects and Production support.
High-quality software delivered into production - zero critical and high defects before production release.
Dimensions of Job:
This role is part of a well-established data team and offers a great opportunity for the right candidate to hone their modern data management skills in a friendly and supportive environment.
This role requires attendance at a Leeds-based office as often as needed, with a minimum of 2 days a week.
Able to work effectively as part of a remote team.
A great opportunity for a motivated data engineer seeking a new role with a friendly, newly formed data team, able to contribute to the team's growth with their technical expertise.
Key Relationships:
Internal: wider technical teams (including apps, test, DevOps and more), project managers, business SMEs, data teams and communities, data scientists, BI Lead, Head of Data.
External: software & service suppliers, consultants.
Knowledge and Skills:
Knowledge
Broad data management technical knowledge, enabling work across the full data cycle.
Proven experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub, CI/CD.
Coding experience in Apache Spark, Iceberg or Python (Pandas)
Experience in change and release management.
Experience in data warehouse design and data modelling.
Experience managing Data Migration projects.
Cloud data platform development and deployment.
Experience of performance tuning in a variety of database settings.
Experience of Infrastructure as Code practices.
Proven ability to organise and produce work within deadlines.
Skills
Good project and people management skills.
Excellent data development skills.
Excellent data manipulation and analysis skills using a variety of tools including SQL, Python, AWS services and the MSBI stack.
Ability to prioritise and be flexible to change those priorities at short notice.
Commercial acumen.
Able to demonstrate a practical approach to problem solving.
Able to provide appropriate and understandable data to a wide ranging audience.
Well-developed and professional communication skills.
Strong analytical skills - ability to create models and analyse data in order to solve complex problems or reinforce commercial decisions.
Able to understand business processes and how this is achieved/influenced by technology.
Must be able to work as part of a collaborative team to solve problems and assist other colleagues.
Ability to learn new technologies, programs and procedures.
Technical Essentials:
Expertise across data warehouse and ETL/ ELT development in AWS preferred with experience in the following:
Strong experience in some of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB
Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have)
Experience in data warehouse design (Kimball, lakehouse, medallion and data vault) is a definite preference, as is knowledge of other data tools and programming languages such as Python and Spark, alongside strong SQL experience.
Experience in building data lakes and CI/CD data pipelines
Candidates are expected to demonstrate experience across the delivery lifecycle and to understand both Agile and Waterfall methods and when to apply them.
Experience:
This position requires several years of practical experience in a similar environment. We require a good balance of technical and personal/softer skills so successful candidates can be fully effective immediately.
Proven experience in developing, delivering and maintaining tactical and enterprise data management solutions.
Proven experience in delivering data solutions using cloud platform tools.
Proven experience in assessing the impact of proposed changes on production solutions.
Proven experience in managing and developing a team of technical experts to deliver business outcomes and meet performance criteria.
Exposure to energy markets and the energy supply industry sector.
Developing and implementing operational processes and procedures.
Senior Data Engineer
Arm
London
Hybrid
Senior
£65k - £75k
RECENTLY POSTED
aws
ansible
java
python
oauth2
Hybrid working - 1 day a week onsite in either London or Portsmouth
Permanent - Up to £75k
Overview:
We are seeking a talented Senior Data Engineer specialising in Starburst (Trino) and Dell Data Lakehouse to join our AI & Data team. You will be responsible for deploying, maintaining and optimising Starburst installations & Dell Data Lakehouse, enabling our clients to seamlessly access their data across multiple platforms. The ideal candidate will have excellent communication skills, an advanced understanding of Starburst & Dell Data Lakehouse, and proficiency with troubleshooting and root cause analysis.
Responsibilities:
Deploy and manage Starburst Enterprise/Galaxy and Dell Data Lakehouse installations, overseeing environment setup, configuration, maintenance, upgrades, and ensuring optimal performance.
Configure various server and application settings and parameters.
Integrate Starburst with various data sources to create a unified data platform.
Design and tune the container solution for performance and scalability.
Set up and configure data catalogs in various modes.
Implement robust security controls for data access, ensure compliance with data regulations, and manage potential vulnerabilities.
Coordinate with various support partners and vendor teams.
Troubleshoot and investigate server related issues and provide root cause analysis for incidents.
Perform daily server administration and monitoring, and leverage automation (such as Ansible) for efficient maintenance.
Plan and execute disaster recovery testing.
Create documentation and provide training on Starburst administration and best practices.
Qualifications:
Required Skills & Experience:
Bachelor’s degree in Computer Science, Information Systems, Data Science, Engineering or related field (or equivalent work experience).
Proven experience with Trino/Starburst Enterprise/Galaxy administration / CLI.
Implementation experience with container orchestration solutions (Kubernetes/OpenShift).
Knowledge of Big Data (Hadoop/Hive/Spark) and Cloud technologies (AWS, Azure, GCP).
Understanding of distributed system architecture, high availability, scalability, and fault tolerance.
Familiarity with security authentication systems such as LDAP, Active Directory, OAuth2, Kerberos.
Excellent Unix/Linux skills.
Familiarity with JDBC / ODBC
Preferred Skills:
Certification: Starburst Certified Practitioner.
Experience with Python and/or Java programming.
Proficient with infrastructure automation tools such as Ansible.
Knowledge of data requirements for AI and machine learning workloads.
Familiarity with Data Federation and Cached Services
Familiarity with data pipelines (a series of steps that move and transform data from one source to another for analysis and storage)
Experience with Dell Data Lakehouse administration.
Experience in Demand Driven Adaptive Enterprise (DDAE) administration
Working Conditions
This position may require evening and weekend work for time-sensitive project implementations.
Disclaimer:
This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited (“ARM”). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission.
Data Integration Engineer
JAM Recruitment Ltd
Bristol
Hybrid
Mid
£650/day - £745/day
RECENTLY POSTED
aws
graphql
nodejs
java
nosql
python
+4
SC Cleared Data Integration Engineer
Bristol or London (2 days per week onsite)
Up to £745 per day (Umbrella, Inside IR35)
Must hold live and transferable SC Clearance used within the last 12 months
About the Role
We’re looking for a highly skilled Software Engineer / Data Integration Specialist to join a mission-critical programme, ensuring data and services move seamlessly, securely, and efficiently across systems. This role blends backend software engineering with data integration expertise, offering the chance to work on high-impact projects within a collaborative, technically advanced environment.
Key Responsibilities
Integration Development - Design, build, and maintain integrations between internal systems and third-party platforms via APIs and related technologies.
Data Pipeline Engineering - Create scalable, reliable pipelines to ingest, transform, and deliver data across multiple environments.
Collaboration - Work closely with software engineers, DevOps, and product teams to translate integration requirements into effective solutions.
Troubleshooting & Optimisation - Resolve integration challenges including data mismatches, authentication issues, and performance bottlenecks.
Documentation - Produce clear documentation for integration workflows, processes, and architecture.
Monitoring & Maintenance - Implement robust logging, alerting, and performance monitoring for integrations.
Continuous Improvement - Champion enhancements to integration architectures and best practices.
Skills & Experience Required
Experience with workflow orchestration tools (e.g., Apache Airflow).
Proven track record in backend development (e.g., Node.js, Python, Java).
Strong knowledge of API design, integration methods (REST, Webhooks, GraphQL), and authentication protocols (OAuth2, JWT).
Experience with ETL/ELT tools and frameworks.
Solid database skills (SQL and/or NoSQL).
AWS cloud platform experience, particularly in integration services.
Strong debugging, problem-solving, and communication abilities.
Comfortable in agile environments and familiar with CI/CD and DevOps practices.
Understanding of data security, privacy, and compliance (e.g., GDPR).
Apply Now
Bring your software engineering and data integration expertise to a role where your work will make a measurable impact on secure, data-driven systems.
Data Engineer - Yorkshire
Tenth Revolution Group
Multiple locations
Hybrid
Mid
£35k - £45k
fabric
python
sql
dax
Data Engineer - Hybrid - Public Sector - Fixed-Term Contract
Location: Bradford (Hybrid working model)
Contract: 2-Year Fixed Term
Salary: Up to £45,000
Work Type: Full-time
Are you a skilled Data Engineer looking to make a meaningful impact in a purpose-driven organisation?
This is a unique opportunity to contribute to life-saving decisions through data. You’ll be part of a forward-thinking team that values innovation, collaboration, and integrity. You’ll enjoy flexible working arrangements, including hybrid options, and a comprehensive benefits package.
What You’ll Be Doing:
Designing and building scalable data pipelines using Microsoft Fabric and Azure
Creating robust data models and architectures to support strategic decision-making
Collaborating with analysts and data quality teams to ensure clean, trusted data
Integrating data from operational systems, APIs, and cloud platforms
Championing data governance, security, and ethical data use
What We’re Looking For:
Proven experience in data engineering (Microsoft Fabric, Azure, SQL, DAX or Python)
Strong understanding of data modelling and performance optimisation
Ability to communicate complex data concepts to non-technical stakeholders
Benefits:
Flexitime
Free Parking
Pension Scheme
Employee Assistance Programme
On-site Gym Access
Please Note: This is a role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check.
Contact me: (url removed)
Data Engineer - Remote
Tenth Revolution Group
Multiple locations
Fully remote
Mid
£50k - £60k
sql
ssis
ssas
dax
A leading financial services organisation are looking for a Data Engineer to join their team on a fully-remote basis - as such, this role is open to candidates across the UK.
Joining their Data Engineering team, you’ll work with Microsoft technologies to build high-quality enterprise-level solutions that enable data-driven decision making, and empower employees to deliver first-class customer experiences.
A lot of their current project work is on-premise, so you’ll be using the likes of SQL Server, the BI Stack (SSIS, SSAS, SSRS) and Power BI.
That being said, they’re starting to explore Azure technologies, so you also have the chance to gain hands-on skills with the likes of Data Factory, Synapse etc. going forward.
They pride themselves on being a people-first business, with a focus on personal and professional growth whilst supporting a healthy work-life balance.
Requirements:
Hands-on experience with SQL, SSIS, SSRS and SSAS
Experience developing reporting solutions in Power BI with use of DAX
Strong understanding of Data Warehousing principles
Experience working in Agile environments
Experience working in regulated environments, ideally financial services
Knowledge of Azure data platform technologies would be desirable but not essential
Benefits:
Salary up to £60,000 depending on experience
Discretionary bonus
25 days annual leave plus bank holidays
Holiday purchase scheme
Pension
Private medical and dental insurance
Health cash plan
Critical illness insurance
Health assessment
Life assurance
Travel insurance
Please Note: This is a role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check.
Tenth Revolution Group / Nigel Frank are the go-to recruiter for Data and AI roles in the UK, offering more opportunities across the country than any other. We’re the proud sponsor and supporter of SQLBits, and the London Power BI User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at (url removed)
Data Engineer - SaaS Start-up
Tenth Revolution Group
Multiple locations
Hybrid
Mid
£50k - £75k
typescript
python
airflow
sql
pyspark
dbt
A rapidly growing company in the B2B Software as a Service (SaaS) space are looking for a Deployed Engineer to join their expanding team in London (hybrid working - 2-3 days a week in their modern office space).
Their product is a platform that acts as a digital twin of a business - integrating internal and external data from a variety of sources to act as a single source of truth, which powers actionable insights at scale. When combined with AI algorithms, the platform drives strategic decision-making, and enables planning and effective execution, allowing businesses to achieve their targeted state. They are a true pioneer in their field!
They believe the future of B2B SaaS is about delivering tailored, dynamic solutions for their clients, rather than implementing static tools. This is where you come in - you’ll be working within a team who believe value is created not just in the codebase, but in the implementation layer - making this role ideal for someone who thrives in dynamic, customer-facing environments.
The role:
Adapt and deploy a powerful data platform to solve complex business problems
Design scalable generative AI workflows using modern platforms like Palantir AIP
Execute advanced data integration using PySpark and distributed technologies
Collaborate directly with clients to understand priorities and deliver outcomes
What We’re Looking For:
Strong skills in PySpark, Python, and SQL
Ability to translate ambiguous requirements into clean, maintainable pipelines
Quick learner with a passion for new technologies
Experience in startups or top-tier consultancies is a plus
Nice to Have:
Familiarity with dashboarding tools, Typescript, and API development
Exposure to Airflow, DBT, Databricks
Experience with ERP (e.g. SAP, Oracle) and CRM systems
What’s On Offer:
Salary: £50,000 - £75,000 + share options
Hybrid working: 2-3 days per week in a vibrant Soho office
A highly social culture with regular team events and activities
Work alongside seasoned tech and business leaders
Be part of a mission-driven company with a strong social impact ethos
If you’re excited by the idea of working at the intersection of AI, data, and enterprise transformation - and want to be part of a fast-scaling, values-led team - we’d love to hear from you.
Please Note: This is a role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check.
Tenth Revolution Group / Nigel Frank are the go-to recruiter for Data and AI roles in the UK, offering more opportunities across the country than any other. We’re the proud sponsor and supporter of SQLBits, and the London Power BI User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at (url removed)
AWS Data Engineer - Permanent
Tenth Revolution Group
Multiple locations
Hybrid
Mid
£50k - £60k
aws
docker
Data Engineer - Hybrid (London-Based)
Full-Time
A leading creative technology company is seeking a skilled Data Engineer to join its data team. This role is ideal for someone passionate about building scalable data infrastructure and enabling data-driven decision-making across the business.
Role Overview
As a Data Engineer, you’ll be responsible for designing and maintaining robust data pipelines and models that support analytics and reporting. You’ll work with diverse datasets and collaborate with cross-functional teams to ensure data is accurate, accessible, and actionable.
Key Responsibilities
Build and maintain scalable ETL/ELT pipelines.
Design data models and schemas to support analytics and reporting.
Integrate data from APIs, internal systems, and streaming sources.
Monitor and ensure data quality and availability.
Collaborate with analysts, engineers, and stakeholders to deliver clean datasets.
Optimise data architecture for performance and reliability.
Share best practices and contribute to team knowledge.
Required Skills
3+ years in a data engineering role.
Proficient in SQL and Python.
Strong experience with AWS services (e.g., Lambda, Glue, Redshift, S3).
Solid understanding of data warehousing and modelling (star/snowflake schemas).
Familiarity with Git, CI/CD pipelines, and containerisation (e.g., Docker).
Ability to troubleshoot BI tool connections (e.g., Power BI).
Desirable Skills
Experience with Infrastructure as Code (e.g., CloudFormation).
Please send me a copy of your CV if you meet the requirements
Data Engineer
CPS Group (UK) Limited
Cardiff
Hybrid
Mid
£55k - £62k
sql
Role: Data Engineer
Location: Cardiff (Hybrid - 3 days WFH)
Salary: Up to £62,000 + 10% Bonus + Excellent Benefits
The Role
CPS Group are partnering with a leading organisation to hire an experienced Data Engineer. You’ll design, build, and optimise data pipelines and integrations, supporting ERP migration projects and shaping a centralised enterprise data platform. This is a key role ensuring reliable, high-quality data for business intelligence and decision-making.
Key Responsibilities:
Build and maintain scalable data pipelines (SQL / Azure).
Support ERP migration and system integrations.
Ensure data quality, validation, and performance.
Collaborate with BI and Data teams to deliver business insights.
Apply DataOps best practices (version control, CI/CD, documentation).
Skills & Experience:
Strong SQL development and optimisation skills.
Experience with MS SQL Server, Azure SQL, Azure Data Factory.
Proven track record in data engineering, ETL/ELT, and system migrations.
Knowledge of data modelling, governance, and quality assurance.
Excellent communication and problem-solving abilities.
Benefits:
Up to £62,000 + 10% Bonus
Private Healthcare
6% Employer Contribution Pension
25 days holiday + bank holidays
Flexible hybrid working
Free Parking
Life Assurance (3x Salary)
If you are interested in finding out more, please email a copy of your CV and a suitable time to arrange a call.
By applying to this advert you are giving CPS Group (UK) Ltd authority to hold and process your data for this specific role and any other roles we may deem suitable to you over time. We will not pass your data to any third party without your verbal or written permission to do so. All incoming and outgoing calls are recorded for training and compliance purposes. CPS Group (UK) Ltd is acting as an Employment Agency in relation to this vacancy. Our new privacy policy can be found here: (url removed)
Business Intelligence Strategist / Lead Data Analyst
Red Kite Recruitment Group
Wakefield
In office
Leader
£60k - £70k
looker
sql
tableau
metabase
NATIONAL BUSINESS REQUIRES PROACTIVE BUSINESS INTELLIGENCE STRATEGIST / LEAD DATA ANALYST TO SPEARHEAD DATA AND REPORTING STRATEGY, BUILDING A DATA-DRIVEN CULTURE FROM THE GROUND UP WITH A DATA WAREHOUSE-FIRST MINDSET.
TITLE: Business Intelligence Strategist / Lead Data Analyst
SALARY: £(phone number removed)
LOCATION: Office Based near Wakefield
You may have been a: Data Officer, Data Strategist, Data Analytics Manager, Business Intelligence Manager, Data Governance Manager, Reporting and Analytics Manager, Data Operations Manager, Head of Data Strategy, Business Intelligence Strategist, Data Insights Manager, Enterprise Data Manager, Data Program Manager, Analytics and Reporting Lead, Data Solutions Architect
RESPONSIBILITIES: Business Intelligence Strategist / Lead Data Analyst
Engage with non-technical stakeholders to identify business challenges, define requirements, and create effective reporting solutions.
Lead the transition from reporting on production databases to a robust data warehouse as the single source of truth.
Design and build new ETL/data pipelines to replace outdated nightly rebuild processes.
Oversee the selection of a BI platform and manage the migration of reports, integrating them into internal systems (300 users) and customer-facing portals (thousands of users).
Shift the reporting culture from Excel-heavy, tabular outputs to streamlined dashboards and KPI-focused insights.
Develop a range of reports, from operational to advanced dashboards, including financial and forecasting models.
Establish data standards and governance to ensure consistency and reliability across all reports.
Explore AI/ML opportunities to enhance analytics capabilities.
Mentor and guide a junior data analyst to support their development and contributions.
ESSENTIAL EXPERIENCE: Business Intelligence Strategist / Lead Data Analyst
Advanced, hands-on SQL expertise.
Proven experience designing and implementing data warehouses and ETL pipelines.
Demonstrated ability to work with non-technical stakeholders to define requirements and deliver actionable insights.
Comfortable navigating ambiguous or evolving requirements.
Pragmatic approach, prioritizing timely, effective solutions over perfectionism.
Experience mentoring or managing team members.
PREFERRED EXPERIENCE: Business Intelligence Strategist / Lead Data Analyst
Experience with SSRS
Knowledge of embedded reporting solutions.
Familiarity with API-driven data integration.
Proficiency with modern BI tools (e.g., Power BI, Tableau, Looker, Metabase).
Understanding of AI/ML applications in business intelligence.
Data Engineer
Spectrum IT Recruitment
Fareham
In office
Mid
£40k - £60k
chef
aws
looker
Data Integration, ETL pipelines, BigQuery
Fareham, Hampshire. This role is to work in the office full time.
Salary circa £60,000 plus benefits
In 2023, cybercrime cost UK businesses an estimated £21 billion. But don't think it's just the big corporates at risk: the average cybercrime value in the UK is just over £10,000, demonstrating that personal finance and small business cybercrime is rife.
Would you like to be part of the solution?
We are working with an award-winning leader in the field of cyber security. They are on a mission to build a safer digital world for you and your future self! They have built a suite of innovative products designed to offer superior protection against a broad spectrum of online threats.
The role of Data Engineer is a new position in the team demonstrating the increasing demand for a robust cyber solution.
Working with a talented software team, you will be helping to build and maintain my client's data infrastructure, with a specific emphasis on BigQuery and Looker for business intelligence and reporting. You will be responsible for developing and managing ETL pipelines that seamlessly integrate data from various sources, transforming it into a clean, structured, and easily accessible format.
On top of a competitive salary (approx £60k), the company offers some fantastic financial and lifestyle benefits including: free access to a local gym, an onsite chef (free cooked breakfast & lunch!), childcare vouchers, a cycle to work scheme, pension, BUPA healthcare, and investment in training and personal development.
Key Responsibilities
Data Integration and ETL
Data Warehousing and Modelling
Business Intelligence and Reporting
Data Quality and Governance
Technical Skills:
ETL
Looker
BigQuery
Cloud technologies such as GCP or AWS
SCSS / The company offers a competitive salary, exposure to new technology, career progression and a great working environment with a talented team.
Interviews are being held within the next couple of weeks, so please get in touch via email at (url removed) or give me a call on (phone number removed)
Spectrum IT Recruitment (South) Limited is acting as an Employment Agency in relation to this vacancy.
Lead Data Engineer
Retelligence
London
In office
Leader
£100k - £110k
processing-js
kafka
python
sql
Salary/Rate: £100,000 - £110,000 per annum + Bonus
Location: North London
Company: Retelligence
About Retelligence
Retelligence is partnering with a high-growth, forward-thinking organization that specializes in digital innovation and marketing across international markets. The company is on an exciting journey, rapidly scaling its capabilities and leveraging advanced technology to deliver cutting-edge solutions. Join a dynamic team within a business that values innovation, supports professional development, and offers exceptional career progression.
The Role
Retelligence is seeking a Lead Data Engineer to take a hands-on role in designing and delivering robust, real-time data pipelines and infrastructure in a Google Cloud Platform (GCP) environment. The company is particularly interested in candidates with strong expertise in SQL. As the Lead Data Engineer, you'll play a critical role in shaping their data architecture and driving transformation. You'll partner closely with engineering, product, and analytics teams to ensure efficient, high-performance data systems that enable the business to thrive in a fast-paced environment.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines and infrastructure in a GCP environment.
Integrate multiple data sources to ensure seamless data flow across the organization.
Build and optimize data models for querying and analytics use cases.
Develop fault-tolerant, highly available data ingestion and processing pipelines.
Continuously monitor and improve pipeline performance for low-latency and high-throughput operations.
Ensure data quality, integrity, and security across all systems.
Implement effective monitoring, logging, and alerting mechanisms.
Collaborate with product, engineering, and analytics teams to deliver tailored solutions that meet business needs.
About You
Strong hands-on experience in data engineering with expertise in Python.
Proven track record of building and managing data pipelines.
In-depth experience with Google Cloud Platform (GCP) and its associated tools for data ingestion and processing.
Familiarity with distributed streaming platforms such as Kafka or similar technologies.
Advanced knowledge of SQL.
Experience with data orchestration tools.
Ability to optimize and refactor data pipelines for improved performance and scalability.
Strong problem-solving skills and the ability to thrive in a collaborative, fast-paced environment.

Frequently asked questions

What qualifications do I need to become a Data Engineer?
Typically, a Data Engineer should have a strong background in computer science or related fields, proficiency in programming languages like Python or Java, and experience with data warehousing, ETL processes, and big data technologies such as Hadoop or Spark.
What types of Data Engineer jobs can I find on Haystack?
Haystack features a wide range of Data Engineer positions, including roles in startups, large enterprises, and remote opportunities. You can find jobs specializing in cloud data engineering, real-time data processing, data pipeline development, and more.
How can I improve my chances of getting hired as a Data Engineer?
To improve your chances, tailor your resume to highlight relevant skills and projects, gain hands-on experience with popular data tools, contribute to open-source projects, and stay updated with the latest trends in data engineering.
Are entry-level Data Engineer jobs available on Haystack?
Yes, Haystack lists entry-level Data Engineer roles suitable for recent graduates or professionals transitioning into data engineering, as well as internships and junior positions to help you start your career.
Can I find remote Data Engineer positions on Haystack?
Absolutely. Haystack offers many remote and flexible Data Engineer job listings to suit your preferred working style and location.