
Remote Data Engineer Jobs

Overview

Find top remote Data Engineer jobs with Haystack, your go-to IT job board for flexible, work-from-anywhere opportunities. Explore the latest openings in data engineering, build scalable data pipelines, and work with cutting-edge technologies—all from the comfort of your home. Start your remote Data Engineer career today!
Business Systems Developer
RPS Group Plc
Abingdon
Fully remote
Junior - Mid
Private salary
RECENTLY POSTED
javascript
csharp
java
sql
vault
microsoft-azure
We’re looking for a Business Systems Developer to join our EMEA Business Systems team at Tetra Tech. You’ll be part of a collaborative, international team that values initiative, quality, and clear communication across cultures and time zones.

About the role
The role encompasses a broad range of technologies, and you’re not expected to have expertise in everything. It will suit a self-starter who is keen to learn and improve. Although this is a remote-first role, the position is based in the United Kingdom, with the main office in Abingdon, Oxfordshire. Occasional travel to the Abingdon office or other UK offices may be required for team meetings and collaboration sessions.

About You
As a Business Systems Developer, you’ll be responsible for designing, developing and supporting integrations between our global Oracle EBS system and various internal business systems and external services. You’ll support a cross-platform suite of services relying on both the Oracle and Microsoft Azure ecosystems, using technologies including Data Factory, SQL, Java and C#. You’ll work on new initiatives, including application development using Oracle APEX and data analysis in Oracle Analytics Cloud and AutoML, as well as supporting existing systems.

Your Responsibilities
Assist in developing and maintaining integrations between Oracle EBS and internal/external systems using Java and Azure Data Factory
Assist in the development of business applications primarily using Oracle APEX as well as JavaScript frameworks
Support existing applications, services and infrastructure
Write and maintain high-quality technical documentation
Collaborate closely with stakeholders (business leads, IT functional teams) across multiple countries
Provide 2nd and 3rd line support for incidents and service requests
Skills & Experience
Solid experience with SQL and software development processes using Java
Experience with Microsoft Azure services, in particular Data Factory, SQL Server, Logic Apps, Storage Accounts and Key Vault
Experience or working knowledge of JavaScript and knowledge of any JavaScript framework
Experience with Oracle Database
Solid experience with Azure DevOps and/or other source control systems
Strong analytical and communication skills
Keen desire to learn and develop skills
Comfortable working in a remote environment with distributed teams
Knowledge of other technologies, including C#/.NET, Oracle APEX, Oracle Analytics Cloud and SQL Server Integration Services, would be advantageous
RPS, a Tetra Tech company
RPS, part of Tetra Tech since January 2023, is a global firm that defines, designs, and manages projects in urbanisation, natural resources, and sustainability. As part of Tetra Tech’s 28,000-strong team across 550 offices in over 120 countries, we deliver solutions that create lasting value in an increasingly urbanised and resource-scarce world. By leveraging our global expertise, we enable our clients to develop winning solutions for their clients and communities.

As a Tetra Tech company, RPS is proud to provide market-leading development and project opportunities for our people, supporting their growth while addressing the challenges that matter. Our people drive our success, and this is where you come to build a career.

What happens next?
If we feel you are a good match, you will be invited to attend a competency-based interview. All applications will be considered. Ready to apply? Please have your CV ready and continue with your application online. #LI-JP1
Principal Data Engineer
Places for People
United Kingdom
Fully remote
Senior
Private salary
RECENTLY POSTED
python
airflow
sql
composer
At Places for People, we hire People, not numbers! So, if you like the sound of one of our jobs, please apply - you could be just who we’re looking for! Of course, experience and track record are important, but we’re more interested in hiring someone who embodies our People Promises. That’s someone who does the right thing, is enthusiastic and motivated to grow, believes in Community spirit, is respectful and enjoys their work. As the UK’s leading Social Enterprise, we don’t discriminate based on any protected attribute. In fact, we’re dedicated to creating inclusive and thriving Communities for both our Customers and Employees. So, what are you waiting for? Join a Community that cares about you!

More about the team
The Data and Platform Engineering team is the foundation of the Data Office function. Responsible for designing, building, and maintaining PfP’s data platform, we extract data from source systems, transform it into a usable format, load it into consumer models and marts, and build and manage the infrastructure to do all this work.

Data Engineering is transforming the way PfP consumes data. Having transitioned from on-premise to Google Cloud, we are in the process of building a leading-edge Data Mesh platform. This is an exciting time to join a growing business function, with the opportunity to make your mark on the architecture of the platform and the development of the data engineering function.

More about your role
The Principal Data Engineer is a senior technical leader who drives the engineering strategy, architecture, and best practices across product domain squads. This role is pivotal in enabling decentralised data ownership while ensuring consistency, scalability, and interoperability across the data mesh. Key responsibilities include:
Technical leadership across product domains
Architecture and design
Mentorship and capability building
Cross domain interoperability
Governance and compliance enablement
Innovation and strategic influence
With a solid understanding of Google Cloud Platform, the Principal Data Engineer is responsible for ensuring that the design and build of all data processes on the data platform are robust, performant, and compliant. This includes data ingestion, data quality and integrity, transformation, security and encryption, batch management, monitoring, alerting and cost control.

In addition to leading data processing, the Principal Data Engineer will help design and build the Data Mesh, including data modelling and the processing of data from raw through the semantic layers. They will identify opportunities for automation and process improvement, coach and mentor data engineers, set coding standards and best practices, implement and document data integrity and quality checks, optimise queries, and facilitate data engineering collaboration across the team.

The Principal Data Engineer will work hand in glove with the Principal Data Platform Engineer and the Data Domain Architect to ensure that the data platform and data pipeline design are optimised and reliable within Google Cloud Platform, documenting the approach and explaining the solution to engineers and non-technical business users.

More about you
You will have an extensive cloud data engineering background with deep expertise in distributed systems, cloud platforms and modern data stacks. You will have a strong understanding of domain-driven design, data mesh and product thinking.
You will be an excellent communicator and collaborator across technical teams. Having worked on multiple projects in the cloud, you have hands-on experience with many of the tools and technologies on offer, and you embrace and learn new technologies quickly. You have a very clear view of what good looks like and can formulate plans to deliver a target state, working closely with managers and engineers to deliver that vision.

You will have multiple years’ experience working in GCP, with good knowledge across the platform and deep knowledge of core processing and orchestration products such as BigQuery, Dataflow, Data Fusion, Datastream, Cloud Functions, Dataproc and Airflow/Composer. You will have excellent problem-solving skills, a rigorous approach to code checks and peer reviews, and the strength of character to drive high standards in the team. You will be able to manage and participate in the full development lifecycle of data products.

You will have held a leading role in a Data Engineering function, with responsibility for directing the efforts of other data engineers through the design, build and deployment of complex data solutions. This includes driving the implementation and adoption of CI/CD. You will be self-motivated with excellent leadership qualities, capable of driving innovation and mentoring data engineers.

Experience & Skills
A proven track record within Data Engineering
Experience in a Lead / Principal Engineer role
Experience with a cloud data platform
GCP experience and associated tech stack
Strong understanding of Data Mesh principles (direct experience beneficial)
Technical mentoring / coaching skills
Extensive experience with Data Lake / Warehouse solutions
Strong proficiency in multiple languages with SQL and Python as must haves
In-depth knowledge of query optimization techniques and experience in fine-tuning complex queries.
Strong understanding of Data Governance including Data Dictionaries, MDM, Lineage, Data Legislation, and the handling of PII
Exceptional communication skills and the ability to work collaboratively with cross functional teams
Experience of Agile / Scrum / SDLC
We are a large, diverse and ambitious business, which will give you all the challenge you could wish for. We know that there’s always more we can do to make you smile; that’s why we offer a comprehensive benefits package with each role. Yours will include:
Competitive salary, with a salary review yearly
Pension with matched contributions up to 7%
Excellent holiday package – 35 days annual leave with the option to buy or sell leave
Cashback plan for healthcare costs – up to £500 saving per year
A bonus scheme for all colleagues at 2%
Training and development
Extra perks including huge discounts and offers from shops, cinemas and much more.
What’s next?
If you meet the criteria and are ready to make the next step in your career, then click apply. You will be redirected to our careers site, where you can discover more about the role, read a full job description and apply directly to us.

As part of our commitment to diversity and inclusion, we offer a guaranteed interview to candidates who are disabled, neurodiverse, or have served in the Armed Forces, provided they meet the essential criteria for the role. If you would like to be considered under this scheme, please indicate this in your application. We are dedicated to creating a supportive and accessible recruitment process for all. If you require any reasonable adjustments to support your application or interview experience, please let us know. We’re happy to work with you to ensure you have the opportunity to perform at your best.

If you are a Places for People customer and you’re looking for support with your application, please contact our skills and employment team on skillsemployment@placesforpeople.co.uk. If you are a recruitment agency, please note we operate a PSL and do not take cold calls.

Safeguarding
At Places for People, safeguarding is everyone’s responsibility. We are committed to creating safe communities for our customers and colleagues by protecting children, young people, and adults at risk from harm, abuse, and neglect. We follow robust safeguarding policies and procedures, ensuring all employees, volunteers, and contractors uphold the highest standards of safeguarding and accountability. Our recruitment process includes pre-employment checks, including Disclosure and Barring Service (DBS) checks where applicable, to promote a safe and secure working environment. By joining Places for People, you are expected to contribute to our safeguarding culture, following our policies and reporting concerns to protect those in our communities.
Data Modeller (Finance)
DGH Recruitment Ltd
Multiple locations
Remote or hybrid
Senior - Leader
£70,000 - £76,000
RECENTLY POSTED
dot-net
python
sql
Data Modeller (Finance) / Data Manager - Modelling

Summary: An exciting opportunity for a qualified, self-motivated, and highly organised individual to lead financial data modelling activities. The role involves managing a small team, driving improvements in financial data modelling processes, and supporting strategic decision-making.

Key Responsibilities
Team Leadership: Supervise and develop a finance team to deliver accurate, high-quality models and reports.
Model Development & Maintenance: Design, build, and maintain financial data models for planning, forecasting, and performance evaluation.
Technology Integration: Collaborate with IT and security teams to adopt modern tools and platforms.
Process Improvement: Streamline modelling processes, enhance automation, and ensure data accuracy.
Governance & Compliance: Maintain documentation, version control, and audit readiness.
Stakeholder Engagement: Provide financial insights and reports to support business decisions.

What You’ll Bring
Technical Skills: Advanced Excel and experience with BI tools; knowledge of VBA, SQL, Python, .NET, and related technologies.
Leadership: Proven ability to manage and motivate teams.
Analytical & Commercial Acumen: Strong financial analysis and strategic thinking.
Communication: Ability to present complex information clearly to senior stakeholders.
Project Management: Experience delivering projects from planning to implementation.

In accordance with the Employment Agencies and Employment Businesses Regulations 2003, this position is advertised based upon DGH Recruitment Limited having first sought approval of its client to find candidates for this position. DGH Recruitment Limited acts as both an Employment Agency and an Employment Business.
SME Microsoft Fabric Engineer - End-to-End Delivery
IO Associates
UK
Fully remote
Mid - Senior
Private salary
RECENTLY POSTED
fabric
python
sql
contentful
pyspark
snowflake
Contract Opportunity: SME Microsoft Fabric Engineer - End-to-End Delivery

We’re supporting a major organisation on a high-impact project that requires an experienced Microsoft Fabric Engineer who has delivered Fabric solutions end-to-end, from Lakehouse pipelines to Silver/Gold medallion models and Power BI-ready datasets. If you’ve worked extensively across Fabric’s engineering stack and can hit the ground running, this one’s for you.

Location: Remote (UK)
Day Rate: Negotiable
Contract: Short-term engagement with potential extension

You’ll need production experience with:
Microsoft Fabric Lakehouse (Delta tables)
Synapse Data Engineering (Spark notebooks)
Synapse Data Warehouse (SQL)
Fabric Data Factory pipelines
Building Silver/Gold (Medallion) data models
Strong SQL + Python (ideally PySpark inside Fabric)
Transforming GA4 BigQuery export data
Ingesting external APIs/SaaS sources (ideally Contentful or similar)
Preparing Power BI-ready data models (fact/dim schemas)
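The Silver/Gold medallion modelling and Power BI-ready fact tables this role asks for can be illustrated with a toy sketch. This is plain Python with hypothetical data and column names, purely to show the shape of the transformation; in Fabric this step would typically be a Spark notebook writing Delta tables:

```python
# Illustrative Silver -> Gold medallion step (hypothetical GA4-style data).
from collections import defaultdict

# "Silver" layer: cleaned, typed event rows, e.g. from a GA4 BigQuery export.
silver_events = [
    {"date": "2024-05-01", "page": "/donate", "sessions": 3},
    {"date": "2024-05-01", "page": "/home", "sessions": 5},
    {"date": "2024-05-02", "page": "/donate", "sessions": 4},
]

def build_gold_daily_sessions(events):
    """Aggregate Silver events into a Gold fact table keyed by date."""
    totals = defaultdict(int)
    for row in events:
        totals[row["date"]] += row["sessions"]
    # Gold layer: one row per date, ready to feed a Power BI fact table.
    return [{"date": d, "total_sessions": n} for d, n in sorted(totals.items())]

gold = build_gold_daily_sessions(silver_events)
```

The same grain decision (one row per date in Gold, raw events kept in Silver) is what makes the downstream fact/dim schemas mentioned above straightforward to build.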
Strongly Preferred:
Familiarity with Azure data services (ADF, ADLS, Azure SQL)
Experience joining web analytics with CMS/CRM data
Understanding of data governance + PII handling in analytics platforms
Nice to Have:
dbt (Core or Cloud)
Experience with modern lakehouse stacks (Databricks, Snowflake)
Background in nonprofit/charity analytics (not essential)
If you’ve delivered Microsoft Fabric projects end-to-end and can build clean, scalable data platforms using GA4 and CMS data, I’d love to hear from you.Please reply with your most up-to-date CV!
Power BI Developer - Nottingham / Remote £65k
Akkodis
Nottingham
Fully remote
Mid - Senior
£55,000 - £65,000
RECENTLY POSTED
fabric
dax
My prestigious client is looking for a Power BI enthusiast to come in and play a key role in shaping their ambitious Data strategy. They are incredibly well-known within their sector, with a flawless reputation and an enviable portfolio of clients. If you’re looking to join a company that’s investing heavily in its technology transformation and strategic growth agenda, look no further!

As a business, they are in a great position, with income and growth continually rising year on year. Growth has been both organic, through good placing and hard work in their market, and through some acquisitions too. They have an enviable portfolio of clients, including some huge corporate and public sector clients, and are recognised as a leader in their market on both a local and national scale.

There is currently a huge focus on their technology strategy, and with an ambitious roadmap in place for 2026, they’re now looking for their first dedicated Power BI Developer to take the reins on the design and ongoing development of a range of best-in-class BI solutions. You will be at the top of your game, with a proven track record in using enterprise-level Power BI in a professional services environment. There is also scope for you to get involved in high-level design and complex architecture to help truly shape their strategy.

Essentially, you will take the lead in designing their BI solutions moving forward - think plenty of data modelling, DAX and striving to deliver high-quality reporting solutions across various departments. You’ll be the “go-to person” for all things BI and reporting - upskilling the existing team and inspiring better ways of working! Naturally, you’ll have solid knowledge across Power BI Desktop and Power BI Service, Power Query and DAX. This is key, as you’ll join as the sole Power BI expert in the team, and your remit will be to help upskill the wider team too.
Any Microsoft Fabric exposure for scalable datasets would be hugely desirable, as they have a vision to implement Fabric into the business very soon.

What I really like about this role is that it is a newly created and autonomous position - very much a “blank canvas”, a role you can make your own and one where you can inspire others whilst shaping the company’s long-term Data strategy. This is your chance to work with an awesome Manager who has a great vision for the company’s Data journey, and you’ll play a key role alongside him in shaping the way in which the company utilises Data. It’s the type of environment where your voice will be both heard and valued - they have a great reputation for treating their staff incredibly well. It’s, essentially, a lovely place to work: a close-knit and collaborative team where you’ll be truly supported from day one.

We are flexible on ways of working, but you must be open to visiting their Nottingham-based HQ 1-2 times a month (or whenever needed!), and you can work the rest from home. Salary up to £65k depending on experience, plus an awesome benefits package including a bonus scheme, great pension and much more!

I’m looking to shortlist this role ASAP, so if you’re interested, please apply today or contact me directly on (phone number removed) or laura. removed) for immediate consideration.

Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world.
For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law.Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers.By applying for this role your details will be submitted to Modis International Ltd and/ or Modis Europe Ltd. Our Candidate Privacy Information Statement which explains how we will use your information is available on the Modis website.
Oracle Batch Developer
Experis
London
Remote or hybrid
Mid - Senior
£510/day
RECENTLY POSTED
gitlab
Role: Oracle Batch Developer
Location: UK remote, with occasional client site visits in either Newcastle or Manchester
Duration: 6 months
Day rate: £510, Umbrella only
Candidates are required to have been a UK resident for a minimum of 5 years.

About the Role
We are seeking an experienced Oracle Batch Developer to support a major Government department operating a highly integrated, mission-critical Oracle 19c OLTP environment. This role supports backend workloads that underpin essential business processes across multiple government departments and local authorities. You will play a key role in maintaining, optimising, and modernising the batch processing estate, ensuring the performance, reliability, and stability of a system that supports thousands of users and vital public services.

Key Responsibilities
Develop, maintain, and optimise Pro*C, PL/SQL, and shell script-based batch processes.
Support a high-volume OLTP environment across custom Oracle applications and Oracle E-Business Suite.
Analyse and modernise legacy code, working within multi-disciplinary teams to understand requirements, propose technical options, and prototype solution approaches.
Deliver robust, performant code and conduct performance testing in high-volume, transaction-driven environments.
Contribute to the ongoing enhancement of a mission-critical platform used across numerous government functions.
Essential Skills & Experience
Strong hands-on experience with:
Pro*C
PL/SQL
Shell scripting (bash/ksh)
Experience developing and supporting batch workloads and high-volume OLTP systems within Oracle custom or Oracle Apps/EBS environments.
Proven analytical capability, including:
Legacy code analysis
Working with multidisciplinary teams
Providing technical input into solution design
Rapid prototyping of solution approaches
Ability to deliver highly performant, resilient, and maintainable code.
Experience with performance testing in demanding, high-transaction environments.
Desirable Skills
Experience with Oracle encryption, key management, and DBMS_CRYPTO.
Exposure to containerising workloads and migrating services to AWS EKS.
Experience building and maintaining CI/CD pipelines in GitLab.
Knowledge of AWS services.
Operational experience with:
Autosys (job scheduling)
Dynatrace (observability)
Prometheus (monitoring)
Senior Snowflake Data Engineer - Remote - £competitive
Tenth Revolution Group
London
Fully remote
Senior
£75,000 - £85,000
RECENTLY POSTED
snowflake
aws
git
kafka
python
Senior Snowflake Data Engineer - Remote - £competitive

About the Role
We are looking for an experienced Senior Snowflake Data Engineer to join a dynamic team working on cutting-edge data solutions. This is an exciting opportunity to design, build, and optimise high-performance data pipelines using Snowflake, dbt, and modern engineering practices. If you are passionate about data engineering, test-driven development, and cloud technologies, we’d love to hear from you.

Key Responsibilities
Design, develop, and optimise scalable data pipelines in Snowflake.
Build and maintain dbt models with robust testing and documentation.
Apply test-driven development principles for data quality and schema validation.
Optimise pipelines to reduce processing time and compute costs.
Develop modular, reusable transformations using SQL and Python.
Implement CI/CD pipelines and manage deployments via Git.
Automate workflows using orchestration tools such as Airflow or dbt Cloud.
Configure and optimise Snowflake warehouses for performance and cost efficiency.
Required Skills & Experience
7+ years in data engineering roles.
3+ years hands-on experience with Snowflake.
2+ years production experience with dbt (mandatory).
Advanced SQL and strong Python programming skills.
Experience with Git, CI/CD, and DevOps practices.
Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure).
Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation.
Preferred Qualifications
Snowflake certifications (SnowPro Core or Advanced).
Experience with dbt Cloud and custom macros.
Exposure to real-time streaming (Kafka, Kinesis).
Familiarity with data observability tools and BI integrations (Tableau, Power BI).
What We Offer
Opportunity to work with modern data technologies and large-scale architectures.
Professional development and certification support.
Collaborative, engineering-focused culture.
Competitive salary and benefits package.
Interested? Apply now with your CV highlighting your Snowflake, dbt, and DevOps experience.
DataBricks Data Engineer
Tenth Revolution Group
London
Fully remote
Mid - Senior
£400/day - £450/day
RECENTLY POSTED
Data Engineer (Databricks & Azure) - 3-Month Rolling Contract

Rate: £400-£450 per day
Location: Remote
IR35 Status: Outside IR35
Duration: Initial 3 months (rolling)

About the Company
Join a leading Databricks Partner delivering innovative data solutions for enterprise clients. You’ll work on cutting-edge projects leveraging Databricks and Azure to transform data into actionable insights.

About the Role
We are seeking an experienced Data Engineer with strong expertise in Databricks and Azure to join our team on a 3-month rolling contract. This is a fully remote position, offering flexibility and autonomy while working on high-impact data engineering initiatives.

Key Responsibilities
Design, develop, and optimize data pipelines using Databricks and Azure Data Services.
Implement best practices for data ingestion, transformation, and storage.
Collaborate with stakeholders to ensure data solutions meet business requirements.
Monitor and troubleshoot data workflows for performance and reliability.
Essential Skills
Proven experience with Databricks (including Spark-based data processing).
Strong knowledge of Azure Data Platform (Data Lake, Synapse, etc.).
Proficiency in Python and SQL for data engineering tasks.
Understanding of data architecture and ETL processes.
Ability to work independently in a remote environment.
Nice-to-Have
Experience with CI/CD pipelines for data solutions.
Familiarity with Delta Lake and ML pipelines.
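The ingestion best practices mentioned above usually boil down to incremental "watermark" loading: each run pulls only rows changed since the last successful load. A minimal plain-Python sketch of the pattern, with a hypothetical source table (in Databricks or Azure Data Factory this would run against Delta tables or a source database):

```python
# Sketch of an incremental watermark load (hypothetical source rows).
# ISO-8601 timestamps compare correctly as strings, which keeps this simple.

source = [
    {"id": 1, "updated_at": "2024-05-01T10:00:00"},
    {"id": 2, "updated_at": "2024-05-02T09:30:00"},
    {"id": 3, "updated_at": "2024-05-03T08:15:00"},
]

def incremental_load(source_rows, watermark):
    """Return rows changed since `watermark`, plus the advanced watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

batch, watermark = incremental_load(source, "2024-05-01T23:59:59")
# `batch` holds only the rows newer than the stored watermark.
```

Persisting the returned watermark after each successful run is what makes the load idempotent and restartable, which is the reliability concern the monitoring responsibility above addresses.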
Start Date: ASAP
Contract Type: Outside IR35

Apply Now: If you’re a skilled Data Engineer looking for a flexible, remote opportunity with a Databricks Partner, we’d love to hear from you!
Senior .Net Developer
Morris Sinclair Recruitment
London
Fully remote
Senior
£70,000 - £100,000
RECENTLY POSTED
dot-net
csharp
sql
microsoft-azure
Remote-Friendly Software Development Opportunity in Agricultural Technology

Join an award-winning agricultural analytics software company at the forefront of sustainable technology. We’re seeking exceptional Back-End Developers who can transform data into impactful solutions that enhance food production, farm economics, and environmental resilience.

Position Highlights:
Remote Work
London based head office
Cutting-Edge Agricultural Technology Platform
Opportunity to Drive Sustainability through Innovative Software Solutions
Candidate Assessment Process:
LeetCode-Style Coding Challenge
Designed to evaluate algorithmic problem-solving skills
Focus on efficient, optimised solutions
Challenges will test:
Data Structures
Algorithm Design
Computational Thinking
Performance Optimisation
Ideal Candidate Profile: Academic and Professional Requirements:
Minimum Qualification: Degree/Masters/PhD in Computing, Data Science, or Equivalent
Proven Commercial Experience as a Back-End Microsoft Developer
Strong Academic Background with Demonstrable Achievements
Proven ability to excel in time-constrained, algorithmic problem-solving
Technical Expertise:
Advanced Proficiency in:
C# and .NET Framework
Microsoft Azure Cloud Platform
SQL and SQL Server
Complex Data-Driven Software Development
Strong Algorithm and Data Structure Knowledge
Preferred Background:
Experience in FinTech or Similar Structured Data Environments
Demonstrated Ability to Scale Applications for Large User Base
Previous success in technical coding assessments
Personal Attributes:
Exceptional Attention to Detail
Strong Collaborative and Communication Skills
Creative Problem-Solving Approach
Quick Analytical Thinking
Ability to perform under time pressure
Passion for Technological Innovation in Sustainability
What We Offer:
Work with Respected Professionals
Cutting-Edge Technology Solutions
Meaningful Impact on Agricultural Sustainability
Opportunity to Develop Market-Leading Software
On top of the very competitive salary, all employees are included in the company share scheme
Principal Data Engineer
Police Scotland
Multiple locations
Remote or hybrid
Senior
£70,000
RECENTLY POSTED
python
nosql
java
scala
The postholder will utilise their exceptional technical skills and experience to provide the highest level of expert technical leadership in Data Engineering to the design, development, implementation and maintenance of integrations and data flows that connect operational systems, enabling the provision of highly capable data for analytics and business intelligence services across Police Scotland and the SPA.

The role will involve designing, developing and implementing innovative, effective and efficient solutions to complex data integration problems, through the novel application of relevant technologies and techniques, in order to deliver secure, responsive, legislatively compliant data services across the organisation. You will provide exceptional technical leadership to the data engineering team in relation to the control and management of all data systems and services to ensure best performance and alignment with business requirements. This will include, but not be limited to, enterprise data archiving, data platforming, data ETL, data analytics and reporting technologies, and associated infrastructure involved in the provision of all data services across the estate.

Currently Police Scotland have guidance in place that allows appropriate roles to be operated on an agile basis. This is a permanent post which requires MV clearance. You will work 35 hours per week, Monday to Friday, 9am to 5pm.

Educational/Occupational (Essential)
A degree in Data Science, Software Engineering, Computer Science, Mathematics, or other STEM-based subject, accompanied by experience in a data engineering or computer science environment; OR, where no relevant formal qualifications exist, a considerable track record of success within a data engineering or computer science environment.

Personal Qualities (Essential)
Experience participating in data science projects involving cross-functional teams and senior stakeholders.
Experience in leading data science capability building across teams and wider organisations through the construction of robust, scalable integration processes. Ability to communicate technical concepts effectively to non-technical stakeholders. Experience in re-engineering manual data flows. Creativity to solve complex problems. Ability to work under pressure and to demanding deadlines. Excellent communication skills, with the ability to communicate, interact and influence at all levels of the organisation. Excellent attention to detail.

Personal Qualities (Desirable)
Experience of Continuous Integration/Continuous Delivery and Agile frameworks. Experience creating and delivering complex technical messages for a variety of readers. Proven ability to manage end-to-end enterprise data technologies.

Special Aptitudes (Essential)
Expert knowledge of the development and administration of relational and NoSQL database systems, and of Data Warehousing and Business Intelligence concepts and best practices. Expert in architecting, designing, developing and testing ETL solutions. Good working knowledge of data governance principles, data privacy regulations (e.g. GDPR), and data security best practices.

Special Aptitudes (Desirable)
Experience with geospatial data preparation and related tools. Knowledge and experience of programming languages such as Python, Java, or Scala. Good knowledge of machine learning principles and their application in data engineering.

The Digital Division has more than 350 staff across 14 locations, supporting the technological provision, development and transformation of digital services to in excess of 22,000 police officers and staff across the organisation. We continue to introduce new technologies and systems to support continuous improvement as a catalyst for new ways of working, creating new options for business functions to improve efficiencies.
The division has delivered more than 10,000 mobile devices to our officers, implemented body-worn video for our armed officers, supported the provision of virtual courts, and is progressing through the implementation of a single crime reporting system. We continue to deliver innovative and enabling technology through the development and implementation of numerous projects which will transform our services for a digital future. We will completely transform our communication platform across the organisation and how the public interact with our contact centres. We will deliver an end-to-end service across the criminal justice sector which will collect, manage, and share digital evidence throughout the criminal justice process. We will introduce new technologies and systems to ensure that data is at the heart of everything we do and is captured, managed, protected and accessible, to the benefit of Police Scotland and its partners.
More than £1.4 million worth of training has been allocated to the Digital Division in the last few years to ensure our people can continue to develop their skills in line with the future of the division. The division offers a range of training options to staff, including access to LinkedIn Learning licences, internal training, and funded classroom training via our contracted training provider.
Why join us?
Competitive salary with annual increments
Full-time or part-time shift patterns
28 days annual leave and 6 public holidays
Local government pension scheme for long-term security
Ongoing training to develop your skills
Opportunities for career progression and professional growth
Comprehensive wellbeing support and dynamic work environment
Exclusive discounts and savings through our rewards and benefits network
Full details regarding this vacancy can be found in the attached Role Profile.
Applicants must be British citizens, citizens of an EU member state or another EEA state, Commonwealth citizens, or foreign nationals free of restrictions.
Sr. Data Engineer
Cognizant
Multiple locations
Remote or hybrid
Senior
Private salary
RECENTLY POSTED
aws
terraform
github
grafana
kafka
python
+4
We are hiring a Senior Data Engineer to lead the development of intelligent, scalable data platforms for Industry 4.0 initiatives. This role will drive integration across OT/IT systems, enable real-time analytics, and ensure robust data governance and quality frameworks. The engineer will collaborate with cross-functional teams to support AI/ML, GenAI, and IIoT use cases in manufacturing and industrial environments.
Key Responsibilities
Architect and implement cloud-native data pipelines on AWS or Azure for ingesting, transforming, and storing industrial data.
Integrate data from OT systems (SCADA, PLC, MES, Historian) and IT systems (ERP, CRM, LIMS) using protocols such as OPC UA, MQTT, and REST.
Design and manage data lakes, warehouses, and streaming platforms for predictive analytics, digital twins, and operational intelligence.
Define and maintain asset hierarchies, semantic models, and metadata frameworks for contextualized industrial data.
Implement CI/CD pipelines for data workflows and ensure lineage, observability, and compliance across environments.
Collaborate with AI/ML teams to support model training, deployment, and monitoring using MLOps frameworks.
Establish and enforce data governance policies, stewardship models, and metadata management practices.
Monitor and improve data quality using rule-based profiling, anomaly detection, and GenAI-powered automation.
Support GenAI initiatives through data readiness, synthetic data generation, and prompt engineering.
Mandatory Skills
Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse).
Programming & Scripting: Proficiency in Python, SQL, PySpark, etc.
ETL/ELT & Streaming: Expertise in technologies such as Apache Airflow, Glue, Kafka, Informatica, and EventBridge.
Industrial Data Integration: Familiarity with OT data schemas originating from OSIsoft PI, SCADA, MES, and Historian systems.
Information Modelling: Experience defining semantic layers, asset hierarchies, and contextual models.
Data Governance: Hands-on experience.
Data Quality: Ability to implement profiling, cleansing, standardisation, and anomaly detection frameworks.
Security & Compliance: Knowledge of data privacy, access control, and secure data exchange protocols.
MLOps: Defining and creating MLOps pipelines.
Good to Have Skills
GenAI Exposure: Experience with LLMs, LangChain, Hugging Face, synthetic data generation, and prompt engineering.
Digital Twin Integration: Familiarity with NVIDIA Omniverse, AWS IoT TwinMaker, Azure Digital Twins, or similar platforms and concepts.
Visualization Tools: Power BI, Grafana, or custom dashboards for operational insights.
DevOps & Automation: CI/CD tools (Jenkins, GitHub Actions) and infrastructure-as-code (Terraform, CloudFormation).
Industry Standards: ISA-95, Unified Namespace (UNS), FAIR data principles, and DataOps methodologies.
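The rule-based data-quality profiling this role calls for can be sketched in plain Python. Everything below is illustrative: the field names, rule names, and thresholds are assumptions, not part of the job specification.

```python
# Minimal rule-based data-quality profiler: each rule is a (name, predicate)
# pair applied to every record; the profile reports failures per rule.

def profile(records, rules):
    """Return {rule_name: number_of_failing_records}."""
    failures = {name: 0 for name, _ in rules}
    for rec in records:
        for name, predicate in rules:
            if not predicate(rec):
                failures[name] += 1
    return failures

# Example: sensor readings from a hypothetical OT historian feed.
readings = [
    {"sensor_id": "P-101", "temp_c": 72.5},
    {"sensor_id": "P-102", "temp_c": -999.0},   # sentinel value, out of range
    {"sensor_id": None,    "temp_c": 68.1},     # missing identifier
]

rules = [
    ("sensor_id_present", lambda r: r["sensor_id"] is not None),
    ("temp_in_range",     lambda r: -50.0 <= r["temp_c"] <= 150.0),
]

print(profile(readings, rules))  # {'sensor_id_present': 1, 'temp_in_range': 1}
```

In practice the same rule-per-record pattern scales up inside a Glue or PySpark job; keeping rules as named predicates makes them easy to version and report on.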
Junior DataStage Developer
Experis IT
London
Remote or hybrid
Junior
£205/day
RECENTLY POSTED
jira
xslt
xml
Role Title: Junior DataStage Developer
Location: Remote/London (occasional travel required)
Clearance: BPSS required to start; must be eligible to undergo SC clearance
Duration: 7 months
Start Date: ASAP
The ideal candidate will have active SC clearance or be eligible to undergo SC clearance. We are actively looking to secure two Junior DataStage Developers to join Experis. Experis Consultancy is a global entity with a well-established team of over 1,000 consultants on assignment across 20 clients internationally. Our UK operation is expanding rapidly, with ambitious growth plans for the coming years. We form part of ManpowerGroup, collectively generating over $20 billion annually. Experis UK partners with major clients across multiple industries. Our approach is personal and collaborative for both our clients and our own employees. We are passionate about training, technology, and career development.
Job Purpose/The Role: You will support a major data migration programme within a large-scale transformation project. Working closely with cross-functional teams, you will contribute to DataStage development, data transformation tasks, and migration workflows using established methodologies.
Your Key Responsibilities:
Develop, enhance, and maintain DataStage Designer jobs.
Support data transformation activities using XML and XSLT.
Work within structured data migration frameworks such as MOSAIC.
Use Jira for task tracking, sprint planning, and issue management.
Collaborate with technical and non-technical stakeholders to ensure smooth delivery.
Contribute to continuous improvement of processes and migration approaches.
Support project activities related to the specific programme.
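The XML/XSLT transformation work above boils down to mapping one XML shape onto another. As a rough illustration only (Python's standard library has no XSLT engine, and the element names here are invented), the same kind of field mapping can be sketched with `xml.etree.ElementTree`:

```python
import xml.etree.ElementTree as ET

# Map a hypothetical source <customer> record onto a target <party> shape,
# the sort of mapping a DataStage job or XSLT stylesheet would perform.
SOURCE = (
    '<customers>'
    '<customer id="42"><forename>Ada</forename><surname>Lovelace</surname></customer>'
    '</customers>'
)

def transform(source_xml):
    root = ET.fromstring(source_xml)
    target = ET.Element("parties")
    for cust in root.findall("customer"):
        party = ET.SubElement(target, "party", ref=cust.get("id"))
        # Concatenate two source fields into the target's single name element.
        name = ET.SubElement(party, "fullName")
        name.text = f'{cust.findtext("forename")} {cust.findtext("surname")}'
    return ET.tostring(target, encoding="unicode")

print(transform(SOURCE))
# <parties><party ref="42"><fullName>Ada Lovelace</fullName></party></parties>
```

An XSLT stylesheet would express the same mapping declaratively with templates; the structural idea, walking source records and emitting a restructured target document, is identical.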
Your Skills:Essential:
MUST be a UK National with a minimum of 5 years’ residency.
Proven experience with DataStage Designer.
Familiarity with MOSAIC or equivalent data migration frameworks.
Strong proficiency in XML & XSLT for data transformation tasks.
Confident use of Jira for project and issue management.
Excellent communication and stakeholder engagement skills.
Proactive, solutions-focused mindset with commitment to continuous learning.
Desirable:
Experience within large-scale transformation or enterprise data environments.
Knowledge of additional data engineering tools or processes.
Benefits Include:
Contributory pension scheme
Employee Assistance Programme
Medical and Dental cover
22 days holiday + bank holidays
Maternity pay, shared parental leave, and paternity leave
Sick pay
Senior Data Engineer - Insurance - Remote
Michael Page
London
Fully remote
Senior
£80,000 - £120,000
RECENTLY POSTED
kafka
python
java
sql
hadoop
scala
Senior Data Engineer
The Senior Data Engineer will play a crucial role in designing, implementing, and maintaining scalable data pipelines and infrastructure. This position is ideal for those with strong technical expertise and a passion for working in the insurance / financial services industry.
Client Details
The employer is a medium-sized organisation operating in the financial services sector. They focus on delivering innovative solutions and maintaining a strong reputation for excellence in analytics and data-driven decision-making.
Description
Develop and maintain robust and scalable data pipelines and ETL processes.
Optimise data workflows and ensure efficient data storage solutions.
Collaborate with analytics and engineering teams to meet business objectives.
Ensure data integrity and implement best practices for data governance.
Design and implement data models to support analytical and reporting needs.
Monitor and troubleshoot data systems to ensure reliability and performance.
Evaluate and implement new tools and technologies to improve data infrastructure.
Provide technical guidance and mentorship to junior team members.
Profile
A successful Senior Data Engineer should have:
Experience within the Insurance industry
Strong proficiency in programming languages such as Python, Java, or Scala.
Experience with cloud platforms like Azure.
Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
Proficiency in SQL and database management systems.
Familiarity with data warehousing concepts and tools.
Ability to work collaboratively with cross-functional teams.
A solid understanding of data security and privacy standards.
A degree in Computer Science, Engineering, or a related field.
Job Offer
Competitive salary ranging from £80,000 to £120,000, depending on experience.
Equity options as part of the compensation package.
Comprehensive benefits package.
Opportunity to work remotely.
Be part of a collaborative and innovative team in the Insurance sector.
If you are passionate about data engineering and are excited to work in a challenging and rewarding role, we encourage you to apply today!
Senior Manager - Network Modelling & Optimisation
Robert Walters
Multiple locations
Remote or hybrid
Senior
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Senior Manager, Network Modelling and Optimisation
Salary: Competitive and based on experience
Location: Melbourne (relocation support provided)
A blue-chip logistics provider is seeking a Senior Manager, Network Modelling and Optimisation to transform one of Australia’s largest logistics networks. This role offers an exciting opportunity to shape the future of a complex logistics system by leveraging advanced mathematical modelling and optimisation techniques. As a key leader, you will drive strategic scenario planning, develop innovative solutions, and deliver actionable insights that enhance operational efficiency and cost-effectiveness. Working within a collaborative and inclusive environment, you’ll have access to cutting-edge tools, professional development opportunities, and the chance to make a tangible impact on both business outcomes and community service.
Key Responsibilities
Develop innovative network models aligned with long-term strategic objectives.
Evaluate network options using advanced optimisation tools and methodologies.
Collaborate with cross-functional teams to ensure practical, sustainable solutions.
Lead automated data processes for site-level volume analysis across multiple aggregation levels.
Construct detailed cost-benefit analyses to support regional strategies and business case development.
Analyse volume flows and cost drivers to identify opportunities for improvement.
Foster a culture focused on safety, wellbeing, customer-centricity, and continuous improvement.
About You
To excel in this role, you will bring:
Proven expertise in quantitative methods within large-scale logistics or consulting environments (PhD preferred).
Strong commercial acumen with hands-on experience in supply chain optimisation or network modelling.
Exceptional leadership skills, with experience managing teams toward shared goals while nurturing professional growth.
Advanced proficiency in mathematical modelling tools (e.g. linear programming) and expert use of MS Excel or similar platforms.
Demonstrated ability to build automated processes for large-scale volume modelling at various aggregation levels.
Experience creating detailed cost-benefit analyses for scenario evaluation, including sensitivity testing for capital planning purposes.
Relevant qualifications such as a Master’s or PhD in Data Science, Mathematics, or related fields are highly desirable.
Why Join Us?
This organisation fosters an inclusive workplace built on trust, collaboration, and innovation:
Flexible working arrangements support work-life balance.
Ongoing training ensures career growth tailored to your aspirations.
A strong focus on wellbeing prioritises mental health alongside professional success.
By joining this forward-thinking organisation, you’ll play a pivotal role in driving operational excellence while contributing to positive social impact, connecting people nationwide through innovation. Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates.
Data Engineer
BIOMETRIC TALENT
Preston
Remote or hybrid
Mid - Senior
£70,000
RECENTLY POSTED
airflow
sql
snowflake
dbt
About the Client
Our client is a long-established, service-focused business delivering intelligent, data-driven solutions that help organisations increase efficiency, reduce operational risk, and streamline complex logistical processes. With a strong reputation for reliability and innovation, they serve a diverse portfolio of national and regional clients, many of whom rely on their services as a critical part of day-to-day operations. They pride themselves on combining technology with exceptional service delivery, offering bespoke solutions tailored to the evolving needs of their customers. The organisation continues to invest in digital transformation and strategic partnerships to remain at the forefront of operational excellence. Known for a collaborative and pragmatic culture, they value long-term relationships and continuous improvement.
How you’ll spend your day
You’ll play a key role in building, maintaining, and optimising data pipelines and transformation workflows. Your focus will be on ensuring data integrity, reliability, and performance across the organisation’s cloud-based analytics environment.
Develop and maintain automated data ingestion pipelines using Fivetran.
Implement and manage dbt models for scalable data transformations.
Monitor and optimise Snowflake performance and costs.
Ensure version control and CI/CD best practices for dbt projects.
Set up orchestration and monitor pipeline health.
Troubleshoot and resolve issues to maintain smooth data operations.
Collaborate with BI Analysts and Data Stewards to deliver trusted, compliant datasets.
Support business teams with data availability and workflow optimisation.
What you’ll bring to this role
We’re looking for a technically strong Data Engineer with a proactive, problem-solving approach and solid experience in modern data tools and practices.
Proven experience with SQL and Snowflake performance tuning.
Hands-on expertise with Fivetran and dbt.
Good understanding of data modelling, governance, and security best practices.
Familiarity with orchestration tools such as Airflow or Prefect (advantageous).
Experience working in Azure (or AWS/GCP).
Strong analytical and collaboration skills, with great attention to detail.
A degree in Computer Science or Data Engineering, or relevant experience.
Perks & Benefits:
Death in service cover of 2x basic annual salary
8% pension (5% employee)
25 days holiday plus bank holidays
What happens next?
One of our Recruitment Consultants will be in touch to let you know whether you have progressed to the next stage of the process, which is a qualification call where we will tell you more about the role and the client, and learn more about you, your experience and career aspirations. Should we both wish to proceed, we will submit your details to the client and be in touch regarding the outcome and any further steps. The interview process for this client consists of:
Stage 1: Remote technical discussion
Stage 2: Remote competency and culture interview with the CTO
Equal Opportunities
We are committed to providing equal opportunities for all candidates and welcome applications from individuals regardless of age, disability, gender identity, marital status, race, religion or belief, sexual orientation, or any other characteristic protected by law. As an employment agency for permanent and contract hires, we are dedicated to promoting a diverse and inclusive workforce, and we encourage applications from underrepresented groups to drive innovation and equality within the workplace. Should you require any reasonable adjustments, please let us know so we can accommodate any interactions with us at Biometric Talent, and also inform the client so that reasonable adjustments are made to allow for a fair and equitable process.
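The "monitor pipeline health" duty in this role often reduces to freshness checks: flag any source that has not loaded within its expected window. A minimal standard-library sketch, where the table names and staleness windows are invented for illustration:

```python
from datetime import datetime, timedelta

# Freshness check: flag any table whose last successful load is older than
# its allowed staleness window.
def stale_tables(last_loaded, max_age, now):
    """Return the sorted names of tables that breached their freshness SLA."""
    return sorted(t for t, ts in last_loaded.items() if now - ts > max_age[t])

now = datetime(2024, 5, 1, 12, 0)
last_loaded = {
    "orders":    datetime(2024, 5, 1, 11, 30),  # fresh (30 min old)
    "customers": datetime(2024, 4, 30, 9, 0),   # stale (27 h old)
}
max_age = {"orders": timedelta(hours=1), "customers": timedelta(hours=24)}

print(stale_tables(last_loaded, max_age, now))  # ['customers']
```

In a Fivetran/dbt stack the same check would typically run against sync metadata or a dbt source freshness report, with alerts routed through the orchestrator.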
Lead Data Engineer
TPXImpact Holdings Plc
London
Remote or hybrid
Senior
£65,000
RECENTLY POSTED
fabric
aws
git
airflow
sql
dimensions
+1
About The Role
Job level: 10
We’re looking for a Lead Data Engineer to join our Data Engineering and Analytics practice. In this role, you will:
Lead the design, development, management and optimisation of data pipelines to ensure efficient data flows, recognising and sharing opportunities to reuse data flows where possible.
Coordinate teams and set best practices and standards when it comes to data engineering principles.
Champion data engineering across projects and clients.
Responsibilities
Lead by example, holding responsibility for team culture and for how projects deliver the most impact and value to our clients.
Be accountable for the strategic direction, delivery and growth of our work.
Lead teams, strands of work and outcomes, owning commercial responsibilities.
Hold and manage uncertainty and ambiguity on behalf of clients and our teams.
Ensure teams and projects are inclusive through how you lead and manage others.
Effectively own and hold the story of our work, ensuring we measure progress against client goals and our DT missions.
Work with our teams to influence and own how we deliver more value to clients, working within time and budget constraints.
Strategically plan the overall project and apply methods and approaches.
Demonstrably share work with wider audiences. Elevate ideas through how you write, speak and present.
Dimensions
Headcount: Typically leads a multidisciplinary team or multiple workstreams (team size 5-15).
Resource complexity: Provides leadership across multiple workstreams or technical domains within a project or programme. Responsible for delivery coordination, prioritisation, and quality, often overseeing more junior leads or specialists.
Problem-solving responsibility: Solves highly complex problems, balancing technical, user, business, and operational needs. Applies expert judgement to make decisions, manage risks, and guide teams through ambiguity.
Change management requirements: Leads or co-leads significant change initiatives.
Responsible for managing stakeholder expectations, supporting adoption, and embedding sustainable ways of working.
Internal/External interactions: Acts as a trusted partner to client and internal stakeholders at multiple levels. Leads workshops, presentations, and stakeholder engagement to ensure buy-in, alignment, and delivery clarity.
Strategic timeframe: Works across mid- to long-term delivery cycles (6-12 months), ensuring that near-term work supports broader programme and client objectives.
About You
Professional knowledge and experience
Essential
Proven experience in data engineering, data integration and data modelling
Expertise with cloud platforms (e.g. AWS, Azure, GCP)
Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks)
Expertise with multiple data analytics tools (e.g. Power BI)
Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling
Proficiency in advanced programming languages (Python/PySpark, SQL)
Experience in data pipeline orchestration (e.g. Airflow, Data Factory)
Familiarity with DevOps and CI/CD practices (Git, Azure DevOps, etc.)
Ability to communicate technical concepts to both technical and non-technical audiences
Proven experience in delivery of complex projects in a fast-paced environment with tight deadlines
Desirable
Advanced knowledge of data governance, data standards and best practices.
Experience in a consultancy environment, demonstrating flexibility and adaptability to client needs.
Experience defining and enforcing data engineering standards, patterns, and reusable frameworks.
Professional certifications in relevant technologies (e.g.
Microsoft Azure Data Engineer, AWS Data Analytics, Databricks Certified Professional Data Engineer)
Skills
Data Development Process
Design, build and test data products that are complex or large scale.
Build and lead teams to deliver data integration services and reusable pipelines that meet performance, quality and scalability standards.
Collaborate with architects to align solutions with enterprise data strategy and target architectures.
Data Engineering and Manipulation
Work with data analysts, engineers, and data science and AI specialists to design and deliver products into the organisation effectively.
Understand the reasons for cleansing and preparing data before including it in data products, and put reusable processes and checks in place.
Access and use a range of architectures (including cloud and on-premise) and data manipulation and transformation tools deployed within the organisation.
Optimise data pipelines and queries for performance and cost efficiency in distributed environments.
Testing (Data)
Review requirements and specifications, and define system integration testing conditions for complex data products, supporting others to do the same.
Identify and manage issues and risks associated with complex data products, supporting others to do the same.
Analyse and report system test activities and results for complex data products, supporting others to do the same.
Other Skills
Proficiency in developing and maintaining complex data models (conceptual, logical and physical).
Strong skills in data governance and metadata management.
Experience with data integration design and implementation.
Ability to write efficient, maintainable code for large-scale data systems.
Experience with CI/CD pipelines, version control, and infrastructure-as-code (e.g. Git, Azure DevOps).
Strong stakeholder communication skills, with the ability to translate technical concepts into business terms.
Ability to mentor junior engineers, foster collaboration, and build a high-performing data engineering culture.
Behaviours and PACT values
Purpose: Be values-driven, recognising that our client’s needs are paramount. Approach client engagements with professionalism and creativity, balancing commercial and operational needs.
Accountability: Be accountable for delivering your part of a project on time and under budget, and for working well with other leaders. Lead by example, promoting a culture where quality and client experience are foremost.
Craft: Balance multiple priorities while leading high-performing teams. Navigate ambiguity and set the technical direction and approach to support positive outcomes.
Togetherness: Collaborate effectively with others across TPXimpact. Build strong relationships with colleagues and clients.
About Us
People-Powered Transformation
We’re a purpose-driven organisation, supporting organisations to build a better future for people, places and the planet. Combining vast experience in the public, private and third sectors with expertise in human-centred design, data, experience and technology, we’re creating sustainable solutions ready for an ever-evolving world. At the heart of TPXimpact, we’re collaborative and empathetic. We’re a team of passionate people who care deeply about the work we do and the impact we have in the world. We know that change happens through people, with people and for people. That’s why we believe in people-powered transformation. Working in close collaboration with our clients, we seek to understand their unique challenges, questioning assumptions and building in their teams the capabilities and confidence to continue learning, iterating and adapting.
Benefits Include:
30 days holiday + bank holidays
2 volunteer days for causes that you are passionate about
Maternity/paternity: 6 months maternity leave, 3 months paternity leave
Life assurance
Employer pension contribution of 5%
Health cash plan
Personal learning and development budget
Employee Assistance Programme
Access to equity in the business through a Share Incentive Plan
Green incentive programmes, including Electric Vehicle Leasing and the Cycle to Work Scheme
Financial advice
Health assessments
About TPXimpact - Digital Transformation
We drive fundamental change in approaches to product and service development, delivery and technology. Our agile, multidisciplinary teams use technology, design and data to deliver better results, improving outcomes for individuals, organisations and communities. By working in the open, in partnership with our clients, we not only transform their systems and services but also build the capability of their teams, so work can continue without us in the longer term. Our focus is sustainable change, always delivered with positive impact. We’re an inclusive employer, and we care about diversity in our teams. Let us know in your application if you have accessibility requirements during the interview.
Data Engineer
Proactive Appointments
Berkshire
Remote or hybrid
Junior - Mid
£40,000 - £60,000
fabric
sql
Our client, based in Reading, is looking to recruit two experienced Data Engineers to join ASAP. The positions will be fully remote, with occasional visits to their offices in Reading. To be considered for the role you must have the following essential skills and experience:
Key Skills & Experience
Whilst not a pre-requisite, knowledge and experience of the financial services industry would be advantageous.
Previous experience as a data engineer or in a similar role with a minimum of 2 years’ experience.
Familiarity with data models, data mining, and segmentation techniques.
Knowledge of SQL and Azure technologies.
Strong analytical skills to interpret trends and patterns in data.
Proficiency in database languages.
Attention to detail and excellent organisational skills.
Technical Skills required
Familiarity with building and deploying ETL processes through Azure Data Factory.
Basic understanding of Azure Fabric and appropriate implementation.
Familiarity with the management and use cases of Azure Data Lake Storage.
Familiarity with the development of analytics in Azure Synapse.
Familiarity with Azure database service with strong knowledge of the SQL language.
An understanding of data modelling concepts.
An understanding of data warehousing concepts.
Implementing strong security around data sets.
Benefits
We offer an attractive reward package; typical benefits can include:
Competitive salary
Participation in Discretionary Bonus Scheme
A set of core benefits including Pension Plan, Life Assurance cover and employee assistance programme, 25 days holiday and access to a qualified, practising GP 24 hours a day/365 days a year
Flexible Benefits Scheme to support you in and out of work, helping you look after you and your family covering Security & Protection, Health & Wellbeing, Lifestyle
Due to the volume of applications received for positions, it will not be possible to respond to all applications, and only applicants who are considered suitable for interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website.
Data Engineer Databricks
Hays Specialist Recruitment Limited
London
Remote or hybrid
Mid - Senior
£400/day - £450/day
unity-3d
delta-lake
vault
pyspark
About the Role
We are seeking a Databricks Data Engineer with strong expertise in designing and optimising large-scale data engineering solutions within the Databricks Data Intelligence Platform. This role is ideal for someone passionate about building high-performance data pipelines and ensuring robust data governance across modern cloud environments.
Key Responsibilities
Design, build, and maintain scalable data pipelines using Databricks Notebooks, Jobs, and Workflows for both batch and streaming data.
Optimise Spark and Delta Lake performance through efficient cluster configuration, adaptive query execution, and caching strategies.
Conduct performance testing and cluster tuning to ensure cost-efficient, high-performing workloads.
Implement data quality, lineage tracking, and access control policies aligned with Databricks Unity Catalog and governance best practices.
Develop PySpark applications for ETL, data transformation, and analytics, following modular and reusable design principles.
Create and manage Delta Lake tables with ACID compliance, schema evolution, and time travel for versioned data management.
Integrate Databricks solutions with Azure services such as Azure Data Lake Storage, Key Vault, and Azure Functions.
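The "modular and reusable design principles" called out above usually mean composing small, independently testable transform functions into a pipeline. A plain-Python sketch of the pattern (kept PySpark-free so it is self-contained; the step names and fields are illustrative):

```python
from functools import reduce

# Compose small, reusable transforms into one pipeline. In a Databricks job,
# each step would take and return a Spark DataFrame instead of a list of dicts.
def pipeline(*steps):
    return lambda rows: reduce(lambda acc, step: step(acc), steps, rows)

def drop_nulls(rows):
    """Discard records with any missing field."""
    return [r for r in rows if all(v is not None for v in r.values())]

def normalise_currency(rows):
    """Illustrative step: convert amounts from pence to pounds."""
    return [{**r, "amount": r["amount"] / 100} for r in rows]

etl = pipeline(drop_nulls, normalise_currency)

raw = [{"id": 1, "amount": 1250}, {"id": 2, "amount": None}]
print(etl(raw))  # [{'id': 1, 'amount': 12.5}]
```

Each step can be unit-tested on its own and reused across jobs, which is the practical payoff of the modular design the posting describes.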
What We’re Looking For
Proven experience with Databricks, PySpark, and Delta Lake.
Strong understanding of workflow orchestration, performance optimisation, and data governance.
Hands-on experience with Azure cloud services.
Ability to work in a fast-paced environment and deliver high-quality solutions.
SC Cleared candidates
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now. If this job isn’t quite right for you but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk.
Data Engineer
Buzz Bingo
Multiple locations
Remote or hybrid
Mid - Senior
£50,000 - £60,000
fabric
mysql
git
python
postgresql
mssql
+6
Job Title: Data Engineer
Contract Type: Permanent / Full-Time
About the Role
Are you passionate about technology and eager to make a real impact? At Buzz Bingo, we’re looking for a Data Engineer who thrives on innovation and enjoys working across a diverse technology stack. The systems you’ll support underpin both our in-club and online customer experiences, giving you the opportunity to influence how thousands of people interact with Buzz Bingo every day.
What You’ll Do
Data Pipeline Development: Design, implement, and maintain robust ETL/ELT pipelines for ingesting and transforming data from multiple sources.
Data Modelling: Create and maintain models that support analytics and reporting needs, ensuring data integrity and consistency.
Database Management: Administer and optimize relational databases for efficient storage and retrieval of large datasets.
Collaboration: Work closely with software engineers, analysts, and business teams to deliver secure, reusable, and efficient data solutions.
Data Quality Assurance: Implement checks and monitoring processes to ensure accuracy and reliability.
Documentation: Maintain detailed technical documentation for data architectures, pipeline designs, and operational procedures.
Performance Tuning: Analyse and optimize workflows for performance and cost efficiency.
Innovation: Stay current with emerging technologies and best practices to continuously improve our data engineering capabilities.
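The pipeline, quality-check, and database responsibilities above can be sketched, very roughly, using only Python's standard library. All table, column, and function names here are invented for illustration and are not from the listing:

```python
# Minimal ETL sketch: extract rows from CSV text, transform them,
# run a simple data-quality check, and load into an in-memory SQLite database.
import csv
import io
import sqlite3

RAW_CSV = "id,amount\n1,10.50\n2,3.25\n3,7.00\n"

def extract(text):
    """Parse CSV text into a list of dicts (one per row)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast string fields to proper types."""
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows]

def quality_check(rows):
    """A basic monitoring rule: reject negative amounts before loading."""
    assert all(r["amount"] >= 0 for r in rows), "negative amount found"

def load(rows, conn):
    """Write the transformed rows to a relational table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (:id, :amount)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
rows = transform(extract(RAW_CSV))
quality_check(rows)
load(rows, conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In a production role the same extract/transform/check/load shape would typically be orchestrated by a tool such as Azure Data Factory rather than run as a single script.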
What You’ll Get in Return
24/7 access to GPs, mental health support, and more for you and your family
Thrive App: NHS-approved mental wellbeing support
Buzz Brights Apprenticeships & Buzz Learning: access to 100s of online courses
Buzz Brilliance Awards: employee recognition scheme
5 weeks annual leave plus public holidays (pro-rated for part-time roles)
Holiday Buy Scheme: purchase an extra week of holiday (eligibility applies)
50% staff discount on bingo tickets, food, and soft drinks
Refer a Friend Scheme
Life Assurance & Pension Scheme
Access to trained Mental Health Advocates
What We’re Looking For
Essential Skills & Experience:
Proven experience as a Data Engineer or similar role, with strong knowledge of data warehousing and modelling.
Proficiency in C#, Python, Java, or Scala.
Hands-on experience with ETL tools (e.g., SSIS) and orchestration tools (e.g., Azure Data Factory).
Strong SQL skills and experience with relational databases (MSSQL, PostgreSQL, MySQL).
Familiarity with Azure services (Fabric, Azure SQL, Synapse Analytics, Blob Storage) and hybrid cloud/on-prem solutions.
Understanding of data security best practices, GDPR compliance, and governance frameworks.
Strong experience with data visualization tools (Power BI, Tableau, SSRS).
Knowledge of CI/CD pipelines and version control (Git).
Experience with SSAS cubes, Azure-based data pipelines, and containerization technologies.
Must have a full UK Driving Licence and access to your own vehicle.
Desirable:
Familiarity with DAX Studio for performance tuning and query diagnostics.
Strong proficiency in DAX (Data Analysis Expressions) for creating complex measures, calculated columns, and tables.
Background in retail, hospitality, or gaming/gambling sectors.
Why Join Buzz Bingo?
Work on impactful projects that shape customer experiences across the UK.
A collaborative, supportive environment where innovation is encouraged.
Opportunities to learn, grow, and work with cutting-edge technologies.
Ready to make an impact? Apply now and help us build secure, scalable, and innovative data solutions for Buzz Bingo!

Frequently asked questions

What types of remote Data Engineer jobs are available on Haystack?
Haystack features a wide range of remote Data Engineer positions, including roles focused on data pipeline development, ETL processes, data warehousing, cloud data engineering, and real-time data processing across various industries.
How can I apply for remote Data Engineer jobs through Haystack?
You can browse remote Data Engineer job listings on Haystack, create a profile, upload your resume, and apply directly through the platform to employers offering remote positions.
Are remote Data Engineer roles on Haystack full-time, part-time, or freelance?
Haystack offers a variety of remote Data Engineer roles including full-time, part-time, contract, and freelance opportunities. You can filter job listings by employment type to find the best fit.
What skills are commonly required for remote Data Engineer positions listed on Haystack?
Common skills for remote Data Engineer jobs include proficiency in SQL, Python or Java, experience with cloud platforms like AWS or Azure, knowledge of big data tools such as Hadoop or Spark, and expertise in data modeling and ETL frameworks.
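As a small taste of the SQL-plus-Python proficiency these roles ask for, the snippet below ranks each user's busiest day with a window function, using only Python's built-in sqlite3 module (SQLite 3.25+). The table and data are invented for illustration:

```python
# Demonstrates SQL window functions driven from Python: for each user,
# rank their days by click count and pick out the busiest one.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user TEXT, day TEXT, clicks INTEGER);
INSERT INTO events VALUES
  ('ana', '2024-01-01', 5),
  ('ana', '2024-01-02', 9),
  ('bob', '2024-01-01', 3);
""")

ranked = conn.execute("""
SELECT user, day, clicks,
       RANK() OVER (PARTITION BY user ORDER BY clicks DESC) AS rnk
FROM events
""").fetchall()

# Keep only each user's top-ranked day.
busiest = [row for row in ranked if row[3] == 1]
```

Interviewers for data engineering roles commonly probe exactly this kind of query: partitioning, ordering, and ranking within groups.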
Does Haystack provide resources to help me prepare for remote Data Engineer interviews?
Yes, Haystack offers interview tips, sample questions, and career advice specifically geared towards Data Engineers seeking remote positions to help you prepare effectively.