Apache Airflow Jobs
Trending Apache Airflow jobs
HVAC Sales Engineer
WR HVACR
Southend-on-Sea
In office
Junior - Mid
£45,000
RECENTLY POSTED

Suitable for: Applications Engineer, Internal Sales Engineer, HVAC Estimator, Proposals Engineer, Technical Sales Engineer, Sales Estimator, Technical Estimator

Overview & Role

An established HVAC manufacturer specialising in air diffusion products is seeking a Technical Sales Engineer to support specification-led projects through accurate product selection, technical quotation, and coordination with consultants and contractors.

This role supports a high volume of technical enquiries, interpreting drawings and specifications to produce accurate, compliant quotations and product selections based on airflow, pressure drop and acoustic performance. You will act as a key link between consultants, contractors and internal teams, responding to technical queries, supporting specification retention and contributing to value engineering where required.

Requirements

  • Experience within HVAC or building services
  • Background in estimating, internal sales, or applications
  • Understanding of airflow principles and air distribution products (grilles, diffusers, louvres)
  • Ability to interpret drawings, specifications and schedules
  • Experience producing technical quotations
  • Strong attention to detail and ability to manage multiple enquiries
  • Commercial awareness with the ability to prioritise workload

Desirable:

  • Direct experience with air diffusion products
  • CAD (2D/3D) capability
  • Exposure to specification-led projects
  • Experience supporting CPDs (e.g. CIBSE)

Package

  • Salary circa £45,000
  • Christmas bonus
  • Full-time, office-based role
  • Monday to Friday, 08:30 - 17:00
  • Up to 25 days of annual leave

Interested in hearing more? Call Max Robinson on 02394 004 703.

WR HVAC | M&E are the #1 recruitment partner for HVAC jobs. Our customers include a range of large and small M&E companies, manufacturers and suppliers of heating, ventilation, air conditioning and refrigeration equipment. We recruit UK-wide for sales, management and technical jobs.

WR is acting as an Employment Agency in relation to this vacancy.

LEV Engineer (P601 Certified)
Ernest Gordon Recruitment
Taunton
Hybrid
Mid - Senior
£45,000
RECENTLY POSTED

LEV Engineer (P601 Certified) £42,000 - £46,000 + Progress to Supervisor + Monday to Friday + Company Car + 1.5x Overtime + Hybrid Working
Taunton, Somerset

Are you an LEV Engineer or similar with your LEVP601, looking to join an industry leading manufacturer of LEV and filtration equipment who offer continued progression into senior management roles?

On offer is the opportunity to work for an industry leader in the manufacture and distribution of bespoke filter products, both in the UK and beyond. They also offer ongoing servicing and maintenance plans to ensure quality and reliability for their customers. Established for over 50 years, they are now looking to expand their service team due to an increase in workload.

In this role, you will be provided with the support needed to further develop your skillset and move into a supervisory/management position. During this transition you will work a varied role split between hands-on work in the field and working from home. Day to day you will follow up reports to survey remedials and defects, assist with quotations, advise on testing reports, and be on the tools commissioning, testing or servicing.

This role would suit an LEV Engineer, or someone with a background in Local Exhaust Ventilation and the LEV P601, looking to progress their career with an industry-leading company.

The Role

  • Receive guidance and training to progress
  • Hybrid working / In the field
  • Assist with reports and follow servicing and testing
  • Support junior technicians

The Person

  • LEV Engineer or similar
  • Looking to step-up into management
  • UK driving licence

Reference Number: BBBH24441b

LEV, Local Exhaust Ventilation, Service, Fume Extraction, Maintenance, Testing, Commissioning, Ventilation, Duct, Airflow, HVAC, Air Conditioning, Bridgwater, Taunton, Somerset

If you are interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV.

We are an equal opportunities employer and welcome applications from all suitable candidates. The salary advertised is a guideline for this position. The remuneration offered will depend on the extent of your experience, qualifications and skill set.

Ernest Gordon Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job, you accept the T&Cs, Privacy Policy and Disclaimers, which can be found on our website.

Principal Data Engineer
VIQU IT Recruitment
London
Fully remote
Senior
£80,000 - £95,000
RECENTLY POSTED

Principal Data Engineer
Location: Remote
Salary: Up to £95,000
Sponsorship: Not available

VIQU have partnered with a nationally recognised organisation that has recently completed a full migration from on-premise infrastructure to Google Cloud Platform (GCP) and is now building a modern, domain-oriented Data Mesh. This is a critical leadership hire as they continue to scale their cloud data platform and embed engineering excellence across multiple product domains.

This is not a step-up role. They will only consider candidates with proven Principal or Lead-level experience, including clear ownership of engineers, standards, and cross-domain architectural direction.

The Role

As Principal Data Engineer, you will be the senior technical authority across data product squads, driving strategy, architecture, and engineering best practice.

Key responsibilities include:

  • Providing visible, hands-on technical leadership across multiple domain teams
  • Defining how projects are approached and ensuring delivery standards are upheld
  • Designing and governing scalable pipelines using BigQuery, Dataflow, Datastream, Data Fusion, Dataproc and Cloud Composer (Airflow)
  • Leading best practice across dbt, Terraform, Docker and CI/CD
  • Driving Data Mesh adoption, domain ownership, and interoperable data products
  • Setting modelling standards across raw, curated and semantic layers
  • Embedding governance covering lineage, MDM, data quality, PII and compliance
  • Mentoring and challenging senior engineers to maintain high standards
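
The orchestration point above is worth unpacking: Cloud Composer is managed Apache Airflow, whose core scheduling model is a DAG of tasks executed in dependency order. A toy standard-library sketch of that idea (task names are purely illustrative, not from the role):

```python
# The scheduling idea behind an orchestrator such as Airflow: tasks form a
# directed acyclic graph and run only after their upstream dependencies.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on (hypothetical names).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks with all predecessors first.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

In a real Composer deployment the same structure would be declared as an Airflow DAG object, with the scheduler resolving the ordering and retries.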

You will work closely with Architects and Platform Engineering to ensure performance, cost control, security and scalability across the GCP estate.

What They Need

  • 10+ years in Data Engineering
  • Clear, demonstrable experience leading engineers at Lead/Principal level
  • Deep, hands-on GCP experience (this is essential)
  • Strong SQL and Python expertise
  • Experience building cloud-native data platforms end-to-end
  • Strong background in Data Lake / Warehouse architecture
  • Proven experience implementing CI/CD and infrastructure as code
  • Confidence to challenge stakeholders and influence technical direction

Candidates without strong, proven leadership experience in enterprise-scale environments will not be considered.

Why Join?

You’ll be joining at a pivotal stage in the evolution of a 100% cloud-native platform. The foundations are in place, but the Data Mesh journey is still being shaped. This is a genuine opportunity to influence architecture, standards, and the long-term engineering strategy — fully remote, with executive backing and real scope for impact.

Apply now to speak with VIQU IT in confidence, or contact Aaron Chiverton via the VIQU website. Know someone great? Refer them and receive up to £1,000 if successful (terms apply). For more exciting roles and opportunities, follow us on LinkedIn @VIQU IT Recruitment.

Contract Data Engineer & Analyst - £400/pd
Tenth Revolution Group
Newcastle upon Tyne
Hybrid
Mid - Senior
£400 per day
RECENTLY POSTED

Please note - this role will require you to be based in the North East and able to attend the Newcastle office 2-3 days per week. This organisation is not able to offer sponsorship.

We are seeking an experienced Data Engineer / Data Analyst contractor to provide hybrid support across data engineering and analysis during a team refresh. This engagement is initially for 3 months, with a strong focus on maintaining business-as-usual reporting and delivering a key migration from Excel to SQL / Power BI.

This role requires a self-sufficient, trusted contractor who can quickly understand an existing data estate, work confidently with limited documentation, and make sound recommendations when improving or stabilising solutions.

Key responsibilities include:

  • Maintain and support existing data pipelines to ensure BAU continuity
  • Convert a critical Excel-based report/dashboard into a robust SQL and Power BI solution
  • Deliver ad-hoc, priority reporting requests as needed
  • Work independently to analyse data structures and logic where documentation is limited
  • Provide a clear handover to the incoming Data & Insights Manager at the end of the contract

Required experience:

  • Proven experience across data engineering and analytical reporting in live environments
  • Strong capability working autonomously with minimal oversight or documentation
  • Confidence in making recommendations and proposing practical improvements
  • Experience supporting BAU reporting alongside project work
  • Strong communication skills, particularly around handover and knowledge transfer

Technology in use:

  • Azure SQL Databases
  • Azure Ubuntu Virtual Machines
  • Apache Airflow
  • Power BI
  • Python
  • REST & SOAP APIs

To apply for this role, please submit your CV or contact David Airey.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

Data Engineer
Noir
Newcastle upon Tyne
In office
Mid - Senior
£45,000 - £80,000
RECENTLY POSTED

Data Engineer - FinTech - Newcastle

(Tech stack: Data Engineer, SQL, Python, AWS, Git, Airflow, Data Pipelines, Data Platforms, Programmer, Developer, Architect, Data Engineer)

Our client is a trailblazer in the FinTech space, known for delivering innovative technology solutions to global financial markets. They are expanding their engineering capability in Newcastle and are looking for a talented Data Engineer to join their team. This role will focus on building and optimising systems that make complex datasets accessible, reliable, and valuable for the business.

As a Data Engineer, you will take responsibility for the development of high-quality pipelines that process and manage large volumes of data from a range of external and internal sources. You’ll play a key role in enhancing and maintaining their central data platform, ensuring the smooth delivery of information that supports investment decision-making. Working closely with stakeholders across the business, you’ll help shape how data is accessed, tested, and leveraged to maximise value.

The successful candidate will bring:

  • 3-6 years of relevant experience working as a Data Engineer (or in a closely related role).
  • A 2:1 or above in Computer Science (or related field), ideally from a Russell Group university.
  • Direct experience in the hedge fund sector (essential).
  • Strong ability to design and build data pipelines that integrate multiple data sources.
  • Proficiency in SQL and Python, with solid exposure to AWS or other cloud-based data tools.
  • Familiarity with version control systems such as Git and workflow/orchestration tools such as Airflow.
  • Proven ability to test and troubleshoot data systems, with a track record of improving reliability and accuracy.
  • Excellent communication skills, with the ability to collaborate effectively in a team environment.
  • A detail-oriented, proactive mindset, with a willingness to learn and apply new technologies.

This is an exciting opportunity to join a forward-thinking organisation where data is at the core of their success. You’ll be part of a collaborative environment where your work will directly support world-class FinTech solutions.

Location: Newcastle, UK (Fully Office Based)

Salary: £45,000 - £80,000 + Bonus + Benefits

Applicants must be based in the UK and have the right to work in the UK.

Noir continues to be the leading Microsoft recruitment agency; we can help you make the right career decisions!

NOIRUKTECHREC

NOIRUKREC

Senior Data Engineer (AWS, Airflow, DBT)
Harnham - Data & Analytics Recruitment
Multiple locations
Fully remote
Senior
£80,000 - £100,000
RECENTLY POSTED

Senior Data Engineer

Up to £100,000 + Benefits
Remote - UK

This is a great opportunity to join a high-growth organisation where you can take ownership of end-to-end data engineering projects and play a key role in shaping a modern, scalable data platform.

THE COMPANY:

This is a next-generation sports betting and gaming platform built for a new wave of players, combining sharp product thinking, bold branding and fast execution.

THE ROLE:

You will take ownership of the full data engineering lifecycle. Key responsibilities include:

  • Owning end-to-end data engineering projects across the platform
  • Designing, building and optimising scalable data pipelines using Python, SQL and modern orchestration tools
  • Developing robust data models aligned with industry best practices
  • Ensuring high standards of data quality through testing, monitoring and alerting
  • Driving engineering best practices, contributing to code reviews and mentoring other engineers
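
The data-quality point above typically means automated checks gating each load, with failures surfaced to monitoring and alerting. A minimal, hypothetical sketch (column names and rules are invented for illustration):

```python
# Sketch of a pipeline-side data-quality gate: validate a batch of rows
# against simple expectations before loading, returning readable failures.

def check_quality(rows):
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []
    for i, row in enumerate(rows):
        if row.get("stake") is None:
            failures.append(f"row {i}: missing stake")
        elif row["stake"] < 0:
            failures.append(f"row {i}: negative stake {row['stake']}")
    return failures

batch = [{"stake": 10.0}, {"stake": -2.5}, {"stake": None}]
problems = check_quality(batch)
print(problems)  # ['row 1: negative stake -2.5', 'row 2: missing stake']
```

In production this kind of gate is usually expressed through a testing framework (e.g. dbt tests or Great Expectations) rather than hand-rolled, but the shape is the same.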

YOUR SKILLS AND EXPERIENCE:

You will bring strong capability in:

  • Python and advanced SQL
  • Building and maintaining production-grade data pipelines
  • Modern data orchestration tools (e.g. Dagster, Airflow, Prefect)
  • Data modelling methodologies (Kimball, Data Vault, etc.)
  • Engineering best practices including testing, version control and clean code
  • AWS experience

THE BENEFITS:

You will receive a salary of up to £100,000 depending on experience, along with a comprehensive benefits package including private health insurance, income protection, flexible working, enhanced holiday entitlement and a fully supported home-office setup.

HOW TO APPLY:

Please register your interest by sending your CV to Molly Bird via the apply link on this page.

Data Scientist – High-Growth Technology Consultancy | Hybrid (UK)
IT Graduate Recruitment
North East England
Hybrid
Graduate - Junior
Private salary
RECENTLY POSTED

We’re a fast-growing technology consultancy delivering advanced engineering solutions in highly demanding environments. Our model focuses on deploying top-tier engineers into mission-critical projects, combining deep technical expertise with a strong delivery mindset.

We’re now hiring a Data Scientist to design and build scalable data solutions for leading organisations.

Why Join

  • Work on high-impact, complex data projects
  • Hybrid working with flexibility
  • Collaborate with high-performing engineering teams
  • Exposure to modern data stacks and cloud technologies
  • Clear progression in a growing consultancy

The Role

You’ll design, build, and optimise data pipelines and systems that support critical business decisions.

Key Responsibilities

  • Build and maintain data pipelines and data-intensive systems
  • Process and transform large datasets efficiently
  • Ensure data quality, reliability, and performance
  • Troubleshoot and improve data products
  • Collaborate with stakeholders to deliver solutions

About You

  • 0–3 years’ experience in data engineering or a related field
  • Degree from a Russell Group university
  • Strong academic background, including A*/A/B grades at A-Level

Experience in the following areas is advantageous; however, training is provided within the role, so do not worry if you do not meet all of these:

  • Experience building data pipelines or data applications
  • Strong skills in Python and/or SQL
  • Familiarity with tools like Spark, Databricks, or Airflow
  • Experience with AWS, Azure, or GCP
  • Strong problem-solving and communication skills

What We Offer

  • Competitive salary + progression opportunities
  • Exposure to complex, large-scale projects
  • Career growth in a fast-scaling environment
  • Hybrid working and collaborative culture

Apply Now: If you’re excited by solving complex data challenges in a high-performance environment, we’d love to hear from you.

Data Engineering, Data Pipelines, ETL, Python, SQL, Spark, Databricks, Airflow, AWS, Azure, GCP, Cloud Technologies, Data Processing, Data Transformation, Data Quality, Scalable Systems, Data Applications, Data-Intensive Systems, Data Engineer, Engineering Consultant, High-Impact Projects, Problem Solving, Stakeholder Collaboration, Analytical Thinking, Communication, Delivery Mindset, Technical Expertise, Hybrid Working, Career Growth

GCP Data Engineer - up to £85,000 Bonus - Hybrid/London
Involved Solutions
London
Hybrid
Mid - Senior
£75,000 - £85,000
RECENTLY POSTED

GCP Data Engineer (BigQuery / Dataflow)
Salary: Up to £85,000 + Bonus + Benefits
Location: London - Hybrid (3 days per week onsite)
Working Hours: 40 hours per week - Full time
Job Type: Permanent

A globally established organisation is seeking an experienced GCP Data Engineer to help build modern, scalable cloud data platforms within a complex enterprise environment.

This GCP Data Engineer role will focus on designing data architectures on Google Cloud Platform, building high-performance pipelines, and enabling reliable, secure and governed data solutions that support business growth and decision-making.

Responsibilities for the GCP Data Engineer:

  • Design and build scalable data pipelines, ETL/ELT workflows and cloud data architectures on GCP
  • Develop solutions using services such as BigQuery, Dataflow, Spanner and Cloud Storage
  • Build and maintain code using Python, Java or Scala for data transformation and processing
  • Optimise data pipelines, queries and workloads for performance and scalability
  • Implement data quality, validation and governance controls
  • Collaborate with cross-functional teams to deliver end-to-end data solutions
  • Ensure alignment with security, privacy and compliance standards
  • Troubleshoot issues across pipelines and processing workflows
  • Maintain documentation across data flows, platforms and architectures

Essential Skills for the GCP Data Engineer:

  • Strong experience as a Data Engineer within GCP environments
  • Hands-on experience with BigQuery, Dataflow and Spanner
  • Strong programming capability in Python, Java or Scala
  • Experience with Apache Beam / Dataflow and distributed processing frameworks
  • Experience designing and managing ETL / ELT pipelines
  • Solid understanding of data modelling and database design
  • Experience with workflow orchestration tools such as Apache Airflow
  • Strong understanding of data governance, security and scalability

Desirable Skills for the GCP Data Engineer:

  • Experience with Pub/Sub, Cloud Composer or Cloud Data Fusion
  • Exposure to CI/CD, DevOps and Infrastructure as Code
  • Experience with real-time streaming architectures
  • Knowledge of modern data governance frameworks

If you are an experienced GCP Data Engineer looking to build enterprise-scale data platforms using modern Google Cloud technologies, this role offers strong exposure to complex delivery and cloud transformation.

GCP, Data Engineer, GCP Data Engineer

Senior GCP Data Engineer
VIQU IT Recruitment
Manchester
Hybrid
Senior
£65,000

Lancashire – Permanent – Hybrid
Competitive Salary

VIQU have partnered with a leading organisation seeking a Senior Data Engineer to join their Data and Platform Engineering team during an exciting period of cloud and data platform transformation. In this hands-on role, you will design, build, and deliver modern data platforms within a cloud-first, Data Mesh environment, work closely with product managers, architects, and engineers, take ownership of your projects, and mentor junior colleagues, making a real impact on both the technology and the team.

Key Responsibilities:

• Lead the design, development, and delivery of cloud-based data platforms and data products as a Senior Data Engineer.
• Own the full data product lifecycle, from initial design through to decommissioning.
• Build and maintain robust ETL / ELT pipelines using SQL, Python, and modern tooling.
• Collaborate closely with product managers, architects, and engineers to solve complex technical and business challenges.
• Act as the go-to technical expert for junior engineers, providing mentorship, code reviews, and quality assurance.
• Produce clear, well-documented solutions for both technical and non-technical audiences.
• Support CI/CD, environment control (dev/test/prod), and effective change management practices.
• Contribute to cloud platform development, with a strong preference for GCP (BigQuery), within a Data Mesh architecture.

Key Requirements:

• 5+ years’ experience as a Data Engineer with a strong focus on ETL / ELT.
• Advanced SQL and Python development skills.
• Hands-on experience with DBT, GIT, Terraform, Docker, IAM, and Airflow (Composer).
• Proven experience working on cloud platforms – ideally GCP (BigQuery), but Azure or AWS also considered.
• Strong understanding of Data Mesh, Test Driven Design, and Agile delivery.
• Experience with documentation, CI/CD pipelines, and multi-environment controls.
• Excellent communication skills and the ability to lead by example within engineering teams.
• Experience supporting mergers, integrations, or large-scale organisational change is highly desirable.

Senior GCP Data Engineer
London – Permanent – Hybrid
Competitive Salary

Apply today to speak with VIQU in confidence, or contact Belle Hegarty.
Know someone exceptional for this position? Refer them and receive up to £1,000 if successful (terms apply).
Follow us on LinkedIn @VIQU IT Recruitment for more exciting opportunities.

Senior GCP Data Engineer
VIQU IT
Bolton
Hybrid
Senior
£65,000 - £68,500

Lancashire – Permanent – Hybrid
Competitive Salary

VIQU have partnered with a leading organisation seeking a Senior Data Engineer to join their Data and Platform Engineering team during an exciting period of cloud and data platform transformation. In this hands-on role, you will design, build, and deliver modern data platforms within a cloud-first, Data Mesh environment, work closely with product managers, architects, and engineers, take ownership of your projects, and mentor junior colleagues, making a real impact on both the technology and the team.

Key Responsibilities:

• Lead the design, development, and delivery of cloud-based data platforms and data products as a Senior Data Engineer.
• Own the full data product lifecycle, from initial design through to decommissioning.
• Build and maintain robust ETL / ELT pipelines using SQL, Python, and modern tooling.
• Collaborate closely with product managers, architects, and engineers to solve complex technical and business challenges.
• Act as the go-to technical expert for junior engineers, providing mentorship, code reviews, and quality assurance.
• Produce clear, well-documented solutions for both technical and non-technical audiences.
• Support CI/CD, environment control (dev/test/prod), and effective change management practices.
• Contribute to cloud platform development, with a strong preference for GCP (BigQuery), within a Data Mesh architecture.

Key Requirements:

• 5+ years’ experience as a Data Engineer with a strong focus on ETL / ELT.
• Advanced SQL and Python development skills.
• Hands-on experience with DBT, GIT, Terraform, Docker, IAM, and Airflow (Composer).
• Proven experience working on cloud platforms – ideally GCP (BigQuery), but Azure or AWS also considered.
• Strong understanding of Data Mesh, Test Driven Design, and Agile delivery.
• Experience with documentation, CI/CD pipelines, and multi-environment controls.
• Excellent communication skills and the ability to lead by example within engineering teams.
• Experience supporting mergers, integrations, or large-scale organisational change is highly desirable.

Senior GCP Data Engineer
London – Permanent – Hybrid
Competitive Salary

Apply today to speak with VIQU in confidence or contact Belle Hegarty at (url removed).
Know someone exceptional for this position? Refer them and receive up to £1,000 if successful (terms apply).
Follow us on LinkedIn @VIQU IT Recruitment for more exciting opportunities.

Business Development Manager - Construction
Remarkable Jobs
Wokingham
In office
Mid - Senior
£35,000 - £45,000

Sales Manager - Technical Instruments

Location: Wokingham, Berkshire

Salary: Up to £45,000 base + Commission

Hours: Full-time, Monday to Friday

Work Location: Office Based

Full time / Permanent

We are seeking an experienced Sales Manager - Technical Instruments to join a respected organisation operating within the building services and engineering sector. This is an exciting opportunity for a commercially driven Sales Manager to take ownership of a specialist product portfolio used widely across the HVAC, commissioning and environmental testing markets.

The successful Sales Manager will be responsible for developing and growing sales of a range of high-quality testing and measurement instruments, used by engineers to assess building performance, air quality, airflow, temperature, pressure and environmental conditions.

This role will be primarily office-based, focusing on developing client relationships, managing incoming enquiries and proactively identifying sales opportunities within key engineering sectors.

Sales Manager Role

As a Sales Manager, you will be responsible for driving the sales of specialist testing and measurement instrumentation across the UK market. You will build relationships with technical buyers, understand project requirements and identify opportunities to expand the company’s presence within the building services and environmental monitoring sectors.

Sales Manager Key Responsibilities

  • Manage and grow sales of a portfolio of technical testing and measurement instruments
  • Develop relationships with engineers, contractors and consultants who require specialist testing equipment.
  • Handle incoming enquiries and convert them into sales opportunities.
  • Identify and develop new business opportunities across multiple engineering sectors.
  • Provide technical guidance to customers regarding the most appropriate instrumentation solutions.
  • Work closely with internal technical teams to ensure the correct products and solutions are recommended.
  • Manage the full sales cycle from enquiry through to order and ongoing account development.
  • Maintain accurate sales records and manage a consistent pipeline of opportunities.

Industries You Will Be Selling Into

The Sales Manager will develop relationships across a wide range of technical industries, including:

  • HVAC & Building Services Engineering, Commissioning & Testing Engineers (TAB - Testing, Adjusting & Balancing)
  • Mechanical & Electrical (M&E) Contractors
  • Facilities Management & Building Maintenance Organisations
  • Environmental & Air Quality Testing Companies
  • Building Services / MEP Engineering Consultancies
  • Energy & Sustainability Consultancies
  • Laboratories and Product Testing Facilities
  • Manufacturers of HVAC and ventilation equipment

What They Are Looking For

Essential

  • Experience in technical sales within engineering, HVAC, instrumentation or building services
  • Proven track record of developing client relationships and generating new business.
  • Experience selling to engineers, contractors, consultants or facilities management organisations.
  • Strong commercial awareness and consultative sales approach.

Desirable

  • Experience selling test & measurement equipment or environmental monitoring instruments
  • Knowledge of HVAC, building services or commissioning processes

Sales Manager Key Attributes

  • Strong relationship builder with excellent communication skills.
  • Commercially driven and proactive in identifying opportunities.
  • Technically curious with the ability to understand engineering applications.
  • Self-motivated and capable of managing a varied and dynamic sales pipeline.

If you are an experienced Sales Manager with a background in technical or engineering sales, this is a fantastic opportunity to join a well-respected organisation and work with specialist products that support the performance and efficiency of buildings.

AWS Data Engineer
Bis Henderson
Leicestershire
Hybrid
Mid - Senior
£70,000 - £80,000

Location: Leicestershire, hybrid Salary: circa £70,000 - £80,000 Summary: We are looking for a hands-on Data Engineer to lead the build of a modern AWS-based data platform, taking ownership from core infrastructure through to curated, business-ready datasets. Key Responsibilities: Design, build, and operate a scalable AWS data platform, including storage, compute, security, and monitoring Develop robust, idempotent ETL pipelines across diverse data sources (APIs, databases, files, and event streams) Implement medallion architecture (bronze, silver, gold) to transform raw data into high-quality, analytics-ready datasets Establish infrastructure-as-code and CI/CD practices to ensure reproducibility and continuous improvement Own data modelling, master data management, and aggregation layers to support reporting, analytics, and ML use casesSkills and Experience: Proven experience building and running production data platforms on AWS end-to-end, ideally in a multi-site business environment Strong proficiency in Python and SQL, with hands-on experience in Spark/PySpark and modern table formats (e.g. Delta Lake, Iceberg, Hudi) Expertise in AWS data services (e.g. S3, Glue, Redshift, Lambda, Step Functions, Kinesis) and infrastructure-as-code (Terraform or CDK) Experience with workflow orchestration tools such as Airflow or similar Ability to design scalable architectures, make pragmatic trade-offs, and communicate effectively with both technical and non-technical stakeholdersProcessing Your Data Bis Henderson Recruitment is a leading provider of recruitment, interim management and consultancy services to the supply chain and logistics industry. Should you respond to this advertisement we may store your CV and contact details and will process this data for recruitment purposes only. Should we process your data, then we will always tell you that we are doing so. 
Please visit our website to read our Privacy Policy in full; in this Policy you will find information about our compliance with the UK General Data Protection Regulation. All applicants must have an unrestricted right to work in the UK, as our client will not support visa sponsorship for this role.
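The responsibilities above ask for "robust, idempotent ETL pipelines" feeding a medallion (bronze/silver/gold) layout. As a minimal, stdlib-only sketch of the core idempotency property (re-running a load for the same date overwrites that date's partition instead of appending duplicates), the snippet below uses a plain dict to stand in for an S3 prefix or a Delta/Iceberg partition; every name in it is illustrative, not taken from the role.

```python
# Sketch of an idempotent, partitioned load step: a retry or backfill for the
# same run_date replaces that partition rather than duplicating rows.
# The in-memory dict stands in for an S3 prefix or table partition.

def load_partition(table: dict, run_date: str, rows: list) -> dict:
    """Overwrite the run_date partition with the freshly computed rows."""
    table[run_date] = list(rows)  # delete-and-replace, never append
    return table

silver = {}
load_partition(silver, "2024-05-01", [{"id": 1}, {"id": 2}])
load_partition(silver, "2024-05-01", [{"id": 1}, {"id": 2}])  # retry / backfill

assert len(silver["2024-05-01"]) == 2  # no duplicates after the re-run
```

In a real Airflow DAG the same effect is usually achieved by keying each task's writes to its logical date, so reruns are safe by construction.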

Principal Data Engineer (we have office locations in Cambridge, Leeds and London)
Genomics England
London
Remote or hybrid
Senior
From £103,400
+2

Company Description
Genomics England is a global leader in enabling genomic medicine and research, focused on creating a world where everyone benefits from genomic healthcare. Building on the 100,000 Genomes Project, we support the NHS's world-first national whole genome sequencing service and run the growing National Genomic Research Library, alongside delivering numerous major genomics initiatives. By connecting research and clinical care at national scale, we enable immediate healthcare benefits and advances for the future.

Our mission is to provide the evidence and digital systems so that by 2035 genomics could play a role in up to half of all healthcare interactions, whilst securing the UK's position as the best place to discover, prove and benefit from genomic innovations. We are accelerating our impact and working with patients, doctors, scientists, government and industry to improve genomic testing, and help researchers access the health data and technology they need to make new medical discoveries and create more effective, targeted medicines for everybody. Behind the healthcare and research outcomes, Genomics England delivers through designing, developing and operating complex healthcare software systems. We're on the cusp of big changes, with the real prospect of genomics becoming the fabric of everyday healthcare throughout the lifetime - from birth to old age.

Job Description
About the role
We're looking for an experienced Principal Data Engineer with a proven track record of senior technical leadership across Genomics England's data engineering capability. This is a senior individual contributor role operating at Principal level, where you will set technical direction and influence data engineering strategy across complex, regulated, national-scale clinical and research platforms. You will shape how we design, build and operate scalable, secure and highly reliable data platforms.
This includes defining standards, patterns and data engineering approaches that multiple teams will align to and deliver against. This is a hands-on leadership role. You will stay close to data engineering, code and delivery where it matters, while also operating at an architectural and strategic level to guide direction, influence technical decisions, raise data engineering maturity, and build capability across the organisation. Working across data engineering, data architecture, product and governance, you will ensure our data platforms are robust, interoperable and aligned to organisational and national priorities, including highly sensitive clinical and research data.

What you will be doing
- Setting technical direction and influencing data engineering standards across data platforms, pipelines and data products used by multiple teams
- Leading the design and delivery of complex, large-scale data solutions across cloud and hybrid environments
- Influencing architectural direction for enterprise data platforms, including Lakehouse, distributed and event-driven systems
- Acting as a technical authority across squads, influencing alignment to data engineering standards and data architectural principles
- Working closely with senior engineers, architects, bioinformaticians, data colleagues and product teams to influence delivery and technical decision making
- Ensuring all data engineering approaches meet strict governance, privacy, security and regulatory requirements
- Driving adoption and influencing use of modern engineering practices including DataOps, DevOps, CI/CD, infrastructure as code, automation and observability
- Improving platform resilience, performance and operational maturity across critical data systems
- Evaluating and influencing adoption of new tools, technologies and data engineering approaches where appropriate
- Mentoring experienced data engineers and influencing capability uplift across teams
- Contributing to organisation-wide technical communities, standards and design governance

What we are looking for
- Proven experience operating at senior or Principal level in data engineering
- Strong track record of designing and successfully delivering enterprise-scale data platforms and systems
- Deep experience with modern data architecture, cloud infrastructure and scalable data processing systems
- Strong hands-on programming experience, e.g. Python and SQL
- Strong expertise in data modelling, data lifecycle management, metadata and governance
- Experience working in complex regulated environments such as healthcare, genomics or research
- Strong understanding of security by design and handling highly sensitive data
- Proven ability to influence architectural and data engineering direction across multiple teams or domains
- Strong stakeholder engagement skills, with experience influencing senior technical and non-technical stakeholders in matrix environments

Desirable
- Experience with genomics, bioinformatics or clinical data platforms
- Familiarity with GA4GH, OMOP or FHIR standards
- Experience with orchestration tools such as Airflow, Prefect or AWS Glue
- Experience with modern data architecture patterns such as data mesh or Lakehouse approaches

This is a rare opportunity to define how data engineering operates at national scale in a regulated, mission-driven organisation. As an experienced Principal Data Engineer with a strong and demonstrable track record, you will help set direction, shape standards others build to, and directly influence the maturity and reliability of critical data platforms, while staying close enough to the engineering to make a real hands-on impact.
Qualifications
Degree in Computer Science, IT or equivalent practical industry experience. Architecture or data-related certifications are beneficial; TOGAF is especially valued.

Additional Information
Salary From: £103,400 pa
Closing Date: Tuesday 12th May @ 23:00 (UK time)

Being an integral part of such a meaningful mission is extremely rewarding in itself, but in order to support our people, we're continually improving our benefits package. We pride ourselves on investing in our people and supporting them to achieve their career goals, as well as offering a benefits package including:
- Generous Leave: 30 days' holiday plus bank holidays, plus additional leave for long service, and the option to apply for up to 30 days of remote working abroad annually (approval required).
- Family-Friendly: Blended working arrangements, flexible working, and enhanced maternity, paternity and shared parental leave benefits.
- Pension & Financial: Defined contribution pension (Genomics England double-matches up to 10%, though you can contribute more if you wish), Life Assurance (3x salary), and a Give As You Earn scheme.
- Learning & Development: Individual learning budgets, support for training and certifications, and reimbursement for one annual professional subscription (approval required).
- Recognition & Rewards: Employee recognition programme and referral scheme.
- Health & Wellbeing: Subsidised gym membership, a free Headspace account, and access to an Employee Assistance Programme, eye tests and flu jabs.

Equal opportunities and our commitment to a diverse and inclusive workplace
Genomics England is actively committed to providing and supporting an inclusive environment that promotes equity, diversity and inclusion best practice, both within our community and in any other area where we have influence.
We are proud of our diverse community, where everyone can come to work and feel welcomed and treated with respect regardless of any disability, ethnicity, gender, gender identity, religion, sexual orientation, or social background. Genomics England's policies of non-discrimination and equity will be applied fairly to all people, regardless of age, disability, gender identity or reassignment, marital or civil partnership status, being pregnant or recently becoming a parent, race, religion or beliefs, sex or sexual orientation, length of service, whether full- or part-time, whether employed under a permanent or a fixed-term contract, or any other relevant factor. Genomics England does not tolerate any form of discrimination, harassment, victimisation or bullying at work. Such behaviour undermines our mission and core values and diminishes the dignity, respect and integrity of all parties. Our People policies outline our commitment to inclusivity.

We aim to remove barriers in our recruitment processes and to be flexible with our interview processes. Should you require any adjustments that may help you to fully participate in the recruitment process, we encourage you to discuss this with us.

Culture
We have four key behaviours that represent what we would like Genomics England to feel like and the culture we want to encourage in order for us to achieve our mission. These behaviours help us all work well together, deliver on our outcomes, celebrate our successes and share feedback with each other. You can read about these and other aspects of our culture here

Data Engineer
Saffron housing
Norwich
In office
Mid - Senior
£56,000

Long Stratton, Norwich, Norfolk
£56,000 per annum
Full Time: 37hrs per week

Saffron is looking for a talented Data Engineer to help drive the next stage of our data transformation. This role is all about building and optimising our Azure-based data platform, developing high-performing pipelines in Azure Data Factory, and supporting our move toward Microsoft Fabric. You will work closely with BI Analysts and teams across the business to deliver reliable, high-quality data that powers smarter decisions and sharper insights. It is a chance to shape a modern, scalable data environment and make a real impact on how we use data across the organisation.

Key Responsibilities:
- Design, build, and maintain a scalable Azure-based data warehouse that meets the current and future requirements of the Data & Analytics team.
- Lead the introduction, adoption, and optimisation of Microsoft Fabric (e.g. Lakehouse, Warehouse, Data Engineering, Pipelines).
- Apply CI/CD practices (e.g. Azure DevOps) for version control, deployment automation, and environment management.
- Implement data quality checks, pipeline observability, alerting, and automated monitoring to ensure consistent platform reliability.
- Work collaboratively with data owners and the wider data team to ensure data definitions, lineage, and ownership are clearly established.
- Provide technical guidance and coaching to the wider data team on data engineering best practices.

For a full list of responsibilities, please see the attached Role Profile.

Our Ideal Candidate Will Have:

Education and Qualifications:
- Degree in Computer Science, Data Engineering, Mathematics, or a related discipline, or equivalent experience (E)
- Microsoft certifications in SQL, Fabric (including Power BI), or other Azure Data Services (D)

Experience:
- Advanced SQL skills, including optimisation of complex queries (E)
- Experience building data pipelines and ETL/ELT workflows using tools such as Azure Data Factory, Databricks, Airflow, Luigi, or similar (E)
- Strong understanding of data modelling (E)
- Programming skills in Python and/or Scala for data processing (D)
- Experience with machine learning pipelines or MLOps frameworks (D)

Personal Attributes:
- Confident communicator able to engage both technical and non-technical audiences.
- Proactive, innovative, and committed to continuous improvement.
- Collaborative, with mentoring and leadership capabilities.
- Customer-focused, with a commitment to improving services through data.
- Able to manage a busy, fast-paced workload and multiple projects to meet deadlines.
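The Saffron responsibilities above include implementing data quality checks with alerting. As a hedged illustration of one common shape for such checks (declarative, named predicates evaluated per row, yielding a failure count per check that an alerting system could watch), here is a small Python sketch; the column names and rules are invented for the example, not taken from Saffron's platform.

```python
# Rough sketch of declarative row-level data-quality checks: each check is a
# named predicate, and the summary maps check name -> number of failing rows.

def run_checks(rows, checks):
    """Return a {check_name: failure_count} summary suitable for alerting."""
    failures = {}
    for name, predicate in checks.items():
        failures[name] = sum(1 for row in rows if not predicate(row))
    return failures

rows = [
    {"tenant_id": "T1", "rent": 550.0},
    {"tenant_id": None, "rent": 620.0},   # fails the null check
    {"tenant_id": "T3", "rent": -10.0},   # fails the range check
]
checks = {
    "tenant_id_not_null": lambda r: r["tenant_id"] is not None,
    "rent_non_negative": lambda r: r["rent"] >= 0,
}
summary = run_checks(rows, checks)
assert summary == {"tenant_id_not_null": 1, "rent_non_negative": 1}
```

In Azure Data Factory or Fabric the same pattern typically lives in a pipeline activity whose non-zero failure counts trigger an alert.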

Data Engineer
Teksystems
London
In office
Mid - Senior
Private salary

Job Title: Data Engineer (SQL, Python)

This role offers the opportunity to work with very large, complex data sets to influence product decisions for digital products used by hundreds of millions of people every day. You will contribute directly to user growth, retention and experience by building and maintaining robust data warehouse solutions, data models and pipelines. You will join a high-calibre data engineering team solving challenging web and mobile data problems at a scale that few organisations can match, with a strong focus on adoption and buyer personalisation.

Responsibilities
- Maintain and support existing ETL pipelines and data processes running in production, ensuring stability and reliability.
- Work with very large-scale data sets, applying appropriate SQL techniques and performance optimisation strategies.
- Manage data warehouse plans and roadmaps for a product or group of products, ensuring alignment with business priorities.
- Interface regularly with engineers, product managers and product analysts to understand data needs and translate them into technical solutions.
- Build deep data expertise in your areas of ownership and take responsibility for data quality across those domains.
- Design, build and launch new production-grade data models that support analytics, reporting and personalisation use cases.
- Design, build and launch new data extraction, transformation and loading (ETL) processes in production environments.
- Rewrite and refactor data pipelines when upstream tables, schemas or privacy rules change, updating SQL and Python code accordingly.
- Implement changes to data visualisation dashboards, including wholesale reimplementation to meet new visualisation standards and guidelines.
- Contribute to task-focused delivery, taking ownership of tightly scoped backlog items and driving them to completion within agreed timelines.
- Use data analysis to identify deliverables, gaps and inconsistencies, and propose solutions to improve data coverage and quality.
- Communicate data-driven insights and technical considerations clearly and effectively to both technical and non-technical audiences.

Essential Skills
- At least 5 years' experience with programming languages, with strong hands-on experience in Python.
- At least 5 years' experience writing efficient, optimised SQL statements for large and complex data sets.
- Hands-on experience with schema design and dimensional data modelling.
- Strong ability to analyse data to identify deliverables, gaps and inconsistencies.
- Ability to understand and work with Python language constructs, including constraints, conditional statements and code structure, in order to read, debug and rewrite existing code.
- Proficiency in building and maintaining ETL pipelines using tools or frameworks similar to Airflow (for example, internal orchestration tools with comparable concepts).
- Experience working with large-scale data sets and applying SQL techniques suitable for high-volume environments.
- Experience with data visualisation and dashboard development, ideally using Tableau.
- Excellent communication skills, including the ability to explain data-driven insights and technical topics clearly to stakeholders.

Additional Skills & Qualifications
- Familiarity with industry-standard data pipeline orchestration tools such as Airflow or similar frameworks.
- Experience working with cross-functional teams including product management, analytics and marketing.
- Strong problem-solving skills and the ability to work independently on task-focused backlog items.
- Attention to detail when implementing changes to dashboards, visualisation standards and reporting guidelines.

Why Work Here?
You will work on cutting-edge data technologies and large-scale systems that operate at a level of complexity and volume matched by few organisations.
The environment encourages collaboration with some of the brightest minds in data engineering, product and analytics, offering continuous learning and exposure to modern tools and practices.

Work Environment
You will work in a modern, technology-driven environment focused on large-scale web and mobile data. The team uses SQL, Python and advanced data warehousing technologies to process and analyse massive data sets.

Location
London, UK

Trading as TEKsystems. Allegis Group Limited, Maxis 2, Western Road, Bracknell, RG12 1RT, United Kingdom. No. (phone number removed). Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as "Allegis Group"). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands. If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice, available at (url removed). To access our Online Privacy Notice, which explains what information we may collect, use, share, and store about you, and describes your rights and choices about this, please go to (url removed). We are part of a global network of companies and, as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland and the European Economic Area, subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland and the USA. If you would like to exercise your privacy rights, please visit the "Contacting Us" section of our Online Privacy Notice at (url removed)/en-gb/privacy-notices for details on how to contact us.
To protect your privacy and security, we may take steps to verify your identity, such as a password and user ID if there is an account associated with your request, or identifying information such as your address or date of birth, before proceeding with your request. If you are resident in the UK, EEA or Switzerland, we will process any access request you make in accordance with our commitments under the UK Data Protection Act, EU-U.S. Privacy Shield or the Swiss-U.S. Privacy Shield
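The essential skills in the posting above include schema design and dimensional data modelling. For readers unfamiliar with the term, a minimal star-schema example (one fact table joined to one dimension table, then aggregated by a dimension attribute) can be expressed with Python's built-in sqlite3 module; the tables, columns and figures below are invented purely for illustration.

```python
# Toy star schema: fact_event holds measurements keyed by user_key,
# dim_user holds descriptive attributes. Analytical queries join the fact
# to the dimension and aggregate by a dimension attribute.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_user (user_key INTEGER PRIMARY KEY, country TEXT);
    CREATE TABLE fact_event (event_id INTEGER, user_key INTEGER, amount REAL);
    INSERT INTO dim_user VALUES (1, 'GB'), (2, 'US');
    INSERT INTO fact_event VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0);
""")

# Total amount per country: the canonical fact-to-dimension rollup.
rows = conn.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_event f JOIN dim_user d USING (user_key)
    GROUP BY d.country ORDER BY d.country
""").fetchall()
assert rows == [("GB", 12.5), ("US", 3.0)]
```

The same join-and-aggregate shape scales up to the very large warehouse tables the role describes, where the SQL optimisation skills it asks for come into play.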

Junior Data Engineer
Hays Technology
London
Hybrid
Junior
£310/day
+3

Junior Data Engineer - Public Sector
Contract: Initial 7 months (extension possible)
Rate: £310 per day, Inside IR35
Location: Remote with travel to Waterloo (2-3 days per month)
Security Clearance: SC-eligible (5 years UK residency required)

I am working with a key consultancy delivering a major UK public sector programme, who are looking for a Junior Data Engineer / Scientist to join a mixed delivery team building and operating secure, reliable data platforms that support critical public services. This role is designed for someone early in their data career who wants to develop strong engineering fundamentals in a real production environment.

The role - a junior, generalist data engineering position
This is an engineering-led role, not a specialist or senior position. The team is ideally looking for a generalist in their first few years within data engineering or data science, who is building breadth across data platforms, pipelines and operations. You'll focus on:
- Designing, building and maintaining data pipelines
- Supporting the operation of data lakes and data warehouses
- Implementing and improving ETL / ELT processes
- Using Python and SQL to transform, validate and move data
- Working with analysts and developers to turn data requirements into technical solutions
- Monitoring data quality, documenting data models and lineage, and resolving issues
- Automating data workflows and operational tasks
- Participating in Agile delivery, sprint work and collaboration
- Supporting incidents and helping improve platform reliability over time
- Working within public sector data governance, security and privacy standards

This role offers exposure to how data platforms are built, operated and supported in a regulated environment - forming the foundations of a long-term data engineering career.
What this role is not
To avoid misalignment, it's important to be clear about what this role is not focused on:
❌ Not a Data Analyst role
❌ Not a Power BI / dashboard developer role
❌ Not an insight, reporting or MI position
❌ Not a modelling, ML or research-focused role
❌ Not an LLM, AI or advanced data science role

While you may work alongside analysts and data scientists, this role does not centre on:
- Building dashboards
- Producing insights or reports
- Statistical modelling
- Predictive or machine learning solutions

The emphasis is on data engineering foundations and platform delivery.

Ideal candidate profile
This role is best suited to someone who:
- Is in their first few years of a data engineering or data science career
- Wants to build core engineering skills rather than specialise immediately
- Has hands-on experience with SQL and Python
- Understands basic data modelling and ETL concepts
- Is comfortable learning through delivery in a production environment
- Is interested in how data platforms work end-to-end, including operations and support
- Is keen to grow within public sector data platforms

Who this role is unlikely to suit
This role is unlikely to be appropriate for candidates who:
- Are very senior data engineers or architects
- Have primarily worked in advanced ML, AI, or research-focused roles
- Are specialised Power BI, reporting or MI developers
- Are looking for a role centred on analysis, insights or modelling
- Are seeking leadership, ownership of platform strategy, or advanced optimisation work

Applications that demonstrate significant seniority or deep specialisation rather than junior-to-mid generalist experience may not be progressed.

Required skills and experience
Your CV should clearly demonstrate:
- A degree in a technical discipline (Computer Science, Data Science, Mathematics or similar)
- Hands-on experience with SQL
- Experience using Python, Java or Bash
- Understanding of ETL processes and data modelling fundamentals
- Experience with version control (e.g. Git)
- Comfort working in Agile / DevOps environments
- Awareness of data security and privacy
- Eligibility for UK SC clearance

Nice to have (but not essential)
- Exposure to AWS, Azure or GCP
- Familiarity with tools such as Airflow, dbt, Spark
- Awareness of CI/CD pipelines or containerisation
- Experience in public sector or regulated environments

Important note for applicants
This role is deliberately positioned as a junior, generalist data engineering role. Please ensure your CV clearly demonstrates hands-on data engineering fundamentals, rather than senior leadership, advanced AI/ML work, or analytics-only experience.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)

Data Engineer
Maxwell Bond
Manchester
Hybrid
Mid - Senior
£375/day - £400/day

My client is a data and analytics company that has delivered data-driven insights to large global brands for over a decade. They need a contract Data Products Engineer to support the development and evolution of behavioural data products.

6 months, Outside IR35
£400 per day
2 days a week in Manchester

The role
You will work within the Data Products team, turning raw signals from capture technologies into well-defined, high-quality datasets. You will use SQL, Python, and internal tooling, including AI-assisted tools, to define schemas, apply business logic, enforce quality checks, and deliver client-ready data feeds.

Key responsibilities
- Design and evolve data product schemas and fields
- Turn product requirements into structured datasets
- Build and maintain data feeds using internal tools, SQL, and Python
- Monitor and improve data quality using checks and diagnostics
- Investigate issues across large datasets
- Work with Product, Data Engineering, Apps, and ML teams
- Maintain clear and up-to-date documentation
- Follow internal policies on security, quality, and health and safety

About you
You enjoy working closely with data and focus on representing real-world digital behaviour in a clean and reliable way.

You need
- Strong SQL skills with large datasets
- Experience with Python or another data-focused language
- High attention to detail
- Strong focus on data quality and validation
- Clear communication skills
- Willingness to use AI to improve output and speed

Nice to have
- Experience with behavioural or event-level data
- Understanding of privacy and data governance
- Exposure to AWS tools such as S3, EMR or Spark, Athena, Airflow or Azkaban
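The role above centres on defining schemas and enforcing quality checks on event-level behavioural data. One minimal way to picture that, sketched here under stated assumptions, is a declared schema plus a validation step that quarantines non-conforming events rather than silently dropping them; the field names and types below are hypothetical, not the client's actual schema.

```python
# Sketch of schema enforcement for event-level data: events that do not
# match the declared field types are quarantined for investigation instead
# of being silently dropped from the client-ready feed.

SCHEMA = {"event_id": str, "event_type": str, "duration_ms": int}

def validate(event: dict) -> bool:
    """True if every schema field is present with the expected type."""
    return all(
        isinstance(event.get(field), expected)
        for field, expected in SCHEMA.items()
    )

def split_feed(events):
    """Separate client-ready rows from rows needing investigation."""
    good = [e for e in events if validate(e)]
    quarantined = [e for e in events if not validate(e)]
    return good, quarantined

events = [
    {"event_id": "a1", "event_type": "view", "duration_ms": 1200},
    {"event_id": "a2", "event_type": "view", "duration_ms": "1200"},  # wrong type
]
good, bad = split_feed(events)
assert len(good) == 1 and len(bad) == 1
```

In practice the same split is usually expressed in SQL over a staging table, with the quarantine counts feeding the quality diagnostics the ad mentions.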

Data Engineer
Experis
Bath
Hybrid
Mid - Senior
£500/day - £550/day
+1

12 months
Bath - 3 days onsite
£550/day Outside IR35
Active DV clearance required (SC considered)

Role Overview
We are seeking an experienced DV-cleared (SC considered) Data Engineer to support a high-profile, mission-critical programme within a secure government environment. The role will focus on designing, building and maintaining robust data pipelines and platforms that support advanced analytics, intelligence, and operational decision-making.

Key Responsibilities
- Design, build and maintain scalable data pipelines (batch and/or streaming) within secure environments
- Develop and optimise ETL / ELT processes for high-volume, structured and semi-structured datasets
- Work with stakeholders to translate complex requirements into technical data solutions
- Ensure data platforms meet security, accreditation, and information assurance standards
- Support data quality, lineage, monitoring, and performance optimisation
- Contribute to data architecture and platform evolution, including cloud and on-prem solutions
- Collaborate in Agile delivery teams (Scrum / SAFe environments)

Required Skills & Experience
- Proven experience as a Data Engineer in secure or government environments
- Strong skills in Python, SQL, and data pipeline development
- Experience with data integration tools (e.g. Airflow, NiFi, Azure Data Factory, AWS Glue, or similar)
- Hands-on experience with cloud platforms (AWS, Azure, or GCP - secure tenants preferred)
- Knowledge of data modelling, data warehousing, and big data technologies
- Familiarity with DevOps / CI-CD practices within restricted environments
- Strong understanding of data security, governance, and access controls

Desirable Experience
- Experience working within UK defence, intelligence, or national security programmes
- Exposure to Apache Kafka, Spark, Hadoop, or similar technologies
- Experience supporting data science or advanced analytics use cases
- Background working with classified datasets

Apply now to be part of this impactful opportunity

Data Engineer (Apache Spark) - (Financial Services)
Hays Technology
London
Remote or hybrid
Mid
£550/day - £700/day
+2

Your new company
Working for a renowned financial services organisation

Your new role
We are seeking a Data Engineer to support the replacement of a legacy ETL tool with a modern Apache Spark-based data platform. This is a hands-on engineering role focused on building and supporting high-performance Apache Spark jobs, with a strong emphasis on performance optimisation, reliability, and scalability. The work is primarily Java-based, with Python strongly encouraged; experience across Java, Python, and Scala is ideal, with Java and Python being the preferred combination.

The role sits within a small Agile delivery team of four engineers (two onshore and two in Shenzhen), working closely with a Senior Data Engineer. You will be responsible for development work, sprint delivery, demos, documentation, and stakeholder engagement. This position suits a mid-level engineer with strong Spark development experience rather than design, infrastructure, or management responsibilities.

What you'll need to succeed
- Strong hands-on experience with Apache Spark - writing and tuning Spark jobs / PySpark development experience
- Strong Java or Java Spring Boot development background
- Experience with programming in Python or Scala
- Exposure to Big Data technologies and distributed data processing
- Working with containerised environments using Kubernetes / Airflow
- Agile ways of working (Scrum, sprints, demos)
- Comfortable working in a distributed team
- Financial services or professional services experience required

What you'll get in return
Flexible working options available.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers.
By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)

Data Engineer (Big Data/ Hadoop/ Spark Dev)
Hays Technology
London
Remote or hybrid
Mid - Senior
£550/day - £700/day
+2

Your new company
Working for a renowned financial services organisation

Your new role
We're looking for a Data Engineer to design and deliver scalable, high-quality on-prem data solutions for the data platforms that power analytical and business insights. This is a hands-on role suited to someone with strong data engineering and big data expertise, ideally gained within financial services. Using Big Data tools and Spark development, you will ensure data quality through automated validation, monitoring, and testing. You will also enable seamless integration across data warehouses and data lakes, contributing to a robust, scalable, and resilient enterprise data ecosystem.

What you'll need to succeed
- Strong data engineering expertise with Big Data technologies
- Experience designing and building on-prem data platforms, from high-level architecture to detailed technical design
- Strong Spark development experience
- Hands-on experience with Hadoop - configuring multi-node Hadoop clusters, including resource management, security, and performance tuning
- Strong Big Data engineering background using Apache Airflow, Spark, dbt, Kafka, and Hadoop ecosystem tools
- Knowledge of RDBMS systems (PostgreSQL, SQL Server) and familiarity with NoSQL/distributed databases such as MongoDB
- Proven delivery of streaming pipelines and real-time data processing solutions

What you'll get in return
Flexible working options available.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)
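The posting above emphasises streaming pipelines and real-time processing. Stripped of the transport and execution engine (which Kafka and Spark would supply in the stack it names), the basic building block is a windowed aggregation; here is a pure-Python sketch of a tumbling-window count, with illustrative epoch-second timestamps rather than any real feed.

```python
# Sketch of a tumbling-window aggregation: events are bucketed into fixed,
# non-overlapping windows of window_s seconds, keyed by the window's start.
from collections import Counter

def tumbling_window_counts(events, window_s: int) -> Counter:
    """Count events per fixed window; keys are window start timestamps."""
    counts = Counter()
    for ts, _payload in events:
        counts[ts - ts % window_s] += 1  # floor ts to its window start
    return counts

events = [(100, "a"), (101, "b"), (161, "c"), (185, "d")]
counts = tumbling_window_counts(events, window_s=60)
assert counts == Counter({60: 2, 120: 1, 180: 1})
```

Spark Structured Streaming and Kafka Streams express the same idea declaratively, adding watermarks to cope with late-arriving events.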