AWS Glue Jobs
Overview
Discover the latest AWS Glue jobs on Haystack, your go-to IT job board for cloud data engineering roles. Whether you're an experienced AWS Glue developer or looking to specialize in serverless data integration, find top opportunities with leading companies hiring for AWS Glue experts. Start your cloud career growth today with the best AWS Glue job listings curated just for you.
ICT Data Engineer (Fixed Term for 12 Months), Highland-wide - HGH20408
HIGHLAND COUNCIL
UK
Hybrid
Junior - Mid
£30,001 - £40,000
RECENTLY POSTED
Job Description

Post Title: ICT Data Engineer
Location: Highland-wide (any Highland Council office location)
Hours: 35 Hours Per Week
Duration: Fixed Term for 12 Months
Salary: £35,399 - £39,585 per annum

Salary placing will normally be at the first point of the scale.

Contact Details: Paul Patience Email: paul.patience@highland.gov.uk

Job Purpose: The Highland Council’s ICT department has undergone significant transformation, bringing ICT services in-house. A key focus now is maximizing the use of data and systems to improve data insights across all Council services, enabling better data-driven decisions and supporting customer self-services.

The ICT Data Engineer role, as part of a broader data integrations team, will contribute to delivering outcomes for both internal and external customers. This position will support the Council’s operational data integrations and digital transformation projects to achieve the Council’s strategic objectives.

In this role you will assist in the design, development, implementation, and day-to-day support of enterprise-level data integration solutions, using a broad range of technologies and tools across key areas such as operational line-of-business applications, income and payment processing, BI reporting, and website processes.

Your experience with data integration tools and techniques (such as ETL), together with technical skills in programming, relational databases, and data protection/ICT security, will help you succeed in this role.

Please APPLY ONLINE.

The Highland Council understands that diversity fosters creativity and innovation. We are committed to equality of opportunity and being fair and inclusive. We welcome applications from people from all backgrounds, representative of the communities we serve and particularly encourage applications from candidates who are likely to be under-represented in our workforce.

As a disability confident employer, we guarantee to interview all disabled applicants who meet the minimum essential criteria requirements for the post.

Short listed applicants will normally be contacted by email, unless otherwise stated. Please check your emails regularly, including your junk/spam folder.

Requirements
  • Educated to degree level in a numerate/business-related subject, or equivalent experience in data management and ICT security.
  • Knowledge of ETL tools, such as SSIS, Azure Data Factory, or AWS Glue.
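
For candidates newer to AWS Glue, the sketch below shows roughly how a Glue job can be started and monitored from Python with boto3. The job name, arguments, and region are hypothetical placeholders, not details of this vacancy.

    import time

    import boto3

    glue = boto3.client("glue", region_name="eu-west-2")

    # Start a run of an existing Glue job (the job name is a placeholder).
    run = glue.start_job_run(
        JobName="example-etl-job",                  # hypothetical job name
        Arguments={"--source_date": "2025-01-01"},  # Glue job arguments take a "--" prefix
    )

    # Poll until the run reaches a terminal state.
    while True:
        state = glue.get_job_run(JobName="example-etl-job", RunId=run["JobRunId"])
        status = state["JobRun"]["JobRunState"]
        if status in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
            print("Job finished:", status)
            break
        time.sleep(30)
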
AWS Data Engineer
83zero Ltd
Shropshire
Hybrid
Senior
£55,000 - £70,000
RECENTLY POSTED

Data Engineer (AWS)

Location: Telford / Worthing base locations (Hybrid, 2-3 days onsite)
Salary: £55,000 - £70,000 + benefits, perks, healthcare options, unlimited training budget
Security Clearance: Must be eligible for SC Clearance (5+ years UK residency)
Sector: Public Sector & Government Client

Build the Data Infrastructure That Powers the Public Sector

We are looking for experienced Data Engineers to join a long-standing, high-impact public sector partnership. This isn’t just about moving data; it’s about modernizing essential services and delivering secure, reliable data products at scale. You will play a pivotal role in shaping engineering design, mentoring talent, and helping our clients reimagine what’s possible through technology.

The Role

As a Senior member of our engineering team, you will:

  • Design & Implement: Create robust, secure, and performant data integration solutions (both batch and near-real-time).
  • Build & Optimize: Develop and improve end-to-end data pipelines-from ingestion to curation-ensuring high availability through rigorous monitoring and alerting.
  • Collaborate: Work closely with product teams and client stakeholders to align technical decisions with cost, performance, and security requirements.
  • Innovate: Support incident resolution and contribute to our internal Engineering Communities of Practice.
  • Lead: Actively participate in Agile ceremonies and mentor junior colleagues to grow our collective capability.
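
As a rough illustration of the batch pipeline work described above, here is a minimal AWS Glue PySpark job skeleton: read from the Glue Data Catalog, transform, and write curated Parquet to S3. The database, table, and bucket names are invented for the example, not taken from this role.

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Ingest: read the raw table registered in the Glue Data Catalog.
    raw = glue_context.create_dynamic_frame.from_catalog(
        database="raw_zone", table_name="payments"  # hypothetical names
    )

    # Transform: drop records with no payment id (example rule only).
    curated = raw.toDF().filter("payment_id IS NOT NULL")

    # Curate: write Parquet to the curated zone, partitioned by ingest date.
    curated.write.mode("append").partitionBy("ingest_date").parquet(
        "s3://example-curated-bucket/payments/"     # hypothetical bucket
    )

    job.commit()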

Your skills and experience:

  • Strong SQL and hands-on experience with data modelling.
  • Hands-on with ETL/ELT tooling (at least one of Talend, Pentaho DI, Informatica, AWS Glue, or SAS).
  • Experience with databases/data platforms (ideally Oracle or Cloudera).
  • Knowledge of cloud platforms (ideally AWS).
  • Good experience with programming/scripting languages (e.g. Python, Bash).
  • Strong grasp of data engineering fundamentals, including integration, transformation, orchestration, and version control.
  • Excellent client-facing and consultancy skills.

NOTE: This role requires Security Check (SC) clearance. To be eligible, you must have resided continuously in the UK for the last 5 years.

Interested? Apply now or send your CV

Senior Technical Lead
Stackstudio Digital Ltd.
Norwich
Hybrid
Senior
Private salary
RECENTLY POSTED

Job Title: Senior Technical Lead
Location: Norwich, Norfolk (3 Days a week)
Job Type: Contract (Inside IR35)
Duration: 6 Months

The Role

We are looking for a Senior Technical Lead who combines hands-on engineering excellence with strong leadership and stakeholder management. You will own the end-to-end technical delivery of data platforms and pipelines built in AWS, with a focus on AWS Glue, Managed Workflows for Apache Airflow (MWAA), and Python, and collaborate closely with Directors, Senior Architects, and Program Leadership to deliver business outcomes at scale.

This is a player-coach role: you will design, build, review, and optimize complex data workflows while mentoring engineers and driving engineering best practices.

Ideal for: Someone who has delivered multiple production programs in a modern AWS data engineering landscape, can communicate trade-offs clearly to senior stakeholders, and can lead teams through ambiguity to predictable, high-quality outcomes.

Your Responsibilities:

  • Lead the design and implementation of scalable, secure, and cost-efficient ETL/ELT pipelines using AWS Glue, Python (PySpark), and MWAA (Airflow).
  • Define solution architectures, data models, orchestration patterns, and CI/CD for data workflows.
  • Own the technical roadmap, decomposition, and delivery plan, including sizing, sprint planning, and risk mitigation.
  • Drive performance optimization (e.g., partitioning strategies, Glue job tuning, job bookmarks, dynamic frames vs DataFrames, retry/backoff strategies in Airflow); the sketch after this list illustrates the bookmark and DynamicFrame patterns.
  • Ensure robust observability (logging, metrics, tracing) and data quality (unit tests, Great Expectations/Deequ-style checks, validations).
  • Act as the technical point of contact for Senior Architects and Program Managers; translate business needs into technical designs and delivery milestones.
  • Present architecture decisions, trade-offs, and TCO to senior stakeholders with clarity, data, and rationale.
  • Manage vendor/partner coordination where relevant.
  • Establish coding standards, code review practices, branching strategies, and secure-by-design principles.
  • Implement DevSecOps for data: infrastructure-as-code (IaC), secrets management, environment promotion, and automated testing.
  • Ensure compliance with data governance, security, and regulatory requirements (e.g., PII/PCI, encryption, auditability, lineage).
  • Mentor and upskill engineers; foster a culture of learning, ownership, and continuous improvement.
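
To make two of the levers above concrete (job bookmarks and the DynamicFrame/DataFrame hop), here is a hedged Glue PySpark sketch. It assumes bookmarks are enabled on the job definition via --job-bookmark-option, and all database, table, and bucket names are hypothetical.

    import sys

    from awsglue.context import GlueContext
    from awsglue.dynamicframe import DynamicFrame
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # transformation_ctx gives the bookmark a stable key, so reruns only
    # pick up new data (assuming bookmarks are enabled on the job).
    incremental = glue_context.create_dynamic_frame.from_catalog(
        database="raw_zone",
        table_name="events",
        transformation_ctx="read_events",   # bookmark key
    )

    # Drop to a Spark DataFrame for SQL-style transforms, which are often
    # easier to tune (e.g., repartitioning before a wide shuffle).
    df = incremental.toDF().repartition("event_date")

    # Back to a DynamicFrame for the Glue sink.
    out = DynamicFrame.fromDF(df, glue_context, "events_out")
    glue_context.write_dynamic_frame.from_options(
        frame=out,
        connection_type="s3",
        connection_options={
            "path": "s3://example-curated/events/",   # hypothetical bucket
            "partitionKeys": ["event_date"],
        },
        format="parquet",
        transformation_ctx="write_events",
    )

    # job.commit() persists bookmark state for the next run.
    job.commit()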

Your Profile

Essential skills/knowledge/experience:

  • 10+ years of total experience in software/data engineering, with 5+ years leading delivery of production solutions in an AWS data engineering environment.

Advanced hands-on expertise with:

  • Python (including PySpark & data engineering patterns)
  • AWS Glue (Jobs, Crawlers, Glue Studio, Glue Catalog, PySpark, Job bookmarks)
  • MWAA (Apache Airflow): DAG design, scheduling, sensors, retries, XComs, task isolation, and best practices (see the DAG sketch below)
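
A minimal MWAA DAG sketch of the retry/backoff and scheduling patterns listed above follows. The DAG id and Glue job name are invented, and GlueJobOperator is assumed from the apache-airflow-providers-amazon package (its import path has moved between provider versions).

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

    default_args = {
        "retries": 3,                          # retry transient failures
        "retry_delay": timedelta(minutes=5),   # base delay between retries
        "retry_exponential_backoff": True,     # 5m, 10m, 20m, ...
    }

    with DAG(
        dag_id="nightly_curation",             # hypothetical DAG id
        start_date=datetime(2025, 1, 1),
        schedule_interval="0 2 * * *",         # 02:00 daily
        catchup=False,
        default_args=default_args,
    ) as dag:
        run_glue_job = GlueJobOperator(
            task_id="run_curation_job",
            job_name="curate-events",          # hypothetical Glue job
            wait_for_completion=True,          # task fails if the job fails
        )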

Strong across broader AWS services:

  • S3, Lambda, Step Functions, IAM, CloudWatch, KMS, Secrets Manager, Athena, EMR (nice to have), Redshift (nice to have)
  • Proven experience delivering multiple end-to-end programs (architecture, build, test, deploy, operate) with measurable outcomes (SLAs, cost targets, performance).
  • Excellent stakeholder communication and executive presence; able to engage Directors, Senior Architects, and Program Leadership.
  • Solid grounding in data modeling, data governance, security/compliance, and cost optimization on AWS.
  • Experience with CI/CD (e.g., CodePipeline/GitHub Actions/Bitbucket Pipelines), IaC (CloudFormation/Terraform), and containerization (Docker).
  • Architectural thinking: designs for scale, reliability, cost, and evolvability.
  • Delivery excellence: breaks down complex work, sets milestones, manages risks, and delivers on time.
  • Communication & influence: distills complexity for senior stakeholders; backs decisions with data.
  • Hands-on leadership: sets the technical bar through reviews, pairing, and exemplars.
  • Ownership & clarity: aligns teams on problem statements, success criteria, and measurable outcomes.

Languages:

  • Python (PySpark), SQL

AWS:

  • Glue, MWAA (Airflow), S3, IAM, KMS, CloudWatch, Lambda, Step Functions, Athena, Redshift (nice), EMR (nice)

DevOps:

  • Git, CI/CD (CodePipeline/GitHub Actions), Terraform/CloudFormation, Docker

Data Quality/Observability:

  • Great Expectations/Deequ (nice), OpenLineage (nice)
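
As a framework-agnostic illustration of such checks, the sketch below expresses Great Expectations/Deequ-style expectations as plain PySpark assertions, so it carries no framework dependency. Column names, thresholds, and the S3 path are invented for the example.

    from pyspark.sql import DataFrame, SparkSession
    from pyspark.sql import functions as F

    def validate(df: DataFrame) -> None:
        """Fail fast if the curated frame violates basic expectations."""
        total = df.count()

        # Expectation 1: the primary key is never null.
        null_keys = df.filter(F.col("event_id").isNull()).count()
        assert null_keys == 0, f"{null_keys} rows have a null event_id"

        # Expectation 2: the primary key is unique.
        distinct = df.select("event_id").distinct().count()
        assert distinct == total, f"{total - distinct} duplicate event_ids"

        # Expectation 3: amounts fall in a plausible range.
        bad = df.filter((F.col("amount") < 0) | (F.col("amount") > 1e9)).count()
        assert bad == 0, f"{bad} rows have out-of-range amounts"

    if __name__ == "__main__":
        spark = SparkSession.builder.getOrCreate()
        validate(spark.read.parquet("s3://example-curated/events/"))  # hypothetical path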

Desirable skills/knowledge/experience:

  • Domain experience in BFSI (risk, pricing, regulatory reporting, underwriting, fraud, payments, or actuarial data).
  • Experience with event-driven and near-real-time pipelines (Kafka/Kinesis, streaming ETL).
  • Knowledge of data quality frameworks (Great Expectations, Deequ) and data lineage/catalog (Atlas, Alation, Collibra).
  • Exposure to Databricks or EMR for advanced Spark workloads.
  • Certifications: AWS Solutions Architect / Data Analytics / DevOps Engineer.
  • Prior experience leading multi-team programs with offshore/nearshore models.
Snowflake Data Architect
Hirexa Solutions UK
Hemel Hempstead
Remote or hybrid
Senior - Leader
£400/day
RECENTLY POSTED

Experience

12+ years of experience in Data Engineering, Data Warehousing, Cloud Data Platforms, and Enterprise Analytics solutions, with strong expertise in modern cloud data architectures.

Job Summary

We are seeking an experienced Data Architect with strong expertise in Snowflake on Amazon Web Services and DBT to design, architect, and optimize scalable enterprise data platforms.

The role involves defining data platform architecture, governance standards, and scalable data transformation frameworks, while ensuring high performance, security, and cost efficiency. The architect will provide technical leadership to data engineering and analytics teams and ensure the platform supports enterprise reporting, advanced analytics, and AI/ML initiatives.

The ideal candidate should also have exposure to AI/ML data platforms and experience in the hospitality domain, supporting systems such as reservations, guest management, and operational analytics.

Key Responsibilities

  • Define and lead the architecture and design of enterprise data platforms using Snowflake on AWS.
  • Architect scalable data ingestion frameworks for integrating multiple source systems into the cloud data platform.
  • Design and govern data transformation frameworks using DBT.
  • Define and enforce data modelling standards including dimensional modelling, star schema, and enterprise data models.
  • Lead architecture reviews and solution design discussions for new data initiatives.
  • Optimize Snowflake performance, workload management, and cost governance.
  • Establish data governance frameworks including access control, data security, and compliance standards.
  • Design and support AI/ML-ready data architecture for advanced analytics and predictive modelling.
  • Provide architectural guidance to data engineering, BI, and analytics teams.
  • Design architecture to support data consumption for reporting systems, operational applications, and analytics platforms.
  • Implement automation, orchestration, and scalable pipeline frameworks using tools such as Apache Airflow.
  • Collaborate with business stakeholders and technical teams to align the data platform with enterprise data strategy.
  • Support hospitality analytics use cases, including guest behaviour analysis, booking trends, revenue analytics, and operational reporting.

Required Technical Skills

Data Platform

  • Strong expertise in Snowflake.
  • Deep knowledge of Snowflake architecture, performance tuning, data sharing, security, and workload optimization.
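
As one concrete entry point to the workload tuning mentioned above, this hedged sketch pulls the slowest recent queries from Snowflake's ACCOUNT_USAGE views via the Python connector. The account, credential, and warehouse values are placeholders; in practice they would come from a secrets manager, not source code.

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",   # placeholder
        user="example_user",         # placeholder
        password="...",              # placeholder: use a secrets manager
        warehouse="ANALYTICS_WH",    # hypothetical warehouse
    )

    try:
        cur = conn.cursor()
        # Surface the slowest recent queries from the account usage views.
        cur.execute(
            """
            SELECT query_id, total_elapsed_time, bytes_scanned
            FROM snowflake.account_usage.query_history
            ORDER BY total_elapsed_time DESC
            LIMIT 10
            """
        )
        for query_id, elapsed_ms, scanned in cur.fetchall():
            print(query_id, elapsed_ms, scanned)
    finally:
        conn.close()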

Cloud Platform

Strong experience with Amazon Web Services, including:

  • S3
  • IAM
  • AWS Glue
  • Lambda
  • CloudWatch

Data Transformation

  • Strong experience with DBT for enterprise-scale data modelling, testing, and transformation pipelines.

Programming / Query

  • Strong expertise in SQL for data transformation and performance optimization
  • Python (preferred) for automation and data engineering tasks.

Data Engineering

  • Enterprise ETL / ELT pipeline architecture
  • Data warehousing and enterprise data modelling
  • Dimensional modelling (Star Schema, Snowflake Schema)
  • Data pipeline scalability and reliability design.

AI / Data Science Exposure

  • Experience supporting AI/ML data pipelines and data preparation for machine learning models.
  • Understanding of predictive analytics, recommendation engines, and customer behaviour analytics.
  • Ability to design AI-ready data platforms for future analytics use cases.

Preferred Skills

  • Experience with Apache Airflow for pipeline orchestration.
  • Knowledge of CI/CD pipelines, DevOps, and Git-based development workflows.
  • Experience with data governance, metadata management, and enterprise data catalog tools.
  • Experience with BI tools such as Tableau and Microsoft Power BI.
  • Domain experience in hospitality, travel, or hotel systems, including reservation systems, guest analytics, and operational reporting.

Education

Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, Data Science, or a related field.

Data Architect
Stackstudio Digital Ltd.
UK
Hybrid
Senior - Leader
Private salary
RECENTLY POSTED

Job Description

Role Details

  • Job Title: Data Architect
  • Location: Heathrow (2 to 3 days)
  • Duration of the Assignment: 6 Months

Job Purpose and Primary Objectives

  • Design and evolve data architecture aligned with business, analytical, and engineering needs.
  • Lead end to end architecture for cloud native data solutions on AWS.
  • Define standards, patterns, and best practices for data modelling, data flows, and platform usage.
  • Develop scalable Snowflake based data architectures including warehouses, databases, schemas, roles, and secure data sharing.

Key Responsibilities

  • Architect and optimize ETL/ELT pipelines using services such as:
    • DBT, AWS Glue, Lambda, Kinesis, S3, EMR
  • Design high performance Snowflake data models (3NF, dimensional, Data Vault, etc.)
  • Optimize query performance, clustering, micro-partitioning, and resource consumption
  • Work closely with Data Engineers, Analysts, Product Managers, and Business Stakeholders
  • Provide architectural guidance and mentorship across data and engineering teams
  • Translate business needs into scalable data solutions

Key Skills / Knowledge

  • Strong grasp of data modelling, data warehousing concepts and performance optimization techniques
  • Hands on expertise with Snowflake (architecture, performance tuning, security, Snowpipe, streams/tasks)
  • Strong experience with AWS cloud data ecosystem, ideally including: S3, Glue, Lambda, Redshift, EMR, Kinesis, IAM, CloudWatch
  • Strong SQL skills and proficiency in Python, DBT
  • Understanding of data governance frameworks (experience with tools such as Collibra or Alation is a plus)
  • Hands-on exposure to cloud platforms, especially AWS
  • Experience working in agile teams

Experience Required

  • 6+ years in Data Architecture, Data Engineering, and related fields, with a remit similar to the one described above
  • 2+ years’ experience working as an independent contractor
Data Consultant (SC Cleared)
Syntax Consultancy Ltd
Corsham
Hybrid
Mid - Senior
£450/day
RECENTLY POSTED

Corsham (Wiltshire)
6 Month Contract
£450/day (Outside IR35)

Data Consultant needed with active SC Security Clearance and strong Power BI, Tableau and AWS Cloud experience.

6 Month Rolling Contract based in Corsham (Wiltshire). Start asap in Spring 2026.

Hybrid Working - 3 days/week remote (WFH) + 2 days/week working from the office in Corsham (Wiltshire).

A chance to work with a leading global IT + Digital transformation business specialising in large-scale Government projects:

  • Design, build + implementation of an AWS cloud-based Data Analytics platform to provide data workflow, data visualisation + dashboard solutions.
  • Key Skills: Power BI and Tableau dashboard + data visualisation tools.
  • AWS Cloud Infrastructure environments including: AWS Glue, S3 Buckets, Athena + Serverless data pipelines (see the Athena sketch after this list).
  • Python / R-style workflow development, SQL querying + data transformation skills.
  • Hands-on experience with secure, cloud-based Data Analytics platforms.
  • Supporting definition of the POC scoping, success criteria + value proposition.
  • Shaping the technical path from POC to MVP considering scalability, security, governance + reuse.
  • Supporting build, configuration + implementation activities alongside developers + data engineers.
  • Ability to simplify complexity + communicate technical trade-offs clearly to key stakeholders.
  • In-depth experience in Data Engineering, Data Analytics platforms + Data Science.
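
As a small illustration of the serverless Athena querying this stack implies, the sketch below runs a query from Python with boto3 and polls for the result. The database, query, bucket names, and region are hypothetical.

    import time

    import boto3

    athena = boto3.client("athena", region_name="eu-west-2")

    # Kick off the query; results land in the given S3 output location.
    execution = athena.start_query_execution(
        QueryString="SELECT status, COUNT(*) FROM requests GROUP BY status",
        QueryExecutionContext={"Database": "analytics"},  # hypothetical
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = execution["QueryExecutionId"]

    # Poll until Athena finishes, then fetch the result rows.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        for row in rows:
            print([col.get("VarCharValue") for col in row["Data"]])
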
Data Architect
Damia Group Ltd
London
Hybrid
Senior - Leader
£70,000 - £90,000
RECENTLY POSTED

Data Architect - 70-90K base (DOE) - West London (hybrid)

We are recruiting a Data Architect for one of our clients based in West London on a permanent basis. The Data Architect is responsible for leading the definition, standardization, and governance of data architecture across platforms and products. This role balances technical leadership, data architecture, and collaboration with engineering, product, and security teams to ensure scalable, reliable, and secure systems.

Key responsibilities:

  • Enforce data architectural guidelines and consistency across development teams and services
  • Support established Data Governance and Data Quality frameworks, including tooling, policy enforcement, and stewardship models
  • Ensure robust metadata management, lineage tracking, and data cataloguing using business glossaries and modern catalog tools
  • Review and approve data architecture for major features, platforms, and technical initiatives
  • Collaborate with technical leads and DevOps on system scalability, performance, and reliability
  • Ensure data platforms are AI/ML-ready, with scalable infrastructure and clean, well-structured data pipelines
  • Collaborate with data science and analytics teams to enable model deployment, automation, and MLOps best practices
  • Promote innovation in generative AI, predictive analytics, and real-time decision support
  • Align data architecture with security, compliance, and data governance requirements
  • Lead the evolution of technical architecture documentation, models, and decision records
  • Conduct architecture and design reviews with cross-functional teams
  • Guide teams in the adoption of best practices in API design, modularity, cloud-native patterns, and event-driven systems
  • Recommend data management best practices, covering data flows, architecture patterns, retention, archival, and purging strategies
  • Coach and mentor engineers on data design, refactoring, and architectural reasoning

Essential skills and experience:

  • Proven experience designing and scaling enterprise-grade cloud data platforms (AWS preferred)
  • Deep experience with AWS, Databricks, Power Platform, and Redshift (Snowflake a plus)
  • Proficiency in AWS Glue, Qlik Talend, DBT, Airflow, and modern data integration tools.
  • Excellent knowledge of Python, SQL, PowerQuery (M), and preferably Scala or PySpark
  • Working knowledge of enterprise architecture frameworks (e.g., TOGAF), MLOps, and BI tools like Power BI and QuickSight
  • Experience of generative AI platforms (e.g., Amazon Bedrock, Anthropic)
  • Familiarity with infrastructure as code (Terraform), CI/CD practices (Jenkins, GitHub Actions), and observability (Grafana, Kibana)
  • Proficiency in scripting and automation using Bash, Groovy, or equivalent
  • Ability to balance long-term architectural vision with immediate delivery constraints

Damia Group Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this permanent job, you accept our Data Protection Policy which can be found on our website.

Please note that no terminology in this advert is intended to discriminate on the grounds of a person’s gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job.

Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.

The advertised salary range is dependent on experience and the required qualifications.

Data Engineer
SF Recruitment
Shropshire
Hybrid
Mid - Senior
£55,000 - £70,000
RECENTLY POSTED

Location: Shropshire, South Staffordshire, West Midlands, East Sussex, West Sussex, Surrey
Type: Permanent, Hybrid (2 days p/week onsite)
Salary: £55,000-£70,000 base D.O.E + benefits
Security clearance: Must be eligible for SC clearance

We are looking for an experienced Data Engineer to join a growing technology team delivering modern data solutions for a range of complex environments. This role involves building reliable data pipelines, integrating data from multiple sources, and supporting the development of secure, scalable data platforms. You will work closely with engineers, analysts and stakeholders to ensure data is accurately collected, transformed and made available for reporting, analytics and operational use.

Key responsibilities

  • Design and implement secure and reliable data integration solutions
  • Build and maintain data pipelines for ingestion, transformation and curation
  • Work with data from multiple systems and platforms
  • Monitor and support data pipelines to ensure reliability and performance
  • Collaborate with technical teams and stakeholders to gather requirements
  • Investigate and resolve issues affecting data services
  • Contribute to team knowledge sharing and engineering best practices
  • Work within Agile delivery teams

Skills and experience

  • Strong SQL skills and experience with data modelling
  • Experience building ETL or ELT data pipelines
  • Experience working with databases or data platforms such as Oracle or Cloudera
  • Experience with ETL tools such as Talend, Pentaho DI, Informatica, AWS Glue or SAS
  • Experience with programming or scripting languages such as Python or Bash
  • Familiarity with cloud platforms, ideally AWS
  • Understanding of data integration, transformation and orchestration
  • Experience using version control such as Git

Security clearance

Due to the nature of the work, candidates must be eligible for Security Check (SC) clearance. Applicants will normally need to have lived continuously in the UK for the past five years and hold the right to work in the UK.

So, if you’re an experienced data engineer with the desired skills and feel this could be the role for you, please apply now to be considered!

Data Engineer
SF Partners
Shrewsbury
Hybrid
Mid - Senior
£55,000 - £70,000

Location: Shropshire, South Staffordshire, West Midlands, East Sussex, West Sussex, Surrey
Type: Permanent, Hybrid (2 days p/week onsite)
Salary: £55,000-£70,000 base D.O.E + benefits
Security clearance: Must be eligible for SC clearance

We are looking for an experienced Data Engineer to join a growing technology team delivering modern data solutions for a range of complex environments. This role involves building reliable data pipelines, integrating data from multiple sources, and supporting the development of secure, scalable data platforms. You will work closely with engineers, analysts and stakeholders to ensure data is accurately collected, transformed and made available for reporting, analytics and operational use.

Key responsibilities

  • Design and implement secure and reliable data integration solutions
  • Build and maintain data pipelines for ingestion, transformation and curation
  • Work with data from multiple systems and platforms
  • Monitor and support data pipelines to ensure reliability and performance
  • Collaborate with technical teams and stakeholders to gather requirements
  • Investigate and resolve issues affecting data services
  • Contribute to team knowledge sharing and engineering best practices
  • Work within Agile delivery teams

Skills and experience

  • Strong SQL skills and experience with data modelling
  • Experience building ETL or ELT data pipelines
  • Experience working with databases or data platforms such as Oracle or Cloudera
  • Experience with ETL tools such as Talend, Pentaho DI, Informatica, AWS Glue or SAS
  • Experience with programming or scripting languages such as Python or Bash
  • Familiarity with cloud platforms, ideally AWS
  • Understanding of data integration, transformation and orchestration
  • Experience using version control such as Git

Security clearance

Due to the nature of the work, candidates must be eligible for Security Check (SC) clearance. Applicants will normally need to have lived continuously in the UK for the past five years and hold the right to work in the UK.

So, if you're an experienced data engineer with the desired skills and feel this could be the role for you, please apply now to be considered.

Frequently asked questions
What types of AWS Glue roles are listed on this job board?
Our job board features a variety of AWS Glue roles including AWS Glue Developer, ETL Engineer, Data Engineer, and Cloud Data Specialist positions that specifically require experience with AWS Glue.

What skills are commonly required for AWS Glue jobs?
Common skills for AWS Glue jobs include proficiency in ETL processes, experience with AWS Glue Studio, knowledge of AWS ecosystem services like S3, Lambda, and Athena, programming skills in Python or Scala, and an understanding of data cataloging and data transformation.

Are remote AWS Glue jobs available?
Yes, our job board includes remote, hybrid, and on-site AWS Glue job opportunities from companies worldwide to fit a variety of work preferences.

How can I improve my chances of landing an AWS Glue job?
To improve your chances, gain hands-on experience with AWS Glue, obtain relevant AWS certifications such as AWS Certified Data Analytics - Specialty, showcase your ETL and data engineering projects, and tailor your resume to highlight your AWS Glue skills.

Are AWS certifications required for AWS Glue roles?
While not always mandatory, many employers prefer candidates with AWS certifications related to data analytics or cloud architecture, as they demonstrate verified expertise with AWS services including AWS Glue.