Data Architect Jobs
Overview
Looking for top Data Architect jobs? Explore the latest data architect vacancies on Haystack, your go-to IT job board. Whether you're an experienced data architect or aspiring to design scalable data solutions, find exciting opportunities with leading companies hiring now. Start your data architect career today!
Software Engineer
JAM Recruitment Ltd
Multiple locations
In office
Mid - Senior
£500/day - £570/day
RECENTLY POSTED

DV Cleared Software Engineer (DBA / Data-Focused)

Contract: 12 months

Location: Cheltenham (5 days per week onsite, occasional travel to Gloucester)

Rate: £500 - £570 per day (Umbrella, Inside IR35)

Must hold live UKIC DV clearance

About the Role

An exciting opportunity has arisen for an experienced Software Engineer with a strong DBA background to support a growing national security programme based in Cheltenham. This role sits within a high-performing technical team delivering solutions that have real-world impact across the UK Government sector.

You’ll work on data-focused systems, contributing across the full software lifecycle while supporting and improving secure Oracle-based platforms and automation using Ansible.

Key Responsibilities

Design, develop, test, document and support software components within a secure system environment

Provide accurate estimates for development effort from given specifications

Work with a strong focus on data processing and database-driven systems

Support and develop Oracle and Ansible-based infrastructure, including automation

Analyse code defects and deliver timely, robust fixes

Required Skills & Experience

Strong background as a DBA / data-focused software engineer

Hands-on experience with Oracle and Ansible

Solid understanding of backend systems and data processing

Proven ability to troubleshoot, analyse and resolve complex technical issues

Comfortable working collaboratively within Agile teams

Desirable

Exposure to cloud or platform technologies such as AWS, Docker, microservices

Experience working in secure or regulated environments

The Ideal Candidate

You’ll be technically strong, curious, and motivated to solve complex problems. This role would suit someone who enjoys working close to the data layer, values clean, well-documented solutions, and wants to contribute to meaningful, high-impact programmes.


Lead Data Engineer - Databricks
JLA Resourcing Ltd
Basingstoke
Hybrid
Senior
£75,000
RECENTLY POSTED

Lead Data Engineer - Databricks - £75-80k + bonus + benefits - Basingstoke 3 days a week

The Opportunity:
We are looking for a Lead Data Engineer with strong Databricks experience to join a Basingstoke-based organisation that is investing heavily in its Digital Transformation Programme.

The Role:
You’ll play a proactive role in the delivery of next-generation data platforms, managing and mentoring the existing engineer and driving the design, development and governance of the data pipelines. You’ll work closely with stakeholders across the technology function and the wider business, ensuring the availability, integrity and compliance of the systems. You’ll play a key role in owning the core architecture and engineering of the new Azure Databricks ecosystem, including incorporating AI / ML capability.

They are currently working with a 3rd Party Data Partner who have recommended a number of improvements - you’ll work closely with them on selecting, implementing and managing technology, so it’s a great opportunity to really make a difference.

The Person:
Key to this is proactivity - they’re really looking for someone who is always looking at “what’s next” but is also able to deliver / engineer “the now”.

Skills / attributes to include:

  • In-depth experience of modern data solution architecture design and delivery in a hybrid cloud environment, predominantly Azure / Databricks
  • The ability to lead a small team whilst contributing personally
  • Be able to drive agile, modern engineering practices
  • Champion quality, observability and good engineering discipline, ensuring pipelines and models run cleanly and predictably

SAS Data Engineer
Deerfoot Recruitment Solutions Limited
Telford
Hybrid
Mid - Senior
£50,000 - £70,000
RECENTLY POSTED
TECH-AGNOSTIC ROLE

SAS Consultant/Data Engineer
Location: Telford or Worthing (hybrid working 2 days onsite)
Type: Full Time, Permanent
Salary: £50,000 - £70,000 DOE + comprehensive benefits package

Deerfoot Recruitment is working with a major consultancy partner on a long-term public sector engagement and is seeking experienced SAS Consultants/Data Engineers to join a growing data team.
This is a key role within a large-scale data portfolio, supporting critical programmes focused on revenue optimisation, fraud detection, data management and analytics. The successful candidate will play a hands-on role in designing, building and supporting robust SAS-based data solutions, working closely with product owners, architects, engineers and senior client stakeholders.

Key responsibilities include:

  • Designing and delivering secure, high-performance SAS solutions for data integration and analytics
  • Building and enhancing data pipelines covering ingestion, transformation, reporting and fraud detection
  • Supporting live services, incident resolution and continuous improvement
  • Collaborating in Agile delivery teams and contributing to engineering best practice

Key skills and experience:

  • Minimum 5 years’ experience as a data engineer or in a similar role
  • Strong background as a Data Engineer on complex, large-scale data platforms
  • Proven expertise with SAS 9.x
  • SAS Viya (3.x/4) experience a bonus
  • Solid ETL, data modelling and batch scheduling experience
  • Understanding of performance optimisation, CI/CD and scalable solution design
  • Confident client-facing and consultancy skills

Security Clearance:
This role requires SC clearance, or eligibility to obtain it. Applicants must have lived in the UK continuously for the past 5 years. Some restrictions may apply based on nationality and residency.
This is an excellent opportunity to work on high-impact public sector systems within a collaborative, technically strong environment.

Apply today to find out more.

SAS Consultant/Senior SAS Consultant/SAS Developer/SAS Data Engineer/SAS Programmer/Lead SAS Programmer/SAS Analytics Consultant/SAS Technical Consultant/SAS Solutions Consultant/Data Engineer/Senior Data Engineer/Analytics Engineer/Data Platform Engineer/Data Integration Engineer/Data Pipeline Engineer/Data Solutions Engineer/Enterprise Data Engineer

Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn’t right for you, explore our referral reward program with payouts at interview and placement milestones. Visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.

Data Engineer
Birchwell Associates Ltd
Wallingford
Hybrid
Mid - Senior
£50,000
RECENTLY POSTED

Birchwell Associates is recruiting on behalf of a client seeking an experienced Data Warehouse Specialist to join a growing data function based in Benson. This role requires onsite presence at least three days per week.

Reporting to senior leadership, the successful candidate will take ownership of an existing data warehouse while leading its evolution, including the design and delivery of a new architecture and the structured management of its migration. Working closely with technical and non-technical stakeholders, you will deliver reliable, high-quality data solutions that support informed decision-making across the business.

Key Responsibilities

  • Manage, optimise, and enhance the current data warehouse, ensuring strong data quality, performance, and governance.
  • Design and implement a scalable data warehouse architecture, integrating multiple internal and external data sources via APIs and connectors.
  • Maintain data integrity, security, and consistency across all reporting and analytics environments.
  • Identify and resolve data model and performance issues, producing clear technical documentation.
  • Collaborate with business stakeholders to translate requirements into effective data solutions.
  • Deliver data initiatives to agreed timelines and standards.
  • Support additional data-related projects as required.

Key Requirements

  • At least two years’ experience in data warehouse or database architecture roles, including semantic model or cube development.
  • Strong SQL skills with proven experience in data modelling and transformation.
  • Ability to analyse complex data and clearly communicate technical concepts to non-technical audiences.
  • Experience with Microsoft data platforms, including Azure Fabric.
  • Knowledge of DAX, Jet Analytics, or NAV is beneficial but not essential.

Data and Insights Manager
Erin Associates
Blackpool
Hybrid
Mid - Senior
£40,000 - £45,000
RECENTLY POSTED

Location: Lytham St Annes, Lancashire
Salary: £45k + BUPA Private Healthcare, Pension, Life assurance, Bonus etc

We are working with a leading organisation in Lytham St Annes who are expanding their dynamic Digital team. They are now seeking a Data and Insights Manager to join their IT department. In this role, you will be responsible for ensuring the business maintains a highly secure and unified data ecosystem. The role bridges data management and insights to deliver impactful customer experiences.

Core Responsibilities for this Data and Insights Manager role

Ensuring a robust method of customer data collection across all channels.
Overseeing customer data flow between business systems.
Providing data cleansing and conducting analysis to identify trends and insights.
Producing reports for CRM and marketing teams and helping to develop strategies for segmentation for a personalised approach to marketing activity.
Working with the tech team to optimise customer data integration.
Developing reports/dashboards to keep stakeholders up to date with customer KPIs.
Core Experience for this Data and Insights Manager Role

Strong experience working in a data management/analytics role.
Knowledge of retail-focused CRM and CDP systems.
Strong experience of data integration and hands-on experience with analytics and BI platforms.
Strong SQL knowledge and CRM segmentation and modelling strategies.
Strong knowledge of data legislation and privacy regulations.
Ideally experience with Bloomreach CDP, AB testing strategies and eCommerce platforms.
This is a fantastic opportunity to join a growing and collaborative Digital team, where you will work on impactful CRM initiatives within a supportive environment. This is an office-based role in Lytham St Annes, with occasional opportunity to work from home.

Keywords: Data & Insights Manager, Customer Data, CRM, Analytics, Excel, SQL, Lytham St Annes, Blackpool, Preston, Fleetwood, Lancashire, IT Change

To apply, please send your CV to Alex or call (phone number removed) for more details.

Erin Associates welcomes applications from people of all ethnicities, genders, sexual orientations, and disabilities. Please inform us if you require any reasonable adjustments at any stage of the application process.

Azure Data Architect
ARC IT Recruitment
Brighton
Hybrid
Mid - Senior
£80,000 - £90,000
RECENTLY POSTED

Brighton, East Sussex, £80 - £90k

An Azure Data Architect is required by our client, a fast-growing, technically advanced and multi-award-winning company going through an extended period of growth.

Key responsibilities:

  • Design, develop, and maintain data architectures using Azure Databricks and Synapse.
  • Collaborate with cross-functional teams to understand data requirements and translate them into robust architectural solutions.
  • Optimize data workflows and ensure seamless integration with various data sources.
  • Implement data governance and security best practices.
  • Provide technical leadership and guidance to the development team.
  • Conduct performance tuning and optimization of data processes.

Skills required:

  • Proven experience as a Data Architect working within Azure.
  • Expertise in Azure Databricks and Azure Synapse.
  • Strong understanding of data modelling, ETL processes, and data warehousing concepts.
  • Proficiency in SQL, Python, and other relevant programming languages.
  • Experience with data governance, data quality, and data security best practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and stakeholder management skills.

Worthing-based hybrid opportunity, easily commutable from Portsmouth (Hampshire), Guildford (Surrey) or Brighton (East Sussex).
Azure, Synapse, Databricks
Brighton, Hybrid (3 days in the office)

Configuration Engineer
Rullion Limited
Bridgwater
In office
Junior - Mid
£220/day - £300/day

Role: Configuration Engineer
Position: Contract
Location: Hinkley Point C and SDC, Somerset
Duration: 12 Months Rolling
Rate: Circa £220 p/d PAYE + 36 days annual leave // Circa £300 p/d Umbrella

Job Purpose / Overview

As a Configuration Engineer, you will be part of a growing multidisciplinary team responsible for delivering Work Management Support and maintaining the digital As-Built configuration required to build, commission, and operate Hinkley Point C Power Station.

This role focuses on data extraction, validation, and assembly into datasets aligned with business rules and ready for submission to Asset Suite 9 (HPC’s chosen Enterprise Asset Management (EAM) system). You will support the population of the Project’s Master Equipment List and ensure accurate attribute data within HPC’s EAM tool.

Principal Accountabilities

  • Population of the Equipment module in Asset Suite 9 with accurate asset identifiers and attributes.
  • Performing data quality assurance for equipment installation and configuration references.
  • Maintaining the asset/system schedules and resolving data anomalies.
  • Producing weekly performance reports for review by the line manager and for upward reporting.
  • Supporting the digital configuration through work management processes.
  • Collaboration with Construction Contract Partners, Completions, and Handover teams to ensure consistent data across platforms.

Essential Skills:

  • Strong experience in asset data analysis and validation.
  • Proficiency in Microsoft Excel, Word, and Power BI.
  • Ability to work independently and manage data integrity.
  • Experience with SAP, EDRMS or other CMMS systems.

Desirable Skills:

  • Familiarity with Asset Suite/Passport or other EAM tools.
  • Background/experience in engineering disciplines or interpreting engineering drawings.
  • Previous experience of working in a construction, completions, and/or data management related industry.

Rullion celebrates and supports diversity and is committed to ensuring equal opportunities for both employees and applicants.

Data Engineer (Automation)
Network IT
Milton Keynes
Hybrid
Mid - Senior
£55,000
+1

Role: Data Engineer (Automation)

Location: Milton Keynes (Hybrid 3 Days In-Office Weekly)

Salary: £45,000 - £55,000

Network IT are supporting a large, enterprise-scale organisation as they continue to evolve and modernise their Data Analytics and Automation Platform; we’re looking for an experienced Data Engineer to design, build, and optimise secure, automated data pipelines that enable scalable analytics, business intelligence, and data-driven decision-making across multiple business units.

This is a highly technical role with strong exposure to cloud and on-prem data platforms, automation, and emerging AI-driven capabilities, and is specifically suited to candidates with strong, hands-on data engineering experience, particularly in building, operating, and optimising complex data pipelines, data models, and integration workflows at scale.

Role Overview and Responsibilities

As a Data & Automation Engineer, you will be responsible for the end-to-end delivery, operation, and continuous improvement of enterprise data pipelines and analytics platforms. You’ll work closely with architects, application managers, and international teams to ensure data solutions are reliable, scalable, and aligned with data governance standards.

Key responsibilities include:

Designing, developing, and maintaining automated end-to-end data pipelines across cloud and on-premise source systems.

Delivering reliable data ingestion, transformation, and delivery processes using technologies such as Azure Data Factory, Databricks, SSIS, and SQL.

Reducing manual interventions through automation and standardisation, including data preparation, feature engineering, and training-data pipelines.

Preparing data models and datasets (DWH / Lakehouse) to support business intelligence, analytics, and operational reporting.

Monitoring and supporting live data pipelines, resolving issues in line with ITIL best practices, and implementing proactive alerting and self-healing mechanisms.

Identifying and implementing performance optimisations across data pipelines, queries, and reporting workloads.

Supporting data governance processes, including data archiving, masking, encryption, and versioning, with opportunities to integrate AI-driven automation.

Contributing to CI/CD processes for data pipelines, ensuring releases are tested, compliant, and deployed with minimal operational impact.

Collaborating with cross-functional and international teams to support aligned, scalable data operations.

Supporting the ongoing evolution and execution of the organisation’s data strategy.

Essential Skills and Experience

To be successful in this role, you will bring:

Strong commercial experience delivering end-to-end data engineering and automation solutions in complex environments.

Advanced SQL skills, including performance tuning and optimisation across large datasets.

Knowledge of Python or R for data processing or analytics.

Hands-on experience with data integration and ingestion tools such as Azure Data Factory and Databricks.

Proven experience in data modelling, data warehousing, and relational database platforms (e.g. MS SQL Server).

Experience designing cloud-native and/or on-prem data solutions.

Exposure to automation and AI-enabled data processes, with awareness of LLM use cases within data workflows.

Experience working in Agile delivery environments (Scrum, Kanban, DevOps).

Strong analytical and problem-solving skills, with the ability to translate complex data into meaningful insights.

Excellent communication skills and the ability to work effectively with both technical and non-technical stakeholders.

Desirable Experience

Experience coordinating or supporting AI / ML initiatives within data platforms.

Experience working within large, regulated, or international organisations.


Information Manager
Lanesra Technical Recruitment Limited
Glasgow
Hybrid
Mid - Senior
£60,000
TECH-AGNOSTIC ROLE

Position: Information Manager

Location: Glasgow

Salary Guide: £60,000 Plus Car or Allowance, Bonus and Excellent Package

Our client is a Tier 1 Design & Build Engineering Contractor operating predominantly in the water industry. They are delivering a number of water and wastewater non-infrastructure projects for Scottish Water, and they have a new vacancy within their Systems Management and Digital Delivery team for an Information Manager, based from their offices in Glasgow with hybrid working available.

You will report directly to the Head of Strategy and PMO and your role will cover the management of the digital delivery requirements and support project teams’ understanding of delivery in accordance with ISO 19650. Responsibilities will include:

  • Assessment of existing information systems in use, identification of gaps and management of their restructuring where relevant (e.g. SharePoint setup, use of Master Information Delivery Plans in Power BI, ProjectWise, etc.), along with the introduction of new information systems.
  • Prepare and present reliable data for monthly Operational, JV Partner, and Strategic Board reports.
  • Ensure alignment with client reporting tools to maintain a single version of the truth.
  • Plan and co-ordinate the transition of information from existing to new information systems as necessary (e.g. data migration for historic projects where relevant).
  • Execute the responsibilities of the Information Management Function as described in ISO 19650 - primarily ensure compliance with the Client’s requirements and maintenance of contracted deliverables (Project BEPs and MIDPs).
  • Work with the relevant individuals to establish and support the Common Data Environment (CDE).
  • Support the region Document Controllers and provide support to the Quality Assurance Lead in carrying out Quality Assurance / Quality Control Audits and Continuous Improvement Systems.
  • Maintain a good working relationship with the Engineering Systems team, contributing towards lessons learned for the ProjectWise data source.
  • Management of asset information utilising databases and discipline models where applicable.
  • Engage with clients and suppliers to identify, explore, and challenge their Information Management requirements.
  • Report on project compliance in line with Client stage gate delivery processes.
  • Establish and maintain a system of reports and metrics to monitor project success.
  • Interface with all project stakeholders to ensure that data is exchanged effectively and in formats that support their onward purpose.
  • Support supply chain members to ensure all parties work to the Project BEP.
  • Provide Information Management technical support and direction to all Project Team members.
  • Work to maximize the benefits of improved information management in Health and Safety, Sustainability, Operations and Maintenance etc.
  • Champion digital delivery on the project and promote, articulate, and lead digital behaviours.
  • Coordinate and facilitate information management best practice within the allocated projects.

Skills, Qualifications & Experience:

  • Experience in the role of Project Information Manager for a framework or large-scale project
  • Experience using and managing Common Data Environments such as Power BI, ProjectWise, BIM360, Viewpoint (4Projects), Aconex or similar.
  • Experience with 3D review tools such as Autodesk Navisworks or Bentley iTwin is desirable.
  • A certified postgraduate information management qualification desirable
  • Membership of a professional institution desirable
  • Individual BIM certification (ISO19650 or BS1192) with recognised body e.g., ICE, BSi, BRE, etc. an advantage

Data Engineer
MBDA UK
Manchester
Hybrid
Mid - Senior
£45,000 - £55,000
+2

Bolton

As a data engineer specialising in generative AI, this role will see you working in a developing international and transversal structure. You will be responsible for evaluating, building and maintaining data sets for internal customers, ensuring they remain maintainable.
Salary: Circa £45,000 - £55,000 depending on experience
Dynamic (hybrid) working: 2-3 days per week on-site due to workload classification
Security Clearance: British Citizen
Restrictions and/or limitations relating to nationality and/or rights to work may apply. As a minimum and after offer stage, all successful candidates will need to undergo HMG Basic Personnel Security Standard checks (BPSS), which are managed by the MBDA Personnel Security Team.
What we can offer you:

  • Company bonus: Up to £2,500 (based on company performance and will vary year to year)
  • Pension: maximum total (employer and employee) contribution of up to 14%
  • Overtime: opportunity for paid overtime
  • Flexi Leave: Up to 15 additional days
  • Flexible working: We welcome applicants who are looking for flexible working arrangements
  • Enhanced parental leave: offers up to 26 weeks for maternity, adoption and shared parental leave -enhancements are available for paternity leave, neonatal leave and fertility testing and treatments
  • Facilities: Fantastic site facilities including subsidised meals, free car parking and much more

The opportunity:
The MBDA IM GenAI delivery Office department is looking for an experienced data engineer able to evaluate, design, deploy, improve and support MBDA data sets.

You will ensure MBDA data pipelines are designed to be resilient, secure and responsive. You will use your data engineering expertise to collaborate with different internal customers regarding their data, ensuring it is optimised and secured for their needs.

You will provide your knowledge in data management and data quality to guarantee compliance with MBDA data governance. A key part of this role is keeping up to date with new technology: you will provide insight on our technology roadmap and deliver cutting-edge solutions to our internal customers.
What we’re looking for from you:

  • SQL technologies skills (e.g. MS SQL, Oracle )
  • noSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4J )
  • Data exchange and processing skills (e.g. ETL, ESB, API )
  • Development (e.g. Python) skills
  • Big data technologies knowledge (e.g. Hadoop stack)
  • Knowledge in NLP (Natural Language Processing)
  • Knowledge in OCR (Optical Character Recognition)
  • Knowledge in Generative AI (Artificial Intelligence) would be advantageous
  • Experience in containerisation technologies (e.g. Docker) would be advantageous
  • Knowledge in the industrial and / or defence sector would be advantageous

Our company: Peace is not a given, Freedom is not a given, Sovereignty is not a given

MBDA is a leading defence organisation. We are proud of the role we play in supporting the Armed Forces who protect our nations. We partner with governments to work together towards a common goal, defending our freedom.

We are proud of our employee-led networks; examples include Gender Equality, Pride, Menopause Matters, Parents and Carers, Armed Forces, Ethnic Diversity, Neurodiversity and more.

We recognise that everyone is unique, and we encourage you to speak to us should you require any advice, support or adjustments throughout our recruitment process.

Follow us on LinkedIn (MBDA), X and Instagram (MBDA_UK) and Glassdoor, or visit our MBDA Careers website for more information.

Database Engineer
Grundon
Wallingford
Hybrid
Junior - Mid
Private salary

At Grundon, we are on the lookout for a dynamic, passionate, and driven individual to join our Data team based in Benson.

Reporting to the Head of IT / Finance Director, you will play a key role in driving our mission forward, managing and optimising the organisation’s data warehouse to ensure high performance, reliability, and accessibility. This role will progressively encompass the design and development of a new data warehouse, along with the structured oversight of its migration process. Working closely within a cross-functional team, you will help translate business requirements into effective data solutions that drive informed, data-driven decision-making across the organisation.

Please note that, as part of this role, you will be required to attend the office a minimum of 3 days each week.

What will you do

  • Take ownership of the current Data Warehouse, ensuring its ongoing optimisation and development while maintaining best practices in data mapping, coding standards, and data quality.
  • Design, develop, and maintain a scalable data warehouse architecture, integrating data from multiple internal and external sources using connectors and APIs.
  • Safeguard data integrity, accuracy, consistency, and security across all warehouse and reporting environments.
  • Diagnose and resolve data model performance issues, maintaining comprehensive documentation of data flows and reporting structures.
  • Collaborate with Business Analysts and stakeholders to identify needs, integrate new data sources, and support evolving business requirements.
  • Oversee project deliverables and timelines, ensuring the timely completion of all data-related initiatives.
  • Ensure full compliance with all Company policies and procedures including health and safety and employment.
  • Any other duties, such as ad hoc projects, as requested by the job holder’s Manager/Supervisor or the Board of Directors that are within the skills and capabilities of the job holder.

About You

  • Minimum 2 years’ experience in database architecture and management, including semantic cube development.
  • Proficient in SQL, data modelling, and transforming raw data into structured formats.
  • Strong analytical and communication skills, with the ability to interpret complex data and translate business needs into technical solutions.
  • Familiarity with Azure Fabric; experience with DAX, Jet Analytics, or NAV is advantageous.

About Grundon

Grundon is the UK’s largest family-owned supplier of integrated waste management and environmental services. Founded in 1929, we have developed a distinctive approach that has helped us to maintain a leading position within the waste industry. This approach is underpinned by our commitment to quality of service, innovation and technical progress, together with a genuine and demonstrable concern for the environment.



Data & Integration Solution Architect
Agincare Group
Portland
Hybrid
Senior - Leader
£40,000

Package Description:

At Agincare, we believe technology should enable exceptional care. As we continue our exciting journey of growth and digital transformation, we’re investing in our systems, data, and infrastructure to support our expanding services across the UK.

Were now looking for a Data & Integration Solution Architect to play a pivotal role in shaping our evolving digital landscape.

This is a newly created role within our IT team a rare opportunity to build, influence, and future-proof our data and integration ecosystem from the ground up. With scope to lead a small team in the future, this role offers real impact, visibility, and progression.

Why join us?

As Agincare's data and integration specialist, you'll be at the heart of our digital transformation. You'll design and deploy integration solutions, build robust data pipelines, and support the development of Power BI reporting that helps our teams make smarter, faster decisions.

You'll work closely with stakeholders across the business, influence technical direction, and help create a future-proof architecture that supports our ambition. And as the role grows, there's potential to lead a small team, giving you the chance to shape not just our systems, but our capability.

What You'll Be Doing

Integration Solutions
Design and implement robust integration solutions (APIs, ETL, microservices, ESBs), developing reusable patterns and frameworks to ensure secure, reliable, and scalable data exchange across enterprise systems.

Middleware & Platforms
Select, configure, and manage middleware/iPaaS platforms, optimising performance, availability, and security with strong monitoring and error-handling practices.

Data Architecture & Governance
Define data architectures, schemas, and ETL pipelines, embedding governance, quality, security, and lifecycle management to support both operational and analytical needs.

Power BI & Analytics
Collaborate with business teams to deliver high-quality Power BI dashboards and reports, designing optimised datasets that enable accurate, timely, data-driven decisions.

Stakeholder Engagement & Standards
Work with internal teams and vendors to align integration strategy with business goals, clearly communicating technical solutions. Produce robust documentation and establish best-practice standards for integration and data management.

Future Leadership
Mentor junior colleagues, with scope to line manage and grow a small team as the function develops.

What We're Looking For

  • Proven experience designing and implementing integration solutions (APIs, ETL, microservices, ESBs) in complex enterprise environments.
  • Hands-on expertise with middleware or iPaaS platforms.
  • Strong programming skills (Java, C#, Python or similar), with solid understanding of JSON, XML, REST, and SOAP.
  • Experience architecting and maintaining data pipelines, ETL processes, and data warehouse/schema design.
  • Skilled in Power BI (dataset design, modelling, report/dashboard delivery).
  • Strong understanding of data governance, security, compliance, and performance tuning.
  • Excellent communication skills, able to engage confidently with both technical and business stakeholders.

About Agincare

We're a family-run business that's been caring for and supporting people since 1986. With over 4,500 team members, we're one of the UK's largest care providers and are continuing to grow. We have over 100 locations across England including care & nursing homes, home care branches, extra care schemes, supported living properties and live-in offices.

Agincare are signatories of the Care Leaver Covenant and are committed to supporting care leavers to live independently. We are proud to be able to offer a guaranteed interview to care leavers, or an informal conversation about our career opportunities.

All of our care services are regulated by the Care Quality Commission (CQC).

Equal opportunities are important to us at Agincare and we welcome applications from all.


Lead Data Engineer
Rebel Recruitment
Sheffield
Hybrid
Senior
ÂŁ85,000

Role: Lead Data Engineer - Azure/Databricks

Location: Sheffield

Working Model: Hybrid - 2 days per week in person

Salary: Up to ÂŁ85k depending on experience

Own the data platform. Shape the future architecture.

This role is for a hands-on data engineering leader who wants to build something properly, not babysit legacy pipelines.

You'll take full ownership of the data engineering domain, leading the design and building of a modern, highly scalable data platform that handles complex, high-frequency industrial data. You'll work closely with data scientists, software engineers, and senior leadership, with the trust and mandate to make meaningful architectural decisions.

If you enjoy balancing strong engineering principles with real-world business needs, and you like mentoring others while staying deeply technical, this role is built for you.

What you'll be working on

  • Architecting and rebuilding the data transformation layer in Databricks
  • Designing robust data flows that support both real-time operational views and deep historical analysis
  • Moving pipelines from ad-hoc scripts to software-engineering standards (CI/CD, testing, modular design)
  • Defining clear data models, schemas, and standards across a complex data estate
  • Establishing a stable, high-performance serving layer for analytics, visualisation, and data science workloads
  • Working closely with data scientists to remove bottlenecks and enable better modelling and experimentation
  • Pair-programming, mentoring, and raising the engineering bar for a small, capable team
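For candidates weighing what "treating pipelines as software" looks like in practice, here is a purely illustrative sketch (the function, field names, and rules are hypothetical, not taken from the employer): a transformation step written as a pure, testable function, with a unit test of the kind that would run in CI on every change.

```python
# Illustrative only: a pipeline step as a pure function, plus a CI-style test.
# All names (deduplicate_readings, sensor_id, timestamp) are hypothetical.

def deduplicate_readings(readings):
    """Keep the latest reading per sensor_id, tolerating out-of-order input."""
    latest = {}
    for r in readings:
        key = r["sensor_id"]
        if key not in latest or r["timestamp"] > latest[key]["timestamp"]:
            latest[key] = r
    return sorted(latest.values(), key=lambda r: r["sensor_id"])

# A unit test that could run in CI alongside every pipeline change.
def test_deduplicate_keeps_latest():
    rows = [
        {"sensor_id": "a", "timestamp": 2, "value": 10},
        {"sensor_id": "a", "timestamp": 1, "value": 5},
        {"sensor_id": "b", "timestamp": 1, "value": 7},
    ]
    out = deduplicate_readings(rows)
    assert [r["value"] for r in out] == [10, 7]

test_deduplicate_keeps_latest()
```

Because the logic lives in a pure function rather than an ad-hoc script, it can be versioned, reviewed, and tested independently of any cluster or scheduler.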

You won't just be delivering features; you'll be setting direction.

The tech you'll work with

  • Core: Azure, Databricks, Python, SQL, dbt, MQTT
  • Storage & Serving: Delta Lake, Postgres, TimescaleDB
  • Modelling & ML: MLflow
  • Visualisation: Grafana

What makes you a great fit

You're someone who can zoom out to architecture and zoom in to code, comfortably.

  • Proven experience building and owning data platforms on Azure
  • Deep, hands-on knowledge of Databricks (lakehouse architecture, cluster management, performance and cost optimisation)
  • Strong opinions on data modelling, schema design, and standardisation
  • You treat data pipelines as software: version control, CI/CD, automated testing
  • Comfortable challenging architectural decisions, and explaining why
  • Able to translate technical trade-offs into business impact for non-technical stakeholders
  • A mentor by nature: you raise the team through pairing, guidance, and example

Industry experience

  • Experience with industrial, sensor-driven, or time-series data is highly desirable
  • Alternatively, background in high-volume or highly variable data environments (missing data, duplicates, schema drift, spiky load) will transfer well

Why you'll want this role

  • Real autonomy: You're trusted to make architectural calls; that's why you're being hired
  • Visible impact: Your work directly unlocks better analytics and machine learning outcomes
  • Modern problems: You'll work on advanced data patterns and architectures, not cosmetic refactors
  • Still technical: This is a leadership role without stepping away from code

What youll get

  • 5 weeks paid holiday plus bank holidays
  • Tax-efficient stock options
  • Company pension scheme
  • Salary sacrifice EV scheme
  • Training and professional development support
  • Regular all-hands sessions with real transparency
  • Hybrid working (2 days in person)
  • Quarterly employee recognition awards
  • Access to discounts via BrightHR

We welcome diverse applicants and are dedicated to treating all applicants with dignity and respect, regardless of background.

Lead Data Engineer
Fruition Group
Leeds
Hybrid
Senior
ÂŁ80,000
+3

Job Title: Lead Data Engineer
Location: Leeds, 2x per week
Salary: Up to ÂŁ80,000 per annum

Why Apply?
This is an exciting opportunity to work as a Lead Data Engineer delivering scalable, high-quality data solutions for a leading client in the technology sector. This position offers professional growth, challenging projects, and access to cutting-edge cloud data technologies.

Lead Data Engineer Responsibilities:

  • Design, develop, and optimise robust, scalable data pipelines and architectures to support Business Intelligence and analytics initiatives.
  • Manage and maintain cloud-based data platforms (AWS, Azure, or Google Cloud) including data lakes, warehouses, and lakehouse solutions.
  • Transform and process structured and unstructured data using modern ETL/ELT frameworks (Apache Spark, Airflow, dbt).
  • Collaborate closely with product managers, analysts, and software developers to ensure seamless integration and high-quality data availability.
  • Develop, maintain, and enhance reporting and analytics capabilities through tools such as PowerBI, Tableau, or QuickSight.
  • Apply best practices in data governance, data quality, and performance optimisation.
  • Operate in an agile environment, contributing to technical discussions and problem-solving initiatives.

Lead Data Engineer Requirements:

  • Proven experience in building and managing cloud-based data platforms (AWS Redshift/Glue, Azure Data Factory/Synapse, Google BigQuery/Dataflow).
  • Strong programming skills in Python, SQL, and Java for data engineering tasks.
  • Experience designing reliable, maintainable, and high-performance data pipelines and architectures.
  • Broad understanding of data warehousing, data lakes, and lakehouse architectures.
  • Familiarity with Business Intelligence and data visualisation tools.
  • Excellent analytical thinking, attention to detail, and problem-solving skills.
  • Strong collaboration and communication skills, able to work with both technical and non-technical stakeholders.
  • Comfortable with complexity, ambiguity, and working independently or as part of a team in a fast-paced environment.

We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.

SAS Data Engineer
Deerfoot Recruitment Solutions
Telford
Hybrid
Mid - Senior
ÂŁ70,000
TECH-AGNOSTIC ROLE

SAS Consultant / Data Engineer
Location: Telford or Worthing (hybrid working 2 days onsite)
Type: Full Time, Permanent
Salary: ÂŁ50,000 - ÂŁ70,000 DOE + comprehensive benefits package

Deerfoot Recruitment is working with a major consultancy partner on a long-term public sector engagement and is seeking experienced SAS Consultants / Data Engineers to join a growing data team.
This is a key role within a large-scale data portfolio, supporting critical programmes focused on revenue optimisation, fraud detection, data management and analytics. The successful candidate will play a hands-on role in designing, building and supporting robust SAS-based data solutions, working closely with product owners, architects, engineers and senior client stakeholders.

Key responsibilities include:

  • Designing and delivering secure, high-performance SAS solutions for data integration and analytics
  • Building and enhancing data pipelines covering ingestion, transformation, reporting and fraud detection
  • Supporting live services, incident resolution and continuous improvement
  • Collaborating in Agile delivery teams and contributing to engineering best practice

Key skills and experience:

  • Minimum of 5 years' experience as a data engineer or in a similar role
  • Strong background as a Data Engineer on complex, large-scale data platforms
  • Proven expertise with SAS 9.x
  • SAS Viya (3.x / 4) experience is a bonus
  • Solid ETL, data modelling and batch scheduling experience
  • Understanding of performance optimisation, CI/CD and scalable solution design
  • Confident client-facing and consultancy skills

Security Clearance:
This role requires SC clearance, or eligibility to obtain it. Applicants must have lived in the UK continuously for the past 5 years. Some restrictions may apply based on nationality and residency.
This is an excellent opportunity to work on high-impact public sector systems within a collaborative, technically strong environment.

Apply today to find out more.

SAS Consultant / Senior SAS Consultant / SAS Developer / SAS Data Engineer / SAS Programmer / Lead SAS Programmer / SAS Analytics Consultant / SAS Technical Consultant / SAS Solutions Consultant / Data Engineer / Senior Data Engineer / Analytics Engineer / Data Platform Engineer / Data Integration Engineer / Data Pipeline Engineer / Data Solutions Engineer / Enterprise Data Engineer

Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn’t right for you, explore our referral reward program with payouts at interview and placement milestones. Visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.

Principal GCP Data Engineer
Anson McCade
Multiple locations
Hybrid
Senior
ÂŁ95,000

Up to ÂŁ95,000
Hybrid working
Location: Bristol, Gloucester, Cardiff, Corsham, Cheltenham (South West, United Kingdom)
Type: Permanent

Principal GCP Data Engineer
Join an award-winning innovation and transformation consultancy recognised for its cutting-edge work in data engineering, cloud solutions, and enterprise transformation. This organisation is known for bringing ingenuity to life, helping clients turn complexity into opportunity, and fostering a culture where technical specialists thrive and grow.

An opportunity has arisen for a Principal GCP Data Engineer to join the London-based data and analytics practice. This Principal GCP Data Engineer role offers the chance to lead the design and delivery of end-to-end data solutions on Google Cloud Platform for high-profile clients, shaping data strategy and driving technical excellence across complex programmes.

With a reputation for combining breakthrough technologies with pragmatic delivery, the organisation empowers senior data engineers to influence architecture, mentor teams, and deliver production-ready solutions that create lasting impact.

The Role - Principal GCP Data Engineer
The Principal GCP Data Engineer is a senior technical role responsible for leading data engineering solutions, guiding teams, and acting as a subject matter expert in Google Cloud Platform. As a Principal GCP Data Engineer, you will define end-to-end solution architectures, implement best practices, and lead the development of robust, scalable data pipelines.

This role combines hands-on technical leadership with coaching, mentorship, and client engagement, making it ideal for a Principal GCP Data Engineer who enjoys delivering complex solutions while shaping the capabilities of their team and influencing enterprise-wide data strategy.

What You’ll Be Doing as a Principal GCP Data Engineer
As a Principal GCP Data Engineer, you will:

  • Lead the design, development, and delivery of data processing solutions using GCP tools such as Dataflow, Dataproc, and BigQuery
  • Design automated data pipelines using orchestration tools like Cloud Composer
  • Contribute to architecture discussions and design end-to-end data solutions
  • Own development processes for your team, establishing robust principles and methods across architecture, code quality, and deployments
  • Shape team behaviours around specifications, acceptance criteria, sprint planning, and documentation
  • Define and evolve data engineering standards and practices across the organisation
  • Lead technical discussions with client stakeholders, achieving buy-in for solutions
  • Mentor and coach team members, building technical expertise and capability

Key Responsibilities

  • Develop production-ready data pipelines and processing jobs using batch and streaming frameworks such as Apache Spark and Apache Beam
  • Apply expertise in data storage technologies including relational, columnar, document, NoSQL, data warehouses, and data lakes
  • Implement modern data pipeline patterns, event-driven architectures, ETL/ELT processes, and stream processing solutions
  • Translate business requirements into technical specifications and actionable solution designs
  • Work with metadata management and data governance tools such as Cloud Data Catalog, Collibra, or Dataplex
  • Build data quality alerting and data quarantine solutions to ensure downstream reliability
  • Implement CI/CD pipelines with version control, automated tests, and automated deployments
  • Collaborate in Agile teams, using Scrum or Kanban methodologies

Key Requirements
The successful Principal GCP Data Engineer will bring deep technical expertise, client-facing experience, and leadership skills. You will have:

  • Proven experience delivering production-ready data solutions on Google Cloud Platform
  • Strong knowledge of batch and streaming frameworks, data pipelines, and orchestration tools
  • Expertise in designing and managing structured and unstructured data systems
  • Experience translating business needs into technical solutions
  • Ability to mentor and coach teams and guide technical decision-making
  • Excellent communication skills, with the ability to explain technical concepts to technical and non-technical stakeholders
  • A pragmatic approach to problem solving, combined with a drive for technical excellence

Why Join

  • Take a senior technical leadership role as a Principal GCP Data Engineer within a globally recognised innovation and transformation consultancy
  • Lead the delivery of complex data engineering programmes on Google Cloud Platform
  • Shape the data engineering standards, practices, and architecture across client engagements and internal teams
  • Work in a collaborative, inclusive, and learning-focused culture where technical specialists are empowered to grow and succeed

Reference: AMC/AON/PGCPDataEnginer


Data Architect
Stackstudio Digital Ltd.
Leeds
Remote or hybrid
Mid - Senior
Private salary

Databricks Architect

Role Overview
We are seeking a Senior Databricks Architect to lead the design and delivery of scalable, high-performance data platforms built on Databricks. This is a strategic role requiring strong architectural experience, hands-on design capability, and a deep understanding of modern data ecosystems.

The ideal candidate will have several years' experience operating at architect level, with proven success designing and/or migrating data platforms into Databricks. Candidates with strong architecture backgrounds on comparable platforms (such as Snowflake) will also be considered, provided they hold relevant Databricks certifications.

Key Responsibilities

  • Design end-to-end data architectures using Databricks for analytics, data engineering, and data science use cases.
  • Lead and support data platform migrations into Databricks from legacy or alternative data platforms.
  • Define architectural standards, best practices, and reference patterns for Databricks implementations.
  • Collaborate with data engineers, platform teams, and stakeholders to translate business requirements into scalable technical solutions.
  • Ensure solutions meet performance, security, scalability, and cost-optimisation requirements.
  • Provide technical leadership and architectural governance across Databricks initiatives.
  • Review existing data architectures and recommend improvements or modernisation strategies.
  • Support teams with architectural guidance, troubleshooting, and design reviews.

Essential Skills & Experience

  • Proven experience working as a Data Architect / Platform Architect at a senior level.
  • Hands-on experience designing solutions within the Databricks ecosystem, or strong architectural experience on a competing platform (e.g. Snowflake) combined with Databricks certifications.
  • Demonstrated experience with data platform migrations, modernisation, or large-scale data transformations.
  • Strong understanding of data architecture principles, including:
    • Data lakes / lakehouse architectures
    • Data modelling and data integration patterns
    • Performance optimisation and scalability
  • Experience working with cloud-based data platforms.
  • Strong stakeholder communication and documentation skills.

Desirable Skills & Experience

  • Databricks certifications (Data Engineer, Data Architect, or equivalent).
  • Experience designing solutions prior to Databricks (traditional data warehouses, big data platforms, or cloud-native data stacks).
  • Knowledge of modern data engineering tools and frameworks.
  • Experience operating in complex enterprise environments.

SAP Data Modeller
Morson Edge
Glasgow
In office
Mid - Senior
Private salary

Role: SAP Data Model Analyst
Business Unit: Scottish Power Energy Networks
Location: Glasgow HQ
Rate: Negotiable, Inside IR35, umbrella and PAYE rates
Job purpose statement:
SP Energy Networks is migrating its business-critical asset data from SAP ECC to SAP S/4HANA. This programme spans multiple business areas and will impact thousands of users and critical safety processes.
As the SAP Data Model Analyst within Network Planning & Regulation (NP&R), your primary responsibility is to lead the analysis of the current SAP Data Model and support non-technical users in understanding how business processes and priorities are delivered within the system, facilitating key decision-making by senior stakeholders.
You will also collaborate with the Data Governance Analysts to capture core functionality rules within the current SAP Data Model to support the data quality assessment, and support a team of SAP Data Administrators in managing SAP data model change requests, ensuring that they adhere to Data Best Practices and consider the impacts upon the SAP S/4HANA Migration programme.
Main Duties:
• Collaborate with the NP&R SAP S/4HANA project team, and the wider business stakeholders, in the delivery of the project to migrate from SAP ECC to SAP S/4HANA.
• Translate between Data Model concepts and business concepts for non-technical stakeholders within NP&R, and the SAP Migration Programme team.
• Contribute to the SAP S/4HANA Data Model development, championing critical Asset Management business processes and priorities throughout the development of the conceptual and logical models.
• Maintain a holistic view of data model changes being applied to the SAP ECC Data Model, ensuring that they are considered and addressed during SAP S/4HANA Data Model development.
• Provide SP Data Model expertise to SAP Data Administrators and Data Governance Analysts to deliver project milestones.
Minimum Criteria:
Entry Qualifications:
HND or Degree level qualification
Specific:
Agile tooling (Jira) / Agile & DevOps
Entry Experience:
• 5+ years’ experience working with SAP for utilities – ideally electricity networks
• Experience with SAP S/4HANA would be advantageous but not mandatory
• Experience working on enterprise-level asset management with SAP PM
• First-hand experience with SAP modules (e.g. SAP OM, SAP MM, SAP SD)
• Excellent knowledge and understanding of relational data models and in the specification of To-Be data models
• Project Involvement: Experience working on SAP implementation or upgrade projects, even in supporting roles, is essential
• System Integration: Understanding of how different SAP modules integrate and work together
• Excellent attention to detail, with a focus on data quality and data management
• Excellent communication skills with the ability to present and report on outputs in a clear, concise way, and to meet the expectations of the target audience
Other:
Relevant SAP certifications are highly beneficial

Data Architect
Stackstudio Digital Ltd.
Multiple locations
Remote or hybrid
Mid - Senior
Private salary

Databricks Architect

Role Overview
We are seeking a Senior Databricks Architect to lead the design and delivery of scalable, high-performance data platforms built on Databricks. This is a strategic role requiring strong architectural experience, hands-on design capability, and a deep understanding of modern data ecosystems.

The ideal candidate will have several years' experience operating at architect level, with proven success designing and/or migrating data platforms into Databricks. Candidates with strong architecture backgrounds on comparable platforms (such as Snowflake) will also be considered, provided they hold relevant Databricks certifications.

Key Responsibilities

  • Design end-to-end data architectures using Databricks for analytics, data engineering, and data science use cases.
  • Lead and support data platform migrations into Databricks from legacy or alternative data platforms.
  • Define architectural standards, best practices, and reference patterns for Databricks implementations.
  • Collaborate with data engineers, platform teams, and stakeholders to translate business requirements into scalable technical solutions.
  • Ensure solutions meet performance, security, scalability, and cost-optimisation requirements.
  • Provide technical leadership and architectural governance across Databricks initiatives.
  • Review existing data architectures and recommend improvements or modernisation strategies.
  • Support teams with architectural guidance, troubleshooting, and design reviews.

Essential Skills & Experience

  • Proven experience working as a Data Architect / Platform Architect at a senior level.
  • Hands-on experience designing solutions within the Databricks ecosystem, or strong architectural experience on a competing platform (e.g. Snowflake) combined with Databricks certifications.
  • Demonstrated experience with data platform migrations, modernisation, or large-scale data transformations.
  • Strong understanding of data architecture principles, including:
    • Data lakes / lakehouse architectures
    • Data modelling and data integration patterns
    • Performance optimisation and scalability
  • Experience working with cloud-based data platforms.
  • Strong stakeholder communication and documentation skills.

Desirable Skills & Experience

  • Databricks certifications (Data Engineer, Data Architect, or equivalent).
  • Experience designing solutions prior to Databricks (traditional data warehouses, big data platforms, or cloud-native data stacks).
  • Knowledge of modern data engineering tools and frameworks.
  • Experience operating in complex enterprise environments.

Senior Data Engineer (AWS, Airflow, Python)
Triad
London
Remote or hybrid
Senior
ÂŁ60,000 - ÂŁ65,000
+1

Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices.
Salary up to ÂŁ65k plus company benefits.

About Us

Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years’ experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers.

At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you’re valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you’ll be trusted, challenged, and empowered to grow.

We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you’re passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.

Glassdoor score of 4.7

96% of our staff would recommend Triad to a friend

100% CEO approval

See for yourself some of the work that makes us all so proud:

Helping law enforcement with secure intelligence systems that keep the UK safe

Supporting the UK’s national meteorological service in leveraging supercomputers for next-level weather forecasting

Assisting a UK government department responsible for consumer product safety with systems to track unsafe products

Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport

Role Summary

Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.

Key Responsibilities

Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.

Create and manage data models that support efficient storage, retrieval, and analysis of data.

Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda to architect and maintain cloud data solutions.

Maintain modular Terraform-based IaC for reliable provisioning of AWS infrastructure.

Develop, optimise and maintain robust data pipelines using Apache Airflow.

Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.

Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.

Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.

Identify data quality issues and implement data validation and cleansing processes.

Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.

Work within a continuous integration environment with automated builds, deployments, and testing.

Skills and Experience

  • Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
  • Excellent proficiency in developing ETL processes and data transformation workflows.
  • Strong SQL skills (PostgreSQL) and advanced Python coding capability (essential).
  • Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda (essential).
  • Understanding of Terraform codebases to create and manage AWS infrastructure.
  • Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
  • Familiarity with distributed data processing systems such as Spark or Databricks.
  • Experience working with high-performing, low-latency, or large-volume data systems.
  • Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
  • Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.
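As a purely illustrative aside on the "data validation and cleansing" responsibility listed above (the rules, field names, and function are hypothetical, not part of the role description), a validation step can be as simple as a rule function that flags bad records before they are loaded:

```python
# Hypothetical sketch of a data-validation step: flag rows that fail
# simple quality rules before they reach the load stage.

def validate_row(row):
    """Return a list of quality issues for one record (rules are illustrative)."""
    issues = []
    if not row.get("id"):
        issues.append("missing id")
    try:
        if float(row.get("amount", "")) < 0:
            issues.append("negative amount")
    except ValueError:
        issues.append("non-numeric amount")
    return issues

rows = [
    {"id": "1", "amount": "12.50"},
    {"id": "", "amount": "oops"},
]
# Keep only rows with no issues; rejected rows would typically be
# quarantined and reported rather than silently dropped.
clean = [r for r in rows if not validate_row(r)]
# clean contains only the first row
```

In a production pipeline the same rule functions would be wired into the orchestration layer (e.g. as an Airflow task) so that failures surface in monitoring rather than downstream reports.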

Qualifications & Certifications

  • A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable).
  • Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK.

Triad’s Commitment to You

As a growing and ambitious company, Triad prioritises your development and well-being:

  • Continuous Training & Development: Access to top-rated Udemy Business courses.
  • Work Environment: Collaborative, creative, and free from discrimination.
  • Benefits:
    • 25 days of annual leave, plus bank holidays.
    • Matched pension contributions (5%).
    • Private healthcare with Bupa.
    • Gym membership support or Lakeshore Fitness access.
    • Perkbox membership.
    • Cycle-to-work scheme.

What Our Colleagues Have to Say

Please see for yourself on Glassdoor and our “Day in the Life” videos at the bottom of our Careers Page.

Our Selection Process

After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:

  1. A technical test including numerical, logical and verbal reasoning
  2. A technical interview with our consultants
  3. A management interview to assess cultural fit

We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.

Other Information

If this role is of interest to you or you would like further information, please contact Ryan Jordan and submit your application now.

Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.

Frequently asked questions
A Data Architect is a professional responsible for designing, creating, and managing an organization's data architecture. They develop strategies to ensure data is collected, stored, and utilized efficiently to support business goals.
Key skills include expertise in database design, data modeling, ETL processes, cloud data services, SQL, knowledge of big data technologies, programming skills in Python or Java, and strong analytical and communication abilities.
Most roles require a bachelor's degree in Computer Science, Information Technology, or related fields. Certifications like AWS Certified Data Analytics, Google Professional Data Engineer, or Microsoft Certified: Azure Data Engineer are advantageous.
You can search for Data Architect jobs using relevant keywords or filter by location, experience level, and employment type on the Haystack platform to find the most suitable opportunities.
Data Architects are in demand across industries such as technology, finance, healthcare, retail, and telecommunications, as organizations seek to leverage data for business insights and decision-making.