Make yourself visible and let companies apply to you.
Roles

Data Engineer Jobs in Glasgow

Overview

Looking for top Data Engineer jobs in Glasgow? Explore the latest and best opportunities on Haystack, your go-to IT job board. Whether you’re an experienced data engineer or just starting out, find roles with leading Glasgow companies ready to boost your career in data engineering. Start your job search today and land your next data engineering position in Glasgow!
Filters applied
Glasgow
Data Engineer
Search
Salary
Location
Remote preference
Role type
Seniority
Tech stack
Sectors
Contract type
Company size
Visa sponsorship
Lead Data Engineer
Cathcart Technology
Glasgow
Fully remote
Senior
£85,000 - £100,000
RECENTLY POSTED
python
snowflake
I’m partnering with a fast-growing, highly respected analytics and consultancy organisation working at the forefront of the energy, chemicals and low-carbon sectors to hire a Lead Snowflake Data Engineer. Backed by major investors and trusted by global clients, it’s a great time to be joining their team (fully remote; must be UK based).
This is a rare greenfield leadership opportunity where you’ll define the data strategy, architecture and tooling from day one, while building and mentoring a high-performing data engineering team. You’ll deliver scalable cloud-native pipelines on Snowflake, help unify data across multiple acquired businesses, and develop machine learning solutions that drive real commercial insight. Working closely with senior stakeholders, you’ll turn complex requirements into robust technical solutions and set standards for data quality and governance. With a modern Python-first stack, no legacy constraints and strong business backing, the role offers genuine technical ownership and strategic impact in a fully remote position.
You’ll ideally have most of the following:
Hands-on experience with Snowflake (essential)
Proven experience leading data engineering projects end-to-end
Strong Python background and modern data engineering practices
Deep understanding of ETL/ELT, data modelling and transformations
Practical machine learning experience (desirable)
Strong communication with technical and non-technical teams
You’ll be joining at a pivotal time, with the chance to build a modern data function from the ground up and make a visible, long-term impact on the business. The role offers significant technical autonomy, real influence at senior stakeholder level, and the opportunity to work on meaningful problems connected to the global energy transition.
In return they offer a very competitive salary and strong benefits package, flexible remote working across the UK with occasional travel, and a clear long-term progression path into senior data leadership. It’s a genuinely exciting opportunity to combine hands-on engineering, strategic thinking and team leadership in an environment that actively supports innovation, learning and ambitious technical ownership.
If you’re keen to learn more, please apply or drop Matthew MacAlpine at Cathcart Technology a message. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
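The ETL/ELT and idempotent-pipeline skills this ad asks for can be sketched briefly. The hypothetical helper below composes a Snowflake-style MERGE (upsert) statement for loading a staging table into a curated table; all table and column names are invented for illustration, and a real pipeline would execute this via the Snowflake connector.

```python
def build_merge_sql(target, staging, key_cols, update_cols):
    """Compose an idempotent MERGE (upsert) statement.

    Re-running the statement with the same staging data leaves the
    target unchanged, which is what makes the load idempotent.
    """
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    src = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({src})"
    )

# Hypothetical tables: load daily meter readings into a curated layer.
sql = build_merge_sql(
    target="curated.meter_readings",
    staging="staging.meter_readings",
    key_cols=["meter_id", "reading_date"],
    update_cols=["kwh", "loaded_at"],
)
```

Keying the MERGE on the natural key (here `meter_id` + `reading_date`) is what lets reprocessing and backfills run safely without duplicating rows.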
Data Scientist
InfinityQuest Ltd
Glasgow
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED
aws
tensorflow
kubernetes
python
docker
We are seeking a highly skilled Data Scientist (AI) to design, develop, and deploy advanced machine learning and artificial intelligence solutions. The ideal candidate will work on large datasets, build predictive models, and collaborate cross-functionally to deliver scalable, data-driven products.
Key Responsibilities
Design, develop, and optimize machine learning and deep learning models.
Work on AI/ML projects including NLP, computer vision, recommendation systems, and generative AI.
Perform data cleaning, feature engineering, and exploratory data analysis (EDA).
Build and manage data pipelines and model training workflows.
Deploy models into production and monitor performance.
Collaborate with Product, Engineering, and Business teams to translate business problems into AI solutions.
Conduct model evaluation, A/B testing, and performance tuning.
Document models, experiments, and technical processes.
Required Skills & Qualifications
Classic machine learning (regression, predictive analysis, classification, clustering)
Machine learning model optimisation
Strong proficiency in Python (NumPy, Pandas, Scikit-learn).
Hands-on experience with deep learning frameworks: TensorFlow, PyTorch, or Keras.
Experience in Natural Language Processing (NLP) and/or Computer Vision.
Strong knowledge of machine learning algorithms and statistics.
Experience with SQL/NoSQL databases and big data tools (Spark, Hadoop preferred).
Experience with MLOps tools such as Docker, Kubernetes, CI/CD pipelines.
Preferred Skills
Experience with LLMs / generative AI (OpenAI, Hugging Face, LangChain).
Cloud experience (AWS, Azure, or GCP).
Experience building AI APIs and microservices.
Education
Bachelor’s or Master’s degree in Computer Science, Data Science, AI, or a related field (PhD preferred for advanced research roles).
Soft Skills
Strong problem-solving and analytical thinking
Excellent communication and storytelling skills
Ability to work in fast-paced, cross-functional teams
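As a toy illustration of the "classic machine learning (classification)" bucket above, here is a minimal logistic-regression classifier trained by batch gradient descent in plain Python, on an invented two-feature dataset. In practice you would reach for Scikit-learn's LogisticRegression; this sketch just shows the underlying mechanics.

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=500):
    """Fit w, b for p(y=1|x) = sigmoid(w.x + b) by batch gradient descent."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - y  # gradient of the log-loss with respect to z
            for i, xi in enumerate(x):
                gw[i] += err * xi
            gb += err
        w = [wi - lr * gi / len(xs) for wi, gi in zip(w, gw)]
        b -= lr * gb / len(xs)
    return w, b

def predict(w, b, x):
    """Class 1 when the decision function is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Invented, linearly separable toy data: label 1 roughly when x0 + x1 > 1.
xs = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9], [0.4, 0.3], [0.7, 0.9]]
ys = [0, 0, 1, 1, 0, 1]
w, b = train_logistic(xs, ys)
preds = [predict(w, b, x) for x in xs]
```

The same gradient-descent loop generalises to the deep learning frameworks listed above, which automate the gradient computation and scale it to large models.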
AI Engineer Placement Programme
Career Change
Multiple locations
Remote or hybrid
Graduate
£25,000 - £45,000
RECENTLY POSTED
github
git
python
Trainee AI Engineer - No Experience Needed!
Ready to launch a future-proof career in AI, even if you have zero experience? We’re looking for ambitious individuals ready to break into the world of artificial intelligence and data science. Our industry-leading AI Traineeship is designed to take you from complete beginner to a highly employable Junior AI Engineer, with full training provided and a guaranteed job offer (£25K-£45K starting salary) within 20 miles of your location. Whether you’re working full-time, part-time, or currently unemployed, our flexible, self-paced programme allows you to train around your lifestyle and be job-ready in as little as 6-12 months.
How it works - your journey to AI success
Step 1: Discover AI. Start your journey with our beginner-friendly mini-course bundle. Explore the fundamentals of AI through interactive videos, presentations, and quizzes, all accessible anytime, anywhere, through our easy-to-use online platform.
Step 2: Full-Stack AI Mastery. Dive deeper with in-demand technical skills including Python programming, data handling, machine learning, and version control with Git and GitHub. You’ll work on hands-on mini projects that mirror real-world challenges, helping you build confidence and a strong portfolio as you learn.
Step 3: Get Certified. Prepare for and earn your Microsoft AI-900: Azure AI Fundamentals certification, an internationally recognised qualification that proves your knowledge and boosts your credibility with employers.
Step 4: Real-World Projects. Showcase your new skills with two industry-based, practical projects assigned by your personal tutor. These projects are designed to give you real, portfolio-worthy experience, making you stand out to employers from day one.
Your guaranteed future
After completing your training, you’ll be placed in a Junior AI Engineer role with a guaranteed starting salary of £25K-£45K. We work with top employers across the UK to ensure you get matched with a role close to home.
We guarantee a job offer upon completion, or 100% of your course fees refunded. We’re proud to have helped over 1,000 people each year transform their careers. Explore our success stories on our website and see what your future could look like. At a one-off cost of £990, or a deposit of £149 followed by 10 interest-free monthly instalments of £124, this represents a great opportunity to start a rewarding career in AI and have a real career ladder to start climbing. If you are not offered a role at the end of the training, we will refund 100% of your course fees. Ready to future-proof your career? If you’re passionate about tech and ready to break into one of the fastest-growing industries in the world, apply now. One of our friendly advisors will be in touch to guide you through the next steps. Please note that this is a training course and fees apply.
Data Engineer
InfinityQuest Ltd
Glasgow
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED
aws
python
sql
Key Responsibilities
Develop and optimize data pipelines for ingestion, transformation, and storage.
Ensure data quality, integrity, and security across systems.
Collaborate with Data Scientists and Analysts to enable advanced analytics.
Implement best practices for scalability and performance in cloud environments.
Support integration of MRO AI solutions into client operational workflows.
Design data architectures and pipelines that support multi-OpCo deployment, ensuring modularity and interoperability.
Required Skills & Experience
Expertise in Python, SQL, and modern ETL frameworks.
Hands-on experience with cloud platforms (AWS preferred).
Strong knowledge of data modeling and API integration.
Proven experience in developing, testing, and deploying data solutions into production environments, ensuring reliability, scalability, and maintainability beyond proof-of-concept or prototype stages.
Familiarity with airline or logistics data domains is a plus.
Significant experience in similar roles, with a proven ability to integrate quickly into new teams and deliver immediate value.
Initial co-location with Client teams in London is essential to ensure close collaboration. Candidates must also be prepared to travel internationally during later stages to facilitate group-wide deployment.
Preferred Consulting-Level Competencies
Ability to design enterprise-grade data solutions under tight timelines.
Strong stakeholder engagement and solution-oriented mindset.
Experience in advisory or consulting roles for data engineering projects.
Track record of creating high-impact outcomes and driving stakeholder satisfaction from day one.
Ability to implement standards and frameworks for scalable data solutions across multiple operating companies.
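The "data quality, integrity, and security" responsibility above usually comes down to validation rules applied at each pipeline stage, with failing rows routed aside rather than silently loaded. A minimal sketch in plain Python follows; the rule names and record shapes are invented, and a production pipeline would typically use a framework such as Great Expectations or dbt tests instead.

```python
def validate(records, rules):
    """Apply named quality rules; route failures to a reject list with reasons."""
    valid, rejects = [], []
    for rec in records:
        failed = [name for name, check in rules.items() if not check(rec)]
        if failed:
            rejects.append({"record": rec, "failed_rules": failed})
        else:
            valid.append(rec)
    return valid, rejects

# Invented ingestion rules for a hypothetical flight-events feed.
rules = {
    "has_flight_id": lambda r: bool(r.get("flight_id")),
    "non_negative_delay": lambda r: r.get("delay_minutes", 0) >= 0,
}
records = [
    {"flight_id": "BA123", "delay_minutes": 5},
    {"flight_id": "", "delay_minutes": 3},       # fails has_flight_id
    {"flight_id": "EZY45", "delay_minutes": -2}, # fails non_negative_delay
]
valid, rejects = validate(records, rules)
```

Keeping the reject reasons alongside the rejected rows is what makes the downstream data-quality reporting (and reprocessing) possible.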
Lead Enterprise AI Engineer
SMS
Glasgow
Fully remote
Senior
Private salary
RECENTLY POSTED
fabric
python
sql
Why choose us?
Choosing to work for SMS means choosing to make a difference. We are changing how businesses and consumers use energy for the better, helping achieve a greener, more sustainable, and more affordable energy system for everyone. Through our range of innovative energy solutions, we are delivering the future of smart energy. Working closely with private and public sector partners, we are playing a critical role in transforming and decarbonising the UK economy by 2050.
What’s in it for you?
25 personal holiday days per year (with additional 8 public holidays) increasing to 30 personal days after 5 years of service (includes options to buy and sell)
Hybrid working options (for some positions).
Enhanced Maternity leave. Paternity and Adoption leave.
24/7 free and confidential employee assistance service.
Simply Health plan offers a wide variety of benefits from cashback on everyday healthcare treatments like optical, dental and physio treatments. Discounted gym memberships and free 24/7 online GP.
Life Insurance (4 x annual salary)
Pension matching scheme (up to 5% of salary)
What’s the role?
Step into a role where strategy meets engineering excellence. As our Lead Enterprise AI Engineer, you’ll be the driving force turning bold business opportunities into production-ready AI solutions. You’ll shape intelligent agents, craft next-generation conversational experiences using Microsoft Copilot and Databricks Mosaic AI, and architect the semantic layer that ensures our models deliver accurate, trusted insights every time. If you’re ready to take ownership of an organisation’s AI future and build the systems that make it real, this is your stage.
You will report to the Data, Analytics and AI Director and work remotely on a full-time, 40-hour contract.
Please note: the closing date for applications is Wednesday 21st January 2026.
Key responsibilities:
AI Solution Development & Agent Building
Design, build, and deploy low-code and pro-code AI agents using Microsoft Copilot tools to automate business workflows (e.g., HR queries, IT support, operational data retrieval).
Develop custom RAG (Retrieval-Augmented Generation) solutions within Databricks to allow LLMs to reason over proprietary SMS documents and data.
Integrate AI agents with enterprise systems (Dynamics 365, SharePoint, etc.) via APIs and Power Automate connectors.
Semantic Modelling & Databricks Genie Curation
Own the creation and maintenance of Databricks Genie Spaces. This involves translating complex database schemas into business-friendly semantic models.
Define and govern standard metrics, dimensions, and synonyms within Unity Catalog to ensure the AI “speaks the language of the business.”
Continuously monitor Genie performance, reviewing “human feedback” on answers to refine the semantic model and improve accuracy over time.
Business Engagement & Prototyping
Partner with business stakeholders (Finance, Operations, Commercial) to decompose high-level use cases into technical requirements.
Rapidly prototype AI solutions to demonstrate value and gain quick traction within business units.
Act as a technical evangelist, demonstrating to non-technical teams how to interact with Genie spaces and Copilots effectively.
AI Operations (LLMOps) & Governance
Implement monitoring frameworks to track the cost, latency, and quality of AI model outputs.
Ensure all AI solutions adhere to the organisation’s data governance and security standards (e.g., preventing data leakage via LLMs).
Manage the lifecycle of AI models and agents from development through to production and retirement.
To be considered for this role, we would love you to have:
Degree in Computer Science, Data Engineering, Artificial Intelligence, or related field, or equivalent industry experience.
Deep hands-on experience with Azure/Microsoft Fabric ecosystem (specifically Copilot Studio & Power Platform) and Databricks (SQL, Unity Catalog, Mosaic AI).
Strong ability to design data models for analytics. Experience with defining metrics layers (e.g., DBX semantic layer or Databricks Genie) is essential.
Practical experience with Large Language Models (LLMs), Prompt Engineering, and RAG architectures.
Proficient in Python (for data manipulation and API interaction) and SQL (for data modelling).
Experience working with REST APIs and connecting disparate business systems.
Ability to explain technical AI concepts to non-technical business users and translate their feedback into code.
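The RAG architecture this role centres on pairs a retriever over proprietary documents with an LLM prompt. As a toy sketch of the retrieval half only, here is a keyword-overlap scorer in plain Python; a production system would use vector embeddings (e.g. Databricks Mosaic AI Vector Search), and the documents below are invented.

```python
def score(query, doc):
    """Jaccard overlap between query terms and document terms."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q | d) if q | d else 0.0

def retrieve(query, docs, k=2):
    """Return the k highest-scoring documents to place in the LLM prompt."""
    return sorted(docs, key=lambda doc: score(query, doc), reverse=True)[:k]

# Invented internal documents standing in for proprietary content.
docs = [
    "holiday entitlement is 25 days plus public holidays",
    "meter installation jobs are scheduled via the field app",
    "expenses must be submitted within 30 days of travel",
]
context = retrieve("how many holiday days do I get", docs, k=1)
prompt = (
    "Answer using only this context:\n"
    f"{context[0]}\n\n"
    "Q: how many holiday days do I get"
)
```

Grounding the prompt in retrieved context is also what the governance bullet above is about: the model can only "leak" what the retriever is permitted to fetch.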
Data Engineer
Infinitive Resources
Glasgow
Hybrid
Mid - Senior
£350/day - £425/day
RECENTLY POSTED
aws
python
sql
Job Title: Data Engineer
Location: Hybrid, based in Glasgow
Employment Type: Contract
Day rate: £350 - £425
Infinitive is a small but highly successful and growing company at the cutting edge of tech within the rail industry, utilising hardware, software and data. We have worked on many exciting projects recently (with more in the pipeline) and we have an impressive list of clients such as Network Rail, Transport for Wales, Transport for London and Keolis, to name just a few. We also delivered on a key project for the FIFA World Cup Qatar 2022.
Role Overview:
Data Engineers will be responsible for helping collect and convert the data into useful data products, ready to be consumed and visualised. The successful candidate will play a key role in developing and maintaining data pipelines, ensuring the accuracy and availability of data, and supporting the creation of insightful visualisations. The ideal candidate will have experience working with railway industry data, along with proficiency in tools such as FME and Microsoft Power Platform.
Key Responsibilities:
Design, develop, and maintain robust data pipelines to collect, process, and store data from various sources within Network Rail’s routes.
Ensure data quality, integrity, and availability by implementing appropriate validation, transformation, and cleaning processes.
Collaborate with the data visualisation team to provide clean and well-structured data that supports the development of insightful and user-friendly visualisations.
Apply industry-specific knowledge to work with railway data, ensuring that the data solutions developed are relevant, accurate, and aligned with industry standards.
Implement and maintain data governance standards to ensure compliance with legal and regulatory requirements.
Ensure that all data handling practices meet Network Rail’s security and privacy policies.
Monitor and optimise data pipeline performance to ensure efficient data processing and minimal downtime.
Troubleshoot and resolve data-related issues in a timely manner.
Maintain comprehensive documentation of data processes, pipelines, and system architecture.
Qualifications & Experience:
Proven experience as a Data Engineer, preferably within the railway or transport sector.
Strong proficiency with FME and Microsoft Power Platform tools.
Solid understanding of data integration, ETL processes, and data warehousing concepts.
Experience working with railway industry data, including familiarity with relevant standards and regulations.
Proficiency in SQL, Python, and/or other relevant programming languages.
Experience with cloud-based data solutions (e.g., Azure, AWS).
Experience of working in an agile environment.
Analytical mindset with a focus on data accuracy and quality.
Strong problem-solving skills and attention to detail.
Ability to work independently and as part of a team in a fast-paced environment.
Strong communication skills with the ability to explain complex technical concepts to a non-technical audience.
We want to attract a wide range of diverse applicants to Infinitive Group, so if you have read through the job advert and don’t tick all the boxes but are very interested in the opportunity, please reach out for a chat.
Data Architect
Fruition Group
Multiple locations
Hybrid
Senior
£100,000
RECENTLY POSTED
aws
snowflake
Liverpool (Hybrid - 2 days per week in office) | Basic salary circa £100k
An exciting opportunity for a Data Architect to join a large, established organisation and become part of a growing, centralised data function. This role offers a genuine mix of data strategy and hands-on data architecture, giving you the chance to influence how data is designed, governed, and leveraged across the wider business.
As a Data Architect, you’ll operate within a broader Data and Analytics team, working alongside data engineers, governance specialists, and technology architects. You’ll help set the direction for the organisation’s data landscape while remaining close to delivery, ensuring architectural decisions translate into practical, scalable solutions that teams can build against.
This is a role for a Data Architect comfortable operating at both a strategic and technical level. You’ll contribute to long-term data direction, define architectural standards, and support delivery teams with clear, well-considered data designs. A strong grounding in data engineering concepts, data governance, and modern cloud-based data platforms is essential, though the focus is on capability and approach rather than specific tools.
Data Architect - Key Requirements:
Strong experience designing data architectures within complex or enterprise environments
Experience contributing to data strategy as well as hands-on architectural design
Understanding of modern data architecture patterns and approaches
Solid grasp of data engineering practices, including integration, transformation, and pipelines
Good awareness of data governance principles, data quality, and ownership
Experience working with modern data tooling and cloud platforms (e.g. Snowflake, AWS, Azure, etc.)
Confident working with and influencing stakeholders across engineering and architecture teams
Previous experience working in a highly regulated environment would be preferred
Data Architect - Salary & Benefits:
Basic salary up to £100k
Excellent pension scheme
Discretionary bonus
25 days holiday (+/-)
Private medical cover
Life assurance and income protection
Share save scheme
Additional flexible benefits, L&D opportunities, and perks
If you’re a Data Architect looking for a role where you can shape data direction, stay close to delivery, and work as part of a collaborative data team, this is a strong opportunity to make a meaningful impact.
We are an equal opportunities employer and welcome applications from all suitably qualified candidates, regardless of race, sex, disability, religion/belief, sexual orientation, or age.
Senior Data Engineer/ PowerBI
Head Resourcing
Glasgow
Hybrid
Senior
£60,000 - £80,000
RECENTLY POSTED
powerbi
fabric
git
python
Lead Data Engineer - Azure & Databricks Lakehouse
Glasgow (3/4 days onsite) | Exclusive Role with a Leading UK Consumer Business
A rapidly scaling UK consumer brand is undertaking a major data modernisation programme, moving away from legacy systems, manual Excel reporting and fragmented data sources into a fully automated Azure Enterprise Landing Zone + Databricks Lakehouse. They are building a modern data platform from the ground up using Lakeflow Declarative Pipelines, Unity Catalog, and Azure Data Factory, and this role sits right at the heart of that transformation. This is a rare opportunity to join early, influence architecture, and help define engineering standards, pipelines, curated layers and best practices that will support Operations, Finance, Sales, Logistics and Customer Care. If you want to build a best-in-class Lakehouse from scratch, this is the one.
What You’ll Be Doing
Lakehouse Engineering (Azure + Databricks): Engineer scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark, and Spark SQL across a full Medallion Architecture (Bronze → Silver → Gold). Implement ingestion patterns for files, APIs, SaaS platforms (e.g. subscription billing), SQL sources, SharePoint and SFTP using ADF + metadata-driven frameworks. Apply Lakeflow expectations for data quality, schema validation and operational reliability.
Curated Data Layers & Modelling: Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations). Deliver star schemas, harmonisation logic, SCDs and business marts to power high-performance Power BI datasets. Apply governance, lineage and fine-grained permissions via Unity Catalog.
Orchestration & Observability: Design and optimise orchestration using Lakeflow Workflows and Azure Data Factory. Implement monitoring, alerting, SLAs/SLIs, runbooks and cost-optimisation across the platform.
DevOps & Platform Engineering: Build CI/CD pipelines in Azure DevOps for notebooks, Lakeflow pipelines, SQL models and ADF artefacts. Ensure secure, enterprise-grade platform operation across Dev → Prod, using private endpoints, managed identities and Key Vault. Contribute to platform standards, design patterns, code reviews and future roadmap.
Collaboration & Delivery: Work closely with BI/Analytics teams to deliver curated datasets powering dashboards across the organisation. Influence architecture decisions and uplift engineering maturity within a growing data function.
Tech Stack You’ll Work With
Databricks: Lakeflow Declarative Pipelines, Workflows, Unity Catalog, SQL Warehouses
Azure: ADLS Gen2, Data Factory, Key Vault, vNets & Private Endpoints
Languages: PySpark, Spark SQL, Python, Git
DevOps: Azure DevOps Repos, Pipelines, CI/CD
Analytics: Power BI, Fabric
What We’re Looking For
Experience: 5-8+ years of Data Engineering with 2-3+ years delivering production workloads on Azure + Databricks. Strong PySpark/Spark SQL and distributed data processing expertise. Proven Medallion/Lakehouse delivery experience using Delta Lake. Solid dimensional modelling (Kimball) including surrogate keys, SCD types 1/2, and merge strategies. Operational experience: SLAs, observability, idempotent pipelines, reprocessing, backfills.
Mindset: Strong grounding in secure Azure Landing Zone patterns. Comfort with Git, CI/CD, automated deployments and modern engineering standards. Clear communicator who can translate technical decisions into business outcomes.
Nice to Have: Databricks Certified Data Engineer Associate. Streaming ingestion experience (Auto Loader, structured streaming, watermarking). Subscription/entitlement modelling experience. Advanced Unity Catalog security (RLS, ABAC, PII governance). Terraform/Bicep for IaC. Fabric Semantic Model / Direct Lake optimisation.
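The SCD Type 2 merge strategy mentioned in this ad keeps full history by closing the current row and inserting a new version whenever a tracked attribute changes. Here is a minimal in-memory sketch with invented customer rows; a real Lakehouse implementation would be a Delta Lake MERGE in Databricks rather than Python lists.

```python
def scd2_apply(dimension, incoming, key, tracked, as_of):
    """Close out changed rows and append new versions (SCD Type 2)."""
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None:
            # Brand-new key: insert the first version.
            dimension.append({**rec, "valid_from": as_of,
                              "valid_to": None, "is_current": True})
        elif any(cur[c] != rec[c] for c in tracked):
            # Tracked attribute changed: close the old version, open a new one.
            cur["valid_to"] = as_of
            cur["is_current"] = False
            dimension.append({**rec, "valid_from": as_of,
                              "valid_to": None, "is_current": True})
    return dimension

dim = [{"customer_id": 1, "city": "Glasgow", "valid_from": "2024-01-01",
        "valid_to": None, "is_current": True}]
incoming = [{"customer_id": 1, "city": "Edinburgh"},  # changed attribute
            {"customer_id": 2, "city": "Glasgow"}]    # brand-new key
dim = scd2_apply(dim, incoming, key="customer_id",
                 tracked=["city"], as_of="2025-06-01")
```

The `valid_from`/`valid_to`/`is_current` columns are the conventional Kimball Type 2 bookkeeping; surrogate keys would normally be added on top so fact tables can join to the version that was current at the time of the event.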
Database Administrator
Utopian Professional Recruitment Services Ltd
Glasgow
Hybrid
Junior - Mid
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Utopian Professional Recruitment are delighted to have been recently appointed to work in partnership with an award-winning client who is looking to recruit a Database Administrator / LMS for a period of 12 months.
As part of the Reward & People Data team located in Glasgow, you will provide technical support for the firm’s main learning management system and other related systems.
Whilst training will be full time in the office, our client can offer hybrid working with 3 days in the office, 2 days from home.
Your main responsibilities will include:
Provide technical support to end users and internal teams.
System and records maintenance on the firm’s learning management system (LMS).
Provide system training for all new users.
Assist with configuration and implementing system improvements.
Maintain system functionality by troubleshooting, analysing technical problems and deficiencies.
Provide scheduled and ad-hoc reporting.
Maintain and upload new content onto learning management system.
Maintain governance and quality assurance by reviewing data entry across learning management system.
Manage relationships with external vendors.
Participate in projects.
Reporting production and analysis.
Build trusted relationships across the business.
Risk and Control: Ensure that all activities and duties are carried out in full compliance with our regulatory requirements and internal policies
To be considered for this opportunity, we are looking for the following:
LMS or systems experience
Advanced Excel Skills (including pivot tables and vlookups)
Demonstrated aptitude for problem solving and analysis
Configuration skills
Excellent analytical and critical thinking skills
Strong attention to detail
Ability to manage multiple tasks
Strong communication skills and ability to communicate with a wide variety of audiences
Reward:
Hours of Work – Monday to Friday, 9.00am to 5.30pm (flexible)
Hybrid Working – 3 days at home / 2 in the office, after training
Salary – Excellent and will be discussed at time of application
Generous holiday allowance
Pension
Longer term career opportunities
Study Support
Social Activities
Private Medical Insurance
Dental Insurance
Life Assurance
Employee Assistance & so much more!
Next Steps: For further information, please apply by emailing your CV to Utopian Professional Recruitment ASAP. To keep up to date with current opportunities, be sure to follow the Utopian Professional Recruitment page on LinkedIn, Instagram and Facebook. Whilst Utopian Professional Recruitment strives to get in touch with all our applicants, it is not always possible. If you have not heard back from us within 5 working days of sending us your CV, unfortunately you have not been shortlisted for this position.
Equal Opportunities: Utopian Professional Recruitment is committed to equal opportunities regardless of gender, race, disability, sexual orientation, religion or belief and age.
Databricks Solution Architect
Morgan Hunt Group Limited
Glasgow
Hybrid
Mid - Senior
£600/day - £800/day
RECENTLY POSTED
azure-databricks
Morgan Hunt are working with a large public sector organisation to recruit a Databricks Solution Architect on a contract basis. The main focus of the role is to design, implement and enable a secure Azure Databricks platform that aligns with the organisation’s architecture, security standards, and operating model. The role will establish reference patterns, governance controls and onboarding guidance to support decentralised teams.
Key Responsibilities:
Design and deploy Azure Databricks using secure, enterprise patterns.
Configure Databricks workspaces, clusters, and policies aligned with cost and security guardrails.
Define and implement least-privilege access models for multiple teams.
Support use cases across analytics, data engineering, and machine learning.
Establish standards for data isolation, sharing, and lineage.
Create onboarding playbooks and guardrails for internal teams.
Essential Skills:
Proven experience designing and delivering Azure Databricks
Hands-on experience of: Unity Catalog, Azure networking, Entra ID/Azure AD integration
Experience supporting security assurance and architecture governance
Strong understanding of data platform operating models and data product concepts
Contract until 31/03/26, extension possible. Up to £800 per day, Inside IR35. Hybrid working (one day per week in office). Please get in touch for further information.
Morgan Hunt is a multi-award-winning recruitment business for interim, contract and temporary recruitment and acts as an Employment Agency in relation to permanent vacancies. Morgan Hunt is an equal opportunities employer. Job suitability is assessed on merit in accordance with the individual’s skills, qualifications and abilities to perform the relevant duties required in a particular role.
Databricks Architect
Morgan Hunt Group Limited
Glasgow
In office
Senior - Leader
£650/day - £750/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Morgan Hunt are working with a large public sector organisation to recruit a Data Architect on a contract basis. The role is for someone who is experienced in implementing and configuring Databricks.
Candidates must have active SC Clearance (SC clearance can take 3 months to obtain).
You will guide high-performing teams to design and implement scalable data solutions leveraging Databricks and cloud-native tools. This role blends technical leadership with strategic oversight, shaping the future of data platforms for some of our most high-impact clients and guiding our teams in modern approaches and best practices.
Key Responsibilities
Lead and mentor cross-functional data engineering teams to deliver end-to-end Databricks solutions.
Own project delivery, stakeholder communication, and issue resolution across cloud data engagements.
Enforce best practices around data governance, observability, and CI/CD pipelines.
Drive innovation by evaluating emerging data technologies and promoting continuous learning.
Role
Flexible on rate
Inside of IR35
Glasgow based
Candidates must have active SC Clearance
£750 per day, Inside IR35
Contract initially until 31/03/26 (with a strong chance of extension)
Please get in touch for further information.Morgan Hunt is a multi-award-winning recruitment business for interim, contract and temporary recruitment and acts as an Employment Agency in relation to permanent vacancies. Morgan Hunt is an equal opportunities employer. Job suitability is assessed on merit in accordance with the individual’s skills, qualifications and abilities to perform the relevant duties required in a particular role.
GIS Data Scientist
Morgan Hunt Group Limited
Glasgow
In office
Mid - Senior
£650/day - £750/day
RECENTLY POSTED
r
python
sql
postgis
arcgis
Location: Glasgow
Employment Type: Contract, 2 months, strong chance of extension
About the Role
Morgan Hunt are working with a leading government organisation to recruit a GIS Data Scientist who can blend spatial analysis, advanced analytics, and problem-solving to turn geospatial data into actionable insights. You’ll work with large, complex datasets, build predictive models, and support data-driven decisions across the organisation. If you love maps, patterns, and answering real-world questions with data, this role has your name all over it.
Key Responsibilities
Acquire, clean, and manage geospatial datasets from diverse sources.
Perform spatial analysis, spatial statistics, and geoprocessing to support strategic and operational projects.
Develop predictive models and machine-learning workflows using spatial and non-spatial data.
Build and maintain spatial databases, data pipelines, and automated ETL processes.
Create high-quality maps, dashboards, and visualisations for both technical and non-technical stakeholders.
Collaborate with cross-functional teams to define requirements and deliver geospatial insights.
Implement QA/QC best practices to ensure accuracy, reproducibility, and data governance.
Stay current with emerging geospatial technologies, standards, and research.
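To give a flavour of the spatial analysis described above, here is a minimal sketch in plain Python (deliberately avoiding GIS libraries, so nothing here reflects the organisation's actual toolchain): the haversine formula for great-circle distance, one of the simplest building blocks of spatial statistics. The coordinates used are illustrative.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Straight-line distance between Glasgow and Edinburgh city centres, about 67 km
dist = haversine_km(55.8617, -4.2583, 55.9533, -3.1883)
print(f"{dist:.0f} km")
```

In practice this kind of work would lean on GeoPandas, Shapely or PostGIS (`ST_Distance`), but the underlying maths is the same.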
Skills & Experience

Essential
Strong experience with GIS platforms (ArcGIS, QGIS) and geospatial libraries (e.g., GeoPandas, GDAL/OGR, Shapely, Rasterio).
Proficiency in Python and/or R for data science and automation.
Solid grounding in statistics, spatial analysis, and machine-learning methodologies.
Experience with spatial databases (PostGIS, BigQuery GIS, SQL Server Spatial).
Ability to communicate complex spatial insights clearly to diverse audiences.
Experience working with remote sensing and raster datasets.
Details
£650 - £750 per day
Inside IR35
2 months, strong chance of extension
Glasgow based
Morgan Hunt is a multi-award-winning recruitment business for interim, contract and temporary recruitment and acts as an Employment Agency in relation to permanent vacancies. Morgan Hunt is an equal opportunities employer. Job suitability is assessed on merit in accordance with the individual’s skills, qualifications and abilities to perform the relevant duties required in a particular role.
Data Architect
Ashurst
Glasgow
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED
fabric
azure-databricks
togaf
The Opportunity:

Working within the Enterprise Architecture Team, the purpose of this role is to develop the firm's data architecture, data models, processes and standards, to enable the effective use of data to deliver value.

The role will work closely with business stakeholders, IT, data governance, compliance, security and third-party teams to understand business needs and develop solutions in support of business objectives.

We are a global team of architects, with enterprise, solutions and domain-specific architects (cloud, integration, security, network, business etc.). If you want to join a maturing team where your voice can be heard, and where you can make a difference with solutions that have enterprise-wide impact, Ashurst EA is the place for you.

Key responsibilities of the role include:

We are seeking a talented Data Architect to join our team and contribute to the delivery of the firm's strategic objectives. The ideal candidate will have experience as a Data Architect and familiarity with Microsoft data technologies such as Azure Databricks, Fabric and Purview. You will collaborate with internal technical teams, third-party implementation partners and business stakeholders to define requirements, create architectural designs, and oversee the successful implementation of solutions.
Develop data architecture roadmaps in support of business objectives.
Develop conceptual, logical and physical data models to support the use of data across Ashurst.
Develop data standards, policies, processes, data structures and architectures for data modelling and design.
Simplify the existing data architecture, delivering reusable services and cost-saving opportunities in line with Ashurst policies and standards.
Evaluate and advise on data collection, analysis and integration technologies.
Aid efforts to improve business performance through enterprise information solutions and capabilities, such as master data management (MDM), metadata management, analytics, content management, and data integration.
This is a full-time, permanent role based in our Glasgow office with hybrid working. More information can be found in the job description attached to the role on our careers site.

About you:

The successful candidate will have:
Prior experience as a Data Architect.
Hands on experience of conceptual, logical, and physical data modelling.
Knowledge of Azure data services including Azure Databricks, Fabric and Purview for data governance.
Ability to communicate complex data concepts in an engaging manner to technical and non-technical audiences.
Understanding of data management issues.
Knowledge of Enterprise Architecture frameworks, TOGAF or similar.
What makes Ashurst a great place to work?

We offer you all the things you should expect from an international law firm, some of which include:
competitive remuneration with the flexibility to reward high performance;
flexible working;
corporate health plans;
a global professional development offering for all employees; and
an industry-leading programme that celebrates diversity and inclusion.
We are committed to delivering positive impacts to our communities through our Social Impact programme. We aim to recruit, retain and promote the best people from the widest possible talent pools. We are committed to offering a safe and welcoming environment for all employees to ensure they are supported to work at their best.

Beyond this, what sets Ashurst apart from others is our global strength, our drive to innovate and collaborate, and our commitment to excellence. It is these values that make Ashurst a unique place to work.
Data Engineer
Nine Twenty
Glasgow
Remote or hybrid
Mid - Senior
£60,000 - £70,000
fabric
aws
python
apache-spark
An established technology consultancy is looking to hire an experienced Data Engineer to work on large-scale, customer-facing data projects while also contributing to the development of internal data services. This role blends hands-on engineering with architecture design and technical advisory work, offering exposure to enterprise clients and modern cloud platforms.

You will play a key role in designing and delivering cloud-native data platforms, working closely with engineering teams, stakeholders, and customers from initial design through to production release. The role offers variety, autonomy, and the opportunity to work with leading-edge data technologies across Azure and AWS.

The role

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data platforms and pipelines. You will support and lead technical workshops, contribute to architecture decisions, and act as a trusted technical partner on complex data initiatives.

Key responsibilities include:
Designing and building scalable data platforms and ETL/ELT pipelines in Azure and AWS
Implementing serverless, batch, and streaming data architectures
Working hands-on with Spark, Python, Databricks, and SQL-based analytics platforms
Designing Lakehouse-style architectures and analytical data models
Feeding behavioural and analytical data back into production systems
Supporting architecture reviews, design sessions, and technical workshops
Collaborating with engineering, analytics, and commercial teams
Advising customers throughout the full project lifecycle
Contributing to internal data services, standards, and best practices

What we are looking for

Essential experience
Proven experience as a Data Engineer working with large-scale data platforms
Strong hands-on experience in either Azure or AWS, with working knowledge of the other
Azure experience with Lakehouse concepts, Data Factory, Synapse and/or Fabric
AWS experience with Redshift, Lambda, and SQL-based analytics services
Strong Python skills and experience using Apache Spark
Hands-on experience with Databricks
Experience designing and maintaining ETL/ELT pipelines
Solid understanding of data modelling techniques
Experience working in cross-functional teams on cloud-based data platforms
Ability to work with SDKs and APIs across cloud services
Strong communication skills and a customer-focused approach

Desirable experience
Data migrations and platform modernisation projects
Implementing machine learning models using Python
Consulting or customer-facing engineering roles
Feeding analytics insights back into operational systems

Certifications (beneficial but not required)
AWS Solutions Architect – Associate
Azure Solutions Architect – Associate
AWS Data Engineer – Associate
Azure Data Engineer – Associate

What's on offer
The opportunity to work on modern cloud and data projects using leading technologies
A collaborative engineering culture with highly skilled colleagues
Structured learning paths and access to training and certifications
Certification exam fees covered and certification-related bonuses
Competitive salary and comprehensive benefits package
A supportive and inclusive working environment with regular knowledge sharing and team events

This role would suit a Data Engineer who enjoys combining deep technical work with customer interaction and wants to continue developing their expertise across cloud and data platforms. If you would like to find out more, then please get in contact with Jack at (url removed).
Data Engineer
Head Resourcing
Glasgow
Hybrid
Mid
£45,000 - £55,000
fabric
unity-3d
terraform
git
graphql
python
+5
Mid-Level Data Engineer (Azure / Databricks)
NO VISA REQUIREMENTS
Location: Glasgow (3+ days)
Reports to: Head of IT

My client is undergoing a major transformation of their entire data landscape, migrating from legacy systems and manual reporting into a modern Azure + Databricks Lakehouse. They are building a secure, automated, enterprise-grade platform powered by Lakeflow Declarative Pipelines, Unity Catalog and Azure Data Factory. They are looking for a Mid-Level Data Engineer to help deliver high-quality pipelines and curated datasets used across Finance, Operations, Sales, Customer Care and Logistics.

What You'll Do

Lakehouse Engineering (Azure + Databricks)
Build and maintain scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark and Spark SQL.
Work within a Medallion architecture (Bronze → Silver → Gold) to deliver reliable, high-quality datasets.
Ingest data from multiple sources including ChargeBee, legacy operational files, SharePoint, SFTP, SQL, REST and GraphQL APIs using Azure Data Factory and metadata-driven patterns.
Apply data quality and validation rules using Lakeflow Declarative Pipelines expectations.
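The Medallion layering above can be sketched in miniature. This is an illustrative plain-Python outline only, not the client's Lakeflow/PySpark code, and the record fields (`order_id`, `amount`, `region`) are hypothetical: Bronze holds raw ingested records, Silver applies validation and typing, Gold aggregates for reporting.

```python
from collections import defaultdict

# Bronze: raw ingested records, kept as-is (hypothetical fields)
bronze = [
    {"order_id": "1", "amount": "120.50", "region": "Scotland"},
    {"order_id": "2", "amount": "bad-value", "region": "Scotland"},  # fails validation
    {"order_id": "3", "amount": "80.00", "region": "Wales"},
]

def to_silver(rows):
    """Silver: validated, typed records; invalid rows are quarantined."""
    clean, quarantined = [], []
    for row in rows:
        try:
            clean.append({**row, "amount": float(row["amount"])})
        except ValueError:
            quarantined.append(row)
    return clean, quarantined

def to_gold(rows):
    """Gold: business-level aggregate, e.g. revenue per region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver, bad = to_silver(bronze)
gold = to_gold(silver)
print(gold)      # {'Scotland': 120.5, 'Wales': 80.0}
print(len(bad))  # 1
```

In the real platform the quarantine step would be expressed as Lakeflow Declarative Pipelines expectations and the aggregation as Spark SQL over Delta tables, but the layering idea is the same.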
Curated Layers & Data Modelling
Develop clean and conforming Silver & Gold layers aligned to enterprise subject areas.
Contribute to dimensional modelling (star schemas), harmonisation logic, SCDs and business marts powering Power BI datasets.
Apply governance, lineage and permissioning through Unity Catalog.
Orchestration & Observability
Use Lakeflow Workflows and ADF to orchestrate and optimise ingestion, transformation and scheduled jobs.
Help implement monitoring, alerting, SLAs/SLIs and runbooks to support production reliability.
Assist in performance tuning and cost optimisation.
DevOps & Platform Engineering
Contribute to CI/CD pipelines in Azure DevOps to automate deployment of notebooks, Lakeflow Declarative Pipelines, SQL models and ADF assets.
Support secure deployment patterns using private endpoints, managed identities and Key Vault.
Participate in code reviews and help improve engineering practices.
Collaboration & Delivery
Work with BI and Analytics teams to deliver curated datasets that power dashboards across the business.
Contribute to architectural discussions and the ongoing data platform roadmap.
Tech You’ll Use
Databricks: Lakeflow Declarative Pipelines, Lakeflow Workflows, Unity Catalog, Delta Lake
Azure: ADLS Gen2, Data Factory, Event Hubs (optional), Key Vault, private endpoints
Languages: PySpark, Spark SQL, Python, Git
DevOps: Azure DevOps Repos & Pipelines, CI/CD
Analytics: Power BI, Fabric
What We’re Looking ForExperience
Commercial and proven data engineering experience.
Hands-on experience delivering solutions on Azure + Databricks.
Strong PySpark and Spark SQL skills within distributed compute environments.
Experience working in a Lakehouse/Medallion architecture with Delta Lake.
Understanding of dimensional modelling (Kimball), including SCD Type 1/2.
Exposure to operational concepts such as monitoring, retries, idempotency and backfills.
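For readers unfamiliar with the SCD Type 2 pattern listed above: when a tracked attribute changes, the current dimension row is expired and a new row is inserted, preserving history. A minimal plain-Python sketch follows; the field names are hypothetical, and in Databricks this would typically be a Delta `MERGE INTO` rather than in-memory dicts.

```python
def scd2_apply(dimension, key, new_attrs, as_of):
    """Apply an SCD Type 2 update: expire the current row, insert a new one."""
    for row in dimension:
        if row["key"] == key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return dimension  # no change: idempotent, nothing to do
            row["is_current"] = False
            row["valid_to"] = as_of
    dimension.append({
        "key": key, "attrs": new_attrs,
        "valid_from": as_of, "valid_to": None, "is_current": True,
    })
    return dimension

# One customer row, then a change of city: history is preserved
dim = [{"key": "C1", "attrs": {"city": "Glasgow"},
        "valid_from": "2024-01-01", "valid_to": None, "is_current": True}]
scd2_apply(dim, "C1", {"city": "Edinburgh"}, "2025-06-01")
current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["attrs"]["city"])  # 2 Edinburgh
```

SCD Type 1 would simply overwrite `attrs` in place, which is why Type 2 is the one that needs `valid_from`/`valid_to` bookkeeping.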
Mindset
Keen to grow within a modern Azure Data Platform environment.
Comfortable with Git, CI/CD and modern engineering workflows.
Able to communicate technical concepts clearly to non-technical stakeholders.
Quality-driven, collaborative and proactive.
Nice to Have
Databricks Certified Data Engineer Associate.
Experience with streaming ingestion (Auto Loader, event streams, watermarking).
Subscription/entitlement modelling (e.g., ChargeBee).
Unity Catalog advanced security (RLS, PII governance).
Terraform or Bicep for IaC.
Fabric Semantic Models or Direct Lake optimisation experience.
Why Join?
Opportunity to shape and build a modern enterprise Lakehouse platform.
Hands-on work with Azure, Databricks and leading-edge engineering practices.
Real progression opportunities within a growing data function.
Direct impact across multiple business domains.

Frequently asked questions

What types of Data Engineer jobs are available in Glasgow?
Glasgow offers a wide range of Data Engineer positions, including roles focusing on data pipeline development, ETL processes, big data analytics, cloud data solutions, and data warehousing across various industries such as finance, healthcare, and technology.
What skills do I need to qualify for a Data Engineer job in Glasgow?
Key skills for Data Engineer roles in Glasgow typically include proficiency in SQL, Python, or Scala, experience with big data technologies like Hadoop or Spark, knowledge of cloud platforms such as AWS or Azure, and expertise in data modelling and ETL tools.
Are remote Data Engineer jobs available in Glasgow?
Yes, many employers in Glasgow offer remote or hybrid working options for Data Engineers, allowing flexibility depending on the company's policies and the specific role requirements.
How can I apply for Data Engineer positions through Haystack?
You can search for Data Engineer jobs in Glasgow on the Haystack job board, create a profile, upload your CV, and apply directly through our platform. You can also set up job alerts to receive notifications for new opportunities matching your criteria.
What is the average salary for Data Engineers in Glasgow?
The average salary for Data Engineers in Glasgow typically ranges from £40,000 to £65,000 per year, depending on experience, skills, and the specific employer. Senior roles or those with specialized expertise may offer higher compensation.