Make yourself visible and let companies apply to you.
Roles

Contract Data Engineer Jobs

Overview

Looking for contract Data Engineer jobs? Explore top freelance and contract opportunities for Data Engineers on Haystack. Find flexible, high-paying gigs where you can design, build, and optimize data pipelines using the latest tools and technologies. Start your contract Data Engineer career today with Haystack’s curated listings!
Filters applied
Data Engineer
Contract
Search
Salary
Location
Remote preference
Role type
Seniority
Tech stack
Sectors
Contract type
Company size
Visa sponsorship
Lead Data Engineer
TALENT INTERNATIONAL UK LTD
Liverpool
Hybrid
Senior
£500/day - £550/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Job Description: Lead Data Engineer. Central Government. Inside IR35. Hybrid - Manchester. £550 per day. Duration: 12 months.
Our Central Government client is looking to bring in experienced Data Engineers with extensive Power BI and full life cycle experience in Agile Digital (DDaT) environments.
You will be responsible for developing accurate, efficient data solutions which meet the needs of our client's Live Service team and customers, to agreed timescales. You will ensure the stability, robustness and resilience of the products you design and build, and be able to effect changes to those products where necessary. You will support continuous improvement of standards and provide leadership to develop Associate Data Engineers, providing technical guidance alongside other data engineering functions for customers.
At this role level, you will:
inspire best practice for data products and services within your teams
build data engineering capability by providing technical leadership and career development for the community
work with other senior team members to identify, plan, develop and deliver data services
Experience of Dataverse and Power Platform. Experience with data strategy and implementation, working alongside data architects. Experience in the public/government sector is essential for this role, as well as an understanding of GDS principles and DDaT environments. You will need to be eligible for SC clearance.
£500.00 - £550.00 / day
Talent International UK and its subsidiaries, Digital Gurus, Infinite Talent and Rethink act as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this opportunity, you accept the T&Cs, Privacy Policy and Disclaimers which can be found at talentinternational.co.uk
Senior Data Engineer - Contract - Dublin
Adecco
Not Specified
In office
Senior
£517/day - £604/day
RECENTLY POSTED
processing-js
python
csharp
xml
Senior Data Engineer Contract, Dublin. 6-24 months. My client, a leading name in their respected industry, is in urgent need of a talented and experienced Senior Data Engineer to join them on a contract basis.
You will design and implement scalable and efficient data pipelines, ETL processes, and data integration solutions to collect, process, and store large volumes of data. You will collaborate with others on the development of data models, schema designs, and data architecture frameworks to support diverse analytical and reporting needs. You will build and optimise data processing workflows using distributed computing frameworks available on Azure, our preferred cloud provider, and integrate data from various internal and external sources, including databases, APIs, and streaming platforms, into centralised data repositories and data warehouses.
Successful candidates will have 8 years of commercial experience working as an analyst developer or data engineer in a data-centric environment, with proven experience in designing and implementing end-to-end data solutions from ingestion to consumption. You will have strong experience with Azure data PaaS services and data pipeline delivery on the Azure platform with Databricks, experience delivering data platforms with C#, Python, JSON, XML, APIs, and message bus technology (or similar technologies), and strong knowledge of database systems, data modelling, and data integration technologies.
This is a unique opportunity to work with a team that is at the beginning phase of the projects. If this sounds of interest, drop me a CV so we can speak in more detail.
Data Engineer
InfinityQuest Ltd
Glasgow
Hybrid
Mid - Senior
Private salary
RECENTLY POSTED
aws
python
sql
Key Responsibilities
Develop and optimize data pipelines for ingestion, transformation, and storage.
Ensure data quality, integrity, and security across systems.
Collaborate with Data Scientists and Analysts to enable advanced analytics.
Implement best practices for scalability and performance in cloud environments.
Support integration of MRO AI Solutions into Client operational workflows.
Design data architectures and pipelines that support multi-OpCo deployment, ensuring modularity and interoperability.
Required Skills & Experience
Expertise in Python, SQL, and modern ETL frameworks.
Hands-on experience with cloud platforms (AWS preferred).
Strong knowledge of data modeling and API integration.
Proven experience in developing, testing, and deploying data solutions into production environments, ensuring reliability, scalability, and maintainability beyond proof-of-concept or prototype stages.
Familiarity with airline or logistics data domains is a plus.
Significant experience in similar roles, with a proven ability to integrate quickly into new teams and deliver immediate value.
Initial co-location with Client teams in London is essential to ensure close collaboration. Candidates must also be prepared to travel internationally during later stages to facilitate group-wide deployment.
Preferred Consulting-Level Competencies
Ability to design enterprise-grade data solutions under tight timelines.
Strong stakeholder engagement and solution-oriented mindset.
Experience in advisory or consulting roles for data engineering projects.
Track record of creating high-impact outcomes and driving stakeholder satisfaction from day one.
Ability to implement standards and frameworks for scalable data solutions across multiple operating companies.
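As a minimal sketch of the data-quality checks listed above (not the client's actual framework), assuming records arrive as plain dicts; the field names ("flight_id", "hours") are hypothetical:

```python
# Minimal data-quality gate for a pipeline: not-null, uniqueness, and
# range checks over plain-dict records. Field names are invented.

def check_quality(records):
    """Return a list of human-readable violations found in `records`."""
    violations = []
    seen_ids = set()
    for i, row in enumerate(records):
        if row.get("flight_id") is None:
            violations.append(f"row {i}: missing flight_id")            # not-null
        elif row["flight_id"] in seen_ids:
            violations.append(f"row {i}: duplicate flight_id {row['flight_id']}")
        else:
            seen_ids.add(row["flight_id"])
        hours = row.get("hours")
        if hours is not None and hours < 0:
            violations.append(f"row {i}: negative hours {hours}")       # range
    return violations

rows = [
    {"flight_id": "BA123", "hours": 2.5},
    {"flight_id": "BA123", "hours": 1.0},   # duplicate key
    {"flight_id": None, "hours": -3},       # two violations at once
]
print(check_quality(rows))
```

In practice the same checks would run as automated tests against each load before data is promoted downstream.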
Data Scientist
InfinityQuest Ltd
Glasgow
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED
processing-js
aws
tensorflow
kubernetes
python
docker
+4
Data Scientist: We are seeking a highly skilled Data Scientist (AI) to design, develop, and deploy advanced machine learning and artificial intelligence solutions. The ideal candidate will work on large datasets, build predictive models, and collaborate cross-functionally to deliver scalable, data-driven products.
Key Responsibilities
Design, develop, and optimize machine learning and deep learning models.
Work on AI/ML projects including NLP, computer vision, recommendation systems, and generative AI.
Perform data cleaning, feature engineering, and exploratory data analysis (EDA).
Build and manage data pipelines and model training workflows.
Deploy models into production and monitor performance.
Collaborate with Product, Engineering, and Business teams to translate business problems into AI solutions.
Conduct model evaluation, A/B testing, and performance tuning.
Document models, experiments, and technical processes.
Required Skills & Qualifications
Classic machine learning (regression, predictive analysis, classification, clustering).
Machine learning model optimisation.
Strong proficiency in Python (NumPy, Pandas, Scikit-learn).
Hands-on experience with deep learning frameworks: TensorFlow, PyTorch, or Keras.
Experience in Natural Language Processing (NLP) and/or Computer Vision.
Strong knowledge of machine learning algorithms and statistics.
Experience with SQL/NoSQL databases and big data tools (Spark; Hadoop preferred).
Experience with MLOps tools such as Docker, Kubernetes, and CI/CD pipelines.
Preferred Skills
Experience with LLMs / generative AI (OpenAI, Hugging Face, LangChain).
Cloud experience (AWS, Azure, or GCP).
Experience building AI APIs and microservices.
Education
Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field (PhD preferred for advanced research roles).
Soft Skills
Strong problem-solving and analytical thinking.
Excellent communication and storytelling skills.
Ability to work in fast-paced, cross-functional teams.
Lead Data Engineer
Experis
London
Hybrid
Senior
Private salary
RECENTLY POSTED
aws
python
sql
snowflake
Job Title: Lead Data Engineer. Location: London (Hybrid). Contract: 6 months (potential extension). Start date: ASAP.
About the Client: Our client is transforming their industry by replacing cigarettes with innovative, smoke-free alternatives. They are leveraging technology, data, and AI to drive a global shift toward a smoke-free world. This is a fast-paced, high-impact environment, perfect for candidates who are strategic, independent, and excited to work at the forefront of data and AI innovation.
The Role: We are looking for a skilled Data Engineer to design, build, and optimize enterprise-scale data pipelines and cloud platforms. You will translate business and AI/ML requirements into robust, scalable solutions while collaborating across multi-disciplinary teams and external vendors. As a key member of the data architecture team, you will:
Build and orchestrate data pipelines across Snowflake and AWS environments.
Apply data modeling, warehousing, and architecture principles (Kimball/Inmon).
Develop pipeline programming using Python, Spark, and SQL; integrate APIs for seamless workflows.
Support Machine Learning and AI initiatives, including NLP, Computer Vision, Time Series, and LLMs.
Implement MLOps, CI/CD pipelines, data testing, and quality frameworks.
Act as an AI super-user, applying prompt engineering and creating AI artifacts.
Work independently while providing clear justification for technical decisions.
Key Skills & Experience
Strong experience in data pipeline development and orchestration.
Proficient with cloud platforms (Snowflake, AWS fundamentals).
Solid understanding of data architecture, warehousing, and modeling.
Programming expertise: Python, Spark, SQL, API integration.
Knowledge of ML/AI frameworks, MLOps, and advanced analytics concepts.
Experience with CI/CD, data testing frameworks, and versioning strategies.
Ability to work effectively in multi-team, vendor-integrated environments.
Why This Role
Join a global, transformative initiative shaping a smoke-free future.
Work with cutting-edge cloud, AI, and data technologies.
Opportunity to influence technical and strategic decisions across enterprise data delivery.
Dynamic, innovative environment where your work has real business impact.
Postgres Data Architect
Stackstudio Digital Ltd.
London
Hybrid
Mid - Senior
£500/day - £550/day
RECENTLY POSTED
aws
terraform
kafka
postgresql
gitlab
db2
Job Details
Role / Job Title: Postgres Data Architect with CDC skills
Work Location: 250 Bishopsgate, London, UK
Office Presence (Hybrid): 2 days per week
The Role: PostgreSQL Data Architect with strong hands-on experience in Change Data Capture (CDC). The candidate will design and implement robust data migration strategies, ensuring seamless integration between legacy systems and modern cloud-based architectures.
Responsibilities
Architect CDC Pipelines: Design and optimize Change Data Capture workflows (IBM CDC or equivalent), including subscription design, bookmarks, resync, and replay strategies
Cloud Migration & Hosting: Lead PostgreSQL migration from on-premises/mainframe to cloud platforms (AWS Aurora preferred), ensuring performance, security, and scalability
Integration & ETL Pipelines: Build robust pipelines flowing CDC to Kafka/S3 to Aurora with UPSERT/MERGE patterns; guarantee idempotency, ordering, and reliable delivery
Data Encoding & Validation: Manage EBCDIC to UTF-8 conversions and packed-decimal/binary numerics, and validate transformations with automated test suites
Cutover & Governance: Execute dual-run validations, reconciliation (counts, checksums), rollback strategies, and ensure compliance with masking, encryption, and IAM policies
Performance & Observability: Monitor lag, throughput, and error rates; develop dashboards (CloudWatch/Grafana) and operational runbooks for proactive alerting
Automation & Tooling: Utilize schema conversion tools, IaC (Terraform), CI/CD pipelines (GitLab), and AWS services (Glue, Athena, Redshift) for downstream analytics
Data Modelling & Conversion (Good to have): Transform Db2 schemas to Aurora PostgreSQL; design logical/physical models, enforce referential integrity, and apply best practices for normalization/denormalization
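As a sketch of the idempotency and ordering guarantees the CDC responsibilities above call for, here is a minimal UPSERT with an ordering guard, using sqlite3 as a stand-in for Aurora PostgreSQL (the ON CONFLICT syntax is close, though not identical); the table and event shapes are invented:

```python
# Idempotent CDC apply: UPSERT keyed on the primary key, with a source-offset
# guard so late or replayed events never overwrite newer state.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL, src_offset INTEGER)"
)

def apply_cdc(con, events):
    """Apply (offset, id, balance) change events; replays are harmless."""
    for offset, acc_id, balance in events:
        con.execute(
            """INSERT INTO accounts (id, balance, src_offset)
               VALUES (?, ?, ?)
               ON CONFLICT(id) DO UPDATE SET
                   balance = excluded.balance,
                   src_offset = excluded.src_offset
               WHERE excluded.src_offset > accounts.src_offset""",  # ordering guard
            (acc_id, balance, offset),
        )
    con.commit()

events = [(1, 101, 50.0), (2, 101, 75.0)]
apply_cdc(con, events)
apply_cdc(con, events)   # replay the same batch: same end state (idempotent)
print(con.execute("SELECT balance FROM accounts WHERE id = 101").fetchone()[0])
```

The guard is what makes replay from a CDC bookmark safe: reprocessing a batch after a resync cannot move a row backwards.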
Your ProfileEssential Skills / Knowledge / Experience
Strong hands-on experience with PostgreSQL (Aurora preferred) and advanced data modelling
Expertise in CDC tools (IBM CDC or similar) and data migration strategies
Proven experience in PostgreSQL cloud hosting and migration
Proficiency in ETL/ELT pipelines, Kafka, and AWS ecosystem
Solid understanding of data encoding, transformation, and validation techniques
Familiarity with IaC, CI/CD, and observability frameworks
Excellent problem-solving and communication skills
Desirable Skills / Knowledge / Experience
Mainframe Modernization
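The encoding work this role mentions (EBCDIC to UTF-8, packed decimal) can be sketched with only the standard library; the sample bytes and field layout here are fabricated for illustration:

```python
# EBCDIC text decodes via the stdlib cp037 codec; COMP-3 packed decimal
# is unpacked by hand (two digits per byte, sign in the final nibble).

def decode_packed(data: bytes, scale: int = 0):
    """Decode an IBM packed-decimal field; 0xD in the last nibble = negative."""
    nibbles = [n for b in data for n in ((b >> 4) & 0xF, b & 0xF)]
    sign = nibbles.pop()                 # last nibble carries the sign
    value = 0
    for d in nibbles:
        value = value * 10 + d
    if sign == 0xD:
        value = -value
    return value / 10 ** scale if scale else value

text = b"\xc8\x85\x93\x93\x96".decode("cp037")    # EBCDIC bytes for "Hello"
amount = decode_packed(b"\x12\x34\x5c", scale=2)  # 5 digits + positive sign
print(text, amount)
```

An automated test suite for a migration would assert exactly this kind of round-trip on known field values before and after conversion.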
Data Architect Senior
Stackstudio Digital Ltd.
Glasgow
In office
Senior
£550/day - £600/day
RECENTLY POSTED
react
aws
javascript
python
elasticsearch
pandas
+3
Role Details
Role / Job Title: Data Architect Senior
Job Type: Contracting
Location: Glasgow (5 days onsite)
Role Overview: We are seeking an experienced Senior Data Architect to join our Market Data Services team, focusing on our high-impact 3rd Party Data Project. This is a contract role for an initial 6-month period, based in our Glasgow office, with full-time onsite presence required.
As a key member of the team, you will play a pivotal role in modelling currently available data, defining and structuring new data catalogues of marketplace data products, and validating existing contracts and data usage rights. Your expertise in RDF data and data architecture will be essential as you review and enhance our data management practices, develop new data architecture policies, and strategically structure data flows for the project.
You will be working within an Agile project environment, participating in two-week sprint cycles and collaborating closely with cross-functional teams to deliver high-quality solutions. The engineering team works predominantly in Python, with Pandas DataFrames and SPARQL for backend APIs, and JavaScript with React for the front end; an understanding of these technologies is beneficial.
Key Responsibilities
Model and structure currently available marketplace data and define new data catalogs.
Validate existing data contracts and usage rights, ensuring compliance and optimal utilization.
Review current data management practices and develop robust, future-proof data architecture policies.
Design and implement strategic data flows for the 3rd Party Data Project.
Work extensively with RDF data, leveraging SPARQL for graph queries and data models.
Utilize SQL for Oracle databases and No-SQL DSL for ElasticSearch to manage and query data.
Collaborate with stakeholders to ensure data solutions align with business and regulatory requirements.
Actively participate in Agile ceremonies and two-week sprint cycles.
Required Skills & Experience
Proven experience in data architecture and data modeling within professional services or contract environments.
Strong hands-on expertise with RDF data and SPARQL graph query languages.
Deep familiarity with W3C standards for data modeling and interoperability.
Experience with AWS Neptune, AWS Glue, ElasticSearch, and Oracle database products.
Proficient in SQL for Oracle and No-SQL DSL for ElasticSearch.
Demonstrated ability to validate data contracts and usage rights.
Experience working in Agile teams.
Track record of developing and implementing data management policies and best practices.
Excellent communication and stakeholder management skills.
Managing expectations of non-technical stakeholders.
Good to Have (As Applicable)
Familiar with AWS via hands on experience or certification
Familiarity with orchestration tools like Airflow
Familiarity with BASEL regulatory reporting framework
Familiarity with engineering in a regulatory controlled environment
Azure Data Engineer
Opus Recruitment Solutions
Bristol
Hybrid
Mid - Senior
£350/day - £450/day
RECENTLY POSTED
azure-databricks
delta-lake
sql
Azure Data Engineer | £(Apply online only) per day | Bristol | Hybrid | 6-Month Initial Term
A large-scale data transformation programme is underway, and our client is looking for an experienced Azure Data Engineer to support the rebuild of their cloud data platform. This role is hands-on and delivery-focused: you'll be designing and developing Azure-native data pipelines, working extensively with Databricks, and shaping scalable data models across the Microsoft ecosystem. The role requires you to be on site in Bristol 3 days per week.
What you'll be doing
Build, enhance and maintain data pipelines using Azure Databricks, Data Factory, and Delta Lake
Develop and optimise Lakehouse components and cloud-based data flows
Create robust data models to support analytics, MI and downstream reporting
Assist in migrating legacy warehouse assets into a modern Azure environment
Contribute to cloud architecture decisions, data standards and best-practice engineering patterns
What you'll bring
Strong hands-on experience across Azure Data Services (ADF, ADLS, Synapse, Databricks)
Excellent SQL skills, with experience in performance tuning and optimisation
Solid understanding of data modelling (star schema, medallion, ETL frameworks)
Ability to work with complex, inconsistent or legacy data sources
Experience building scalable, production-ready pipelines in a cloud environment
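As a toy illustration of the medallion layering this role mentions (bronze raw, silver cleaned, gold aggregated), in plain Python rather than Databricks/Delta Lake; the order records are fabricated:

```python
# Medallion-style flow in miniature: bronze holds data as ingested,
# silver is deduplicated/typed/validated, gold is aggregated for reporting.
from collections import defaultdict

bronze = [  # raw, as-ingested (duplicates and bad rows included)
    {"order_id": 1, "region": "UK", "amount": "100"},
    {"order_id": 1, "region": "UK", "amount": "100"},   # duplicate
    {"order_id": 2, "region": "UK", "amount": "250"},
    {"order_id": 3, "region": None, "amount": "75"},    # missing region
]

# Silver: deduplicate on the business key, drop invalid rows, fix types.
seen, silver = set(), []
for row in bronze:
    if row["region"] is None or row["order_id"] in seen:
        continue
    seen.add(row["order_id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: aggregate for analytics/MI consumers.
gold = defaultdict(float)
for row in silver:
    gold[row["region"]] += row["amount"]
print(dict(gold))
```

On the platform itself each layer would be a Delta Lake table and the steps would be Databricks jobs, but the layer contract is the same.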
Data Engineer
Hays Technology
York
Hybrid
Mid - Senior
£500/day - £600/day
RECENTLY POSTED
sql
snowflake
Our client, a leading organisation within the public sector, is seeking an experienced Data Engineer to join their Digital & Data function. The role is focused on transforming and integrating data across the institution, supporting strategic reporting, analytics, and operational needs. With major ongoing development of their Snowflake-based data platform, this is an excellent opportunity to take ownership of high-impact data engineering work in a complex, data-rich environment.
What You Will Be Doing
Designing, developing, and maintaining data pipelines to ingest, clean, transform, and deliver high-quality datasets into Snowflake.
Applying strong SQL skills to build transformation logic, optimise performance, and ensure efficient query execution.
Supporting data integration across core institutional systems, including potential exposure to Workday data imports.
Collaborating with analysts, stakeholders, and cross-functional teams to ensure data is accurate, timely, and fit for purpose.
Participating in troubleshooting, performance tuning, monitoring, and continuous improvement of data workflows.
Ensuring best practices in data modelling, documentation, governance, and platform optimisation.
What You Will Need
Strong, demonstrable expertise in SQL and hands-on experience transforming data.
Experience delivering data pipelines and ETL/ELT processes into Snowflake (or similar cloud data warehousing platforms).
Understanding of modern data engineering concepts: modelling, integration, quality, monitoring, optimisation.
Experience working with large datasets and multiple data sources.
Ability to work collaboratively, communicate effectively, and engage confidently with stakeholders.
Workday inbound data experience is beneficial but not essential.
What you will get in return
This is a great opportunity to work for a highly esteemed organisation on a contract basis, paying around £550.00 per day Inside IR35.
Some travel to site in York will be required, with flexibility.
What you need to do now: If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed)
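One common shape for the Snowflake transformation logic this role describes is "latest record per key" deduplication with ROW_NUMBER(); the sketch below uses sqlite3 as a stand-in (window functions require SQLite 3.25+) with invented table and column names:

```python
# Keep only the most recent staging record per business key, the usual
# pattern before loading a dimension table in Snowflake-style SQL.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE staging (student_id INT, status TEXT, loaded_at TEXT);
INSERT INTO staging VALUES
  (1, 'applied',  '2024-01-01'),
  (1, 'enrolled', '2024-02-01'),
  (2, 'applied',  '2024-01-15');
""")

rows = con.execute("""
    SELECT student_id, status FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY student_id ORDER BY loaded_at DESC
        ) AS rn
        FROM staging
    ) WHERE rn = 1
    ORDER BY student_id
""").fetchall()
print(rows)
```

The same query runs nearly unchanged on Snowflake, where it typically feeds a MERGE into the target table.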
Data Engineer
Coventry Building Society
Coventry
Hybrid
Mid
£50,000/year
RECENTLY POSTED
python
sql
pyspark
About the role
Working in our Data and Analytics Delivery department, the Data Engineer will join the group on a 12-month fixed-term contract to focus on the migration and integration of data into our new ecosystem.
The Data Engineer will design, develop and test quality data engineering solutions and will look to challenge and improve our processes, tools and approach. The person in post will undertake review and assurance activity, providing other team members with guidance on design, build and test activity.
Adhering to standard-driven code development, the Data Engineer will deliver solutions that meet business needs in a timely manner and will take responsibility for the testing of their solution, including the analysis of requirements, design of test cases and scripts, preparation of test data, and creation and execution of tests to ensure effective and accurate deliverables.
We operate on a team-led hybrid approach with at least 1 day a week in the Coventry or Manchester office.
Our benefits include:
28 days holiday a year plus bank holidays and a holiday buy/sell scheme
Annual discretionary bonus scheme
Personal pension with matched contributions
Life assurance (6 times annual salary)
We reserve the right to close this advert early if we receive a high volume of suitable applications.
About you
You'll either have a Data Engineering related qualification and/or extensive data development experience in a commercial or Agile environment. To be successful in this role, it's essential that you will:
Have experience of AWS, Python, SQL, Git and PySpark
Desirable experience needed will be:
SSIS or SAS experience
Quality Assurance and Test Automation experience
Experience of Database technologies
Experience in Financial Services organisation
About us
In 2025, Coventry Building Society purchased The Co-operative Bank. Bringing together our purpose-led building society with the UK's original ethical bank was the start of an exciting journey. Trusted by over four million people, we're a mutually owned business free from shareholders, and with our combined experience of almost 300 years, our ethics and dedication will continue to guide us. Together, we have shared values and an ethical approach towards our members, customers and colleagues. We're officially recognised as a Great Place to Work and our benefits go beyond basic pay, with a discretionary bonus scheme, a culture of reward and recognition and comprehensive support for wellbeing. We're serious about equality, of race, age, faith, disability, and sexual orientation, and we celebrate diversity. By working together, we know you'll build more than just a career with us.
Flexibility and why it matters
We understand the need for flexibility, so wherever possible, we'll consider alternative working patterns. Have a chat with us before you apply to see what the possibilities are for this role.
Proud to be a Disability Confident Committed Employer
We're proud to offer an interview or assessment to every disabled applicant who meets the minimum criteria for our vacancies. As part of the application process, disabled applicants can opt in for the Disability Confident Interview Scheme. If there are ever occasions where it is not practicable to interview all candidates that meet the essential criteria, such as when we receive a high number of applications, we commit to interviewing disabled candidates who best meet the minimum essential and desirable criteria.
Power BI Specialist
Gleeson Recruitment Group
Birmingham
Remote or hybrid
Mid - Senior
Private salary
RECENTLY POSTED
fabric
Power BI Specialist - Contract - OUTSIDE IR35
Remote role - office based in Birmingham
We're looking for a Power BI Specialist to turn sales data into real-time, high-impact dashboards that drive decision-making. You'll build and maintain live Power BI dashboards, working from a newly implemented Microsoft Fabric data warehouse, with scope to shape and enhance the data model as it evolves. You'll also liaise with third-party partners to ensure smooth integration and delivery.
What you'll do:
Build real-time Power BI dashboards from sales data
Develop and enhance a new Fabric-based data warehouse
Work with stakeholders and third-party providers
Turn complex data into clear, actionable insights
What we’re looking for:
Strong Power BI experience
Solid understanding of data warehousing (Fabric experience a big plus)
Confident communicator who can work with internal teams and external partners
If you like fast-moving projects, clean data models, and dashboards that actually get used, this could be for you! Please apply asap if interested. GleeIT
At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a disability confident employer. By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.
Lead Data Engineer
TALENT INTERNATIONAL UK LTD
UK
Hybrid
Senior
£500/day - £550/day
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Job Description: Lead Data Engineer. Central Government. Inside IR35. Hybrid - Manchester. £550 per day. Duration: 12 months.
Our Central Government client is looking to bring in experienced Data Engineers with extensive Power BI and full life cycle experience in Agile Digital (DDaT) environments.
You will be responsible for developing accurate, efficient data solutions which meet the needs of our client's Live Service team and customers, to agreed timescales. You will ensure the stability, robustness and resilience of the products you design and build, and be able to effect changes to those products where necessary. You will support continuous improvement of standards and provide leadership to develop Associate Data Engineers, providing technical guidance alongside other data engineering functions for customers.
At this role level, you will:
inspire best practice for data products and services within your teams
build data engineering capability by providing technical leadership and career development for the community
work with other senior team members to identify, plan, develop and deliver data services
Experience of Dataverse and Power Platform. Experience with data strategy and implementation, working alongside data architects. Experience in the public/government sector is essential for this role, as well as an understanding of GDS principles and DDaT environments. You will need to be eligible for SC clearance.
£500.00 - £550.00 / day
Talent International UK and its subsidiaries, Digital Gurus, Infinite Talent and Rethink act as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this opportunity, you accept the T&Cs, Privacy Policy and Disclaimers which can be found at talentinternational.co.uk
Snowflake Data Architect - £550 Inside IR35- Hybrid
Tenth Revolution Group
Warwick
Hybrid
Mid - Senior
£400/day - £550/day
RECENTLY POSTED
snowflake
processing-js
fabric
prometheus
python
microsoft-azure
+1
Snowflake Data Architect - £550 Inside IR35 - Hybrid
We are seeking an experienced Data Architect to design, build, and maintain scalable, secure, and high-performing data platforms. The ideal candidate will have strong expertise in Azure-based data solutions, Snowflake, and modern data engineering tools, and will play a key role in shaping our enterprise data architecture to support analytics, reporting, and advanced data use cases.
Key Responsibilities
Design and implement end-to-end data architectures using Azure cloud services
Architect and optimize data solutions on Snowflake for performance, scalability, and cost efficiency
Build and maintain data pipelines using Azure Data Factory (ADF)
Develop and manage transformation workflows using DBT
Design and support ETL/ELT processes for structured and semi-structured data
Develop data engineering solutions using Python for data processing, automation, and orchestration
Implement monitoring and observability for data systems using Prometheus
Define data models, schemas, and standards to ensure data consistency and quality
Collaborate with data engineers, analysts, and business stakeholders to translate requirements into technical solutions
Ensure data security, governance, and compliance with organizational and regulatory standards
Troubleshoot and optimize data pipelines and architectures for reliability and performance
Required Qualifications
Proven experience as a Data Architect or Senior Data Engineer
Strong hands-on experience with Microsoft Azure data services
Extensive experience with Snowflake data warehousing
Proficiency in Azure Data Factory (ADF) for data orchestration
Strong Python programming skills
Hands-on experience with DBT for data transformation and modelling
Solid understanding of ETL/ELT architecture and best practices
Experience with monitoring and observability tools such as Prometheus
Strong knowledge of data modelling, data warehousing concepts, and cloud architecture
Excellent problem-solving and communication skills
Preferred Qualifications
Experience with CI/CD for data pipelines
Familiarity with infrastructure-as-code tools
Experience working in Agile or DevOps environments
Knowledge of data governance, metadata management, and data quality frameworks
To apply for this role please submit your CV or contact Dillon Blackburn (see below). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Viva Consultant - £500PD Inside IR35 - Hybrid
Tenth Revolution Group
London
Hybrid
Mid - Senior
£400/day - £500/day
RECENTLY POSTED
fabric
dax
Viva Consultant - £500PD Inside IR35 - Remote
We're looking for a data professional who can define, design, and deliver a comprehensive analytics strategy for Microsoft Viva. This role goes beyond building reports from a predefined brief: you'll take ownership of Viva analytics end-to-end, partnering with stakeholders to shape the right questions, define best-practice metrics, and translate data into meaningful insight and action. You'll act as a trusted advisor on how Viva data should be used to measure adoption, engagement, and business impact, while also rolling up your sleeves to implement the technical solution.
Key Responsibilities
Strategy & Stakeholder Leadership
Define and own the analytics strategy for Microsoft Viva, aligning usage data to business outcomes and employee experience goals.
Partner with senior stakeholders across HR, IT, Communications, and the business to understand needs, influence priorities, and gain buy-in.
Advise on best-practice metrics and measurement approaches for engagement, learning, productivity, and goal alignment.
Translate ambiguous or evolving requirements into clear analytical frameworks and delivery plans.
Take full ownership of Viva analytics solutions from concept through to delivery and ongoing optimisation.
Solution Design & Delivery
Design and develop usage analytics solutions across Microsoft Viva modules: Connections, Engage, Insights, Learning, Goals, and Topics.
Define KPIs and success measures for adoption, engagement, and impact across Viva capabilities.
Build scalable and automated data pipelines integrating:
Microsoft Graph
Viva Insights
Microsoft 365 usage reports
Design and deliver interactive Power BI dashboards for executive, functional, and operational audiences.
Ensure analytics solutions are intuitive, actionable, and clearly tell a story - not just report numbers.
Data Engineering & Governance
Automate data extraction, transformation, and refresh using tools such as Power Automate, Azure Data Factory, or similar.
Ensure data quality, accuracy, security, and compliance with organisational policies and privacy standards.
Document data models, definitions, and analytics logic to support transparency and reuse.
Continuously improve performance, reliability, and scalability of analytics solutions.
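The KPI and pipeline responsibilities above can be illustrated with a minimal sketch. This is purely illustrative: the field names (`user_id`, `last_activity_date`) and the 28-day active-user window are assumptions, not part of the role spec, and a real solution would pull this data from Microsoft Graph usage reports rather than an in-memory list.

```python
from datetime import date, timedelta

# Illustrative rows, shaped like a Microsoft 365 usage-report export
# (field names are assumptions for this sketch).
rows = [
    {"user_id": "a@contoso.com", "last_activity_date": date(2024, 6, 28)},
    {"user_id": "b@contoso.com", "last_activity_date": date(2024, 5, 1)},
    {"user_id": "c@contoso.com", "last_activity_date": None},  # never active
]

def adoption_kpis(rows, as_of, window_days=28):
    """Compute a simple active-user adoption rate over a trailing window."""
    cutoff = as_of - timedelta(days=window_days)
    licensed = len(rows)
    active = sum(
        1 for r in rows
        if r["last_activity_date"] is not None and r["last_activity_date"] >= cutoff
    )
    return {
        "licensed_users": licensed,
        "active_users": active,
        "adoption_rate": round(active / licensed, 2) if licensed else 0.0,
    }

print(adoption_kpis(rows, as_of=date(2024, 6, 30)))
```

In a production pipeline the same calculation would sit downstream of an automated extract (Power Automate or Azure Data Factory, as the ad suggests) and feed a Power BI dataset rather than a print statement.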
Skills & Experience
Strong experience designing and delivering analytics solutions end-to-end, including both strategy and implementation.
Hands-on expertise with Power BI (data modelling, DAX, visual design, performance optimisation).
Experience working with Microsoft 365 data, ideally including Microsoft Graph and Viva Insights.
Ability to define KPIs and analytics frameworks in ambiguous or evolving environments.
Proven ability to engage stakeholders, influence decisions, and drive alignment.
Strong analytical thinking with the ability to translate data into insight and recommendations.
To apply for this role, please submit your CV or contact Dillon Blackburn (see below).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We’re the proud sponsor and supporter of SQLBits, the Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
Snowflake Senior Developer
Resourgenix Ltd
Not Specified
Hybrid
Senior
£375 - £450
RECENTLY POSTED
snowflake
processing-js
kafka
python
airflow
sql
Location: London (Hybrid)
Employment Type: Full-time
Seniority: Senior Individual Contributor
Reports to: Data Engineering Lead / Manager
Summary of role
We are seeking a Snowflake Senior Developer to design, develop, and optimise data solutions on our cloud data platform. You will work closely with data engineers, analysts, and architects to deliver high-quality, scalable data pipelines and models. Strong expertise in Snowflake, ETL/ELT, data modelling, and data warehousing is essential.
Responsibilities
Snowflake Development: Build and optimise Snowflake objects (databases, schemas, tables, views, tasks, streams, resource monitors).
ETL/ELT Pipelines: Develop and maintain robust data pipelines using tools like dbt, Airflow, Azure Data Factory, or similar.
Data Modelling: Implement dimensional models (star/snowflake schemas), handle SCDs, and design efficient structures for analytics.
Performance Tuning: Optimise queries, manage clustering, caching, and warehouse sizing for cost and speed.
Data Quality: Implement testing frameworks (dbt tests, Great Expectations) and ensure data accuracy and freshness.
Security & Governance: Apply RBAC, masking policies, and comply with data governance standards.
Collaboration: Work with BI teams to ensure semantic alignment and support self-service analytics.
Documentation: Maintain clear technical documentation for pipelines, models, and processes.
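The slowly changing dimension (SCD) handling named in the Data Modelling responsibility can be sketched as follows. This is a toy in-memory version of SCD Type 2 versioning: in practice this would be a Snowflake `MERGE` statement or a dbt snapshot, and the column names (`valid_from`, `valid_to`, `is_current`) are illustrative conventions, not a mandated schema.

```python
from datetime import date

# Minimal SCD Type 2 upsert: expire the current row when an attribute
# changes, then open a new version. Column names are illustrative.
def scd2_upsert(dim_rows, key, incoming, effective):
    """dim_rows: list of dicts carrying key, attributes, and validity columns."""
    current = next(
        (r for r in dim_rows if r[key] == incoming[key] and r["is_current"]), None
    )
    attrs = {k: v for k, v in incoming.items() if k != key}
    if current and all(current[k] == v for k, v in attrs.items()):
        return dim_rows  # no attribute change: nothing to do
    if current:  # expire the old version
        current["valid_to"] = effective
        current["is_current"] = False
    dim_rows.append({
        key: incoming[key], **attrs,
        "valid_from": effective, "valid_to": None, "is_current": True,
    })
    return dim_rows

dim = scd2_upsert([], "customer_id",
                  {"customer_id": 1, "city": "Leeds"}, date(2024, 1, 1))
dim = scd2_upsert(dim, "customer_id",
                  {"customer_id": 1, "city": "London"}, date(2024, 6, 1))
```

After the second call the dimension holds two versions of customer 1: the Leeds row closed off on 2024-06-01, and a current London row, which is exactly the history that SCD Type 2 is meant to preserve.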
Qualifications
Matric and a Degree in IT
Strong SQL skills (complex queries, performance tuning) and proficiency in Python for data processing.
Experience with ETL/ELT tools (dbt, Airflow, ADF, Informatica, Matillion).
Solid understanding of data warehousing concepts (Kimball, Data Vault, normalization).
Familiarity with cloud platforms (Azure preferred; AWS/GCP acceptable).
Knowledge of data governance, security, and compliance (GDPR).
Excellent problem-solving and communication skills.
Skills
Experience with Snowpark, UDFs, dynamic tables, and external tables.
Exposure to streaming/CDC (Kafka, Fivetran, Debezium).
BI tool integration (Power BI, Tableau, Looker).
Certifications: SnowPro Core or Advanced.
Oracle Analytics & AI Developer - SC Clearance Required
DWI Consulting Ltd
London
Remote or hybrid
Senior
Private salary
RECENTLY POSTED
processing-js
jira
Job Description: We are seeking an experienced SC Cleared Oracle Analytics & AI Developer to play a key role on a confidential national programme. You will be responsible for designing, developing, and demonstrating innovative proof-of-concepts involving data reporting, artificial intelligence, and machine learning within the client’s environment. The ideal candidate will be a senior, hands-on technologist with a strong foundation in Oracle’s cloud ecosystem and a particular emphasis on analytics and AI solutions.
Key Responsibilities:
Serve as a lead developer, focusing on creating AI, ML, and reporting proof-of-concepts on enterprise systems and data.
Design and implement solutions using advanced analytics platforms, cloud data warehouses, and AI services.
Develop and optimize back-end systems with a focus on scalable AI frameworks, including Retrieval-Augmented Generation (RAG) architectures.
Apply expertise in AI model tuning, optimization, and the fine-tuning of large language models.
Work within structured delivery frameworks (Agile/Waterfall), contributing to the full software development life cycle (SDLC).
Produce comprehensive technical documentation and articulate complex concepts to both technical and non-technical stakeholders.
Collaborate effectively with geographically dispersed teams and external partners.
Ensure all solutions adhere to stringent security best practices for cloud environments.
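The RAG architecture mentioned in the responsibilities above boils down to two steps: retrieve the documents most relevant to a query, then ground the model's prompt in them. The sketch below is deliberately simplified and not tied to any Oracle product: a word-overlap scorer stands in for the embedding model and vector store a real system would use, and the documents are invented examples.

```python
# Toy Retrieval-Augmented Generation flow: score documents by word overlap
# with the query, then build a grounded prompt for an LLM. A production
# system would use embeddings and a vector store; this only shows the shape.
DOCS = [
    "Invoices are archived after 90 days in the finance system.",
    "Password resets are handled by the IT service desk.",
    "Travel expenses must be submitted within 30 days.",
]

def retrieve(query, docs, k=1):
    """Return the k documents sharing the most words with the query."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query, docs):
    """Assemble a prompt that instructs the model to answer from context only."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("How are password resets handled?", DOCS)
```

The grounded prompt is what gets sent to the LLM; swapping the retriever for a real embedding search changes the quality of the context, not the overall flow.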
Required Skills & Experience:
5+ years of hands-on back-end development experience, delivering complex, large-scale systems.
Proven expertise with the Oracle technology stack, including cloud data platforms, advanced analytics tools, and AI/ML functionalities.
Demonstrable project experience with AI technologies such as Generative AI, Natural Language Processing (NLP), AI agents, and machine learning.
Strong understanding of the SDLC and experience with methodologies like Agile and tools such as JIRA.
Excellent communication, interpersonal, and stakeholder management skills.
Proven ability to work proactively in a fast-paced, dynamic programme environment.
Experience collaborating with geographically distributed teams.
Knowledge of security best practices in cloud-based architectures.
Eligibility for UK Security Clearance (a mandatory requirement).
Other Information:
Experience with Oracle Cloud Infrastructure (OCI), application development tools, or integration platforms is highly beneficial.
Knowledge of data migration strategies for large-scale enterprise systems is advantageous.
Occasional travel to client sites in Southern England may be required.

Frequently asked questions

What types of contract Data Engineer jobs are listed on Haystack?
Haystack features a wide range of contract Data Engineer positions, including short-term, long-term, remote, and on-site roles across various industries such as finance, healthcare, and technology.
How do I apply for contract Data Engineer positions on Haystack?
To apply, simply create a profile on Haystack, upload your resume, and submit applications directly through the platform to any contract Data Engineer job that matches your skills and experience.
Are contract Data Engineer jobs on Haystack remote-friendly?
Yes, many contract Data Engineer listings on Haystack offer remote or hybrid working options. You can filter job searches by location and remote availability to find the best fit.
What qualifications do I need to become a contract Data Engineer?
Typically, contract Data Engineers should have experience with big data tools (like Hadoop, Spark), SQL, Python/Scala, ETL processes, and data warehousing solutions. Specific requirements may vary by job.
Can I negotiate contract terms and rates for Data Engineer roles through Haystack?
While Haystack facilitates job postings and applications, contract negotiations including rates and terms are generally handled between you and the hiring company or recruiter directly.