We are looking for a D365 Data Migration Specialist to join a high-priority digital transformation project. You will lead the migration of complex legacy data sets into a centralised Dynamics 365 environment using Azure Data Factory (ADF).
This is a fully remote contract for a London-based client. You must hold an active SC Clearance to be considered.
Key Responsibilities
Technical Requirements
Contract Summary
Role/Job title: Starburst Developer
Work Location: London (2 days per week at the MUFG client office)
The Role: Starburst Developer
Your responsibilities:
Design and maintain data pipelines using Starburst and related technologies.
Optimize query performance and resolve data processing bottlenecks.
Manage databases to ensure high availability, reliability and security.
Integrate Starburst with various data sources including cloud services and APIs.
Monitor data pipelines and troubleshoot issues proactively.
Collaborate with business users and stakeholders on data requirements.
Maintain comprehensive and up-to-date documentation for data processes.
Stay current with data engineering advancements and propose innovative solutions.
Implement best practices for data quality assurance and testing.
Your Profile
Essential skills/knowledge/experience:
A minimum of 8 years' strong hands-on experience with Starburst Enterprise or the Denodo tool
Advanced SQL skills, including query optimization and performance tuning.
Experience with cloud data platforms (Snowflake, Databricks)
Proficiency in any Scripting language (e.g. Python/Scala)
Proficiency in data analytics and data engineering
Desirable skills/knowledge/experience:
Hands-on experience with Starburst or Denodo data virtualization
Data Integration background
Database development experience in any database (Oracle, SQL Server)
Banking domain preferred.
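The "query optimization and performance tuning" skill listed above can be sketched minimally. Starburst itself speaks Trino SQL, but the core idea shown here, letting an index satisfy a selective predicate instead of a full table scan, carries across engines; this hedged example uses Python's built-in sqlite3, and all table and column names are illustrative.

```python
import sqlite3

# Illustrative tuning sketch: compare the query plan before and after adding
# an index on a selective filter column. Names are made up for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades (symbol, qty) VALUES (?, ?)",
                 [("ABC", i) for i in range(1000)] + [("XYZ", 1)])

# Without an index the planner must scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE symbol = 'XYZ'").fetchall()

conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")

# With the index the planner can seek directly to the matching rows.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE symbol = 'XYZ'").fetchall()

print(plan_before[0][-1])  # a SCAN step
print(plan_after[0][-1])   # a SEARCH step using the index
```

In Starburst/Trino the equivalent exercise would use `EXPLAIN` and rely on partition pruning or connector pushdown rather than local indexes, but the workflow (inspect the plan, remove the scan) is the same.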
This job is with State Street, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.
We are seeking an Analytics‑focused Data Engineer to design and build trusted, analytics‑ready datasets on a modern AWS and Databricks data platform. This role is critical to enabling business intelligence, reporting, and advanced analytics by transforming raw data into well‑modeled, high‑quality, and performant analytics layers.
You will work closely with analytics, BI, finance, and product teams to ensure data is easy to consume, well understood, and reliable for decision‑making at scale.
Key Responsibilities
Analytics‑Ready Data Modeling
Design and implement analytics‑optimized data models (fact/dimension, star/snowflake schemas).
Build and maintain curated analytics layers (gold tables) in Databricks using Delta Lake.
Translate business requirements into clear, reusable datasets for dashboards and reports.
Support semantic consistency across metrics, dimensions, and KPIs.
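The fact/dimension modelling described above can be shown with a small, hedged sketch. In production these would be Delta tables in Databricks; here sqlite3 stands in, and the schema and names are illustrative rather than the client's.

```python
import sqlite3

# Minimal star-schema sketch: one dimension table plus one fact table,
# joined to produce an analytics-ready ("gold") aggregate.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                          product_name TEXT, category TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                         product_key INTEGER REFERENCES dim_product(product_key),
                         amount REAL);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware"),
                  (3, "Licence", "Software")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 1, 10.0), (2, 1, 15.0), (3, 2, 20.0), (4, 3, 99.0)])

# The curated dataset a BI dashboard would consume: revenue by category.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('Hardware', 45.0), ('Software', 99.0)]
```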
Data Pipelines & Transformations
Develop and maintain ETL/ELT pipelines using Databricks (Spark SQL, PySpark).
Transform raw and intermediate data into clean, documented, and performant analytics datasets.
Implement incremental processing, partitioning, and optimization techniques for BI workloads.
Ensure pipelines are resilient, observable, and production‑ready.
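The incremental-processing bullet above can be sketched in plain Python. In Databricks this would typically be a Delta Lake MERGE or Structured Streaming job; the function and field names below are illustrative, and only the watermark control flow is the point.

```python
# Hedged sketch of watermark-based incremental loading: each run processes
# only rows newer than the last recorded timestamp, then advances the watermark.
def incremental_load(source_rows, target, state):
    watermark = state.get("watermark", 0)
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for r in new_rows:
        target[r["id"]] = r  # upsert by key
    if new_rows:
        state["watermark"] = max(r["updated_at"] for r in new_rows)
    return len(new_rows)

source = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
target, state = {}, {}
assert incremental_load(source, target, state) == 2  # first run: all rows are new
source.append({"id": 3, "updated_at": 30})
assert incremental_load(source, target, state) == 1  # second run: only the new row
```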
AWS Analytics Platform
Leverage AWS services such as S3, Glue, Redshift, Lambda, and IAM to support analytics use cases.
Integrate Databricks with AWS storage and security services.
Monitor pipeline execution, performance, and cost for analytics workloads.
Data Quality, Metrics & Trust
Implement data quality checks, reconciliation logic, and anomaly detection for analytics data.
Validate accuracy of business metrics used in executive dashboards and reports.
Support data lineage, documentation, and metric definitions.
Partner with stakeholders to ensure a single source of truth for analytics.
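The quality-check and reconciliation responsibilities above can be illustrated with a hedged sketch; the function name, field names, and tolerance are assumptions for the example, not part of any real framework.

```python
# Illustrative data quality check: verify completeness (no null keys) and that
# a control total in the curated layer reconciles with the source.
def check_quality(source_total, curated_rows, key="order_id",
                  amount="amount", tolerance=0.01):
    issues = []
    if any(r.get(key) is None for r in curated_rows):
        issues.append("null_key")
    curated_total = sum(r[amount] for r in curated_rows)
    if abs(curated_total - source_total) > tolerance:
        issues.append(f"total_mismatch: {curated_total} vs {source_total}")
    return issues

rows = [{"order_id": 1, "amount": 50.0}, {"order_id": 2, "amount": 25.0}]
assert check_quality(75.0, rows) == []  # clean, reconciled data passes
assert "null_key" in check_quality(75.0, rows + [{"order_id": None, "amount": 0.0}])
```

In practice checks like this would run as pipeline steps (e.g. Delta Lake constraints or a testing framework) and publish results to a dashboard, but the pass/fail contract is the same.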
BI & Analytics Enablement
Support downstream BI tools such as Power BI, Tableau, or similar.
Optimize datasets for dashboard performance and concurrency.
Collaborate with analysts to improve query patterns and data usage.
Enable self‑service analytics through well‑designed datasets and documentation.
Required Qualifications
5-8+ years of experience in data engineering or analytics engineering roles.
Strong experience with Databricks for analytics workloads.
Advanced proficiency in SQL (complex transformations, window functions, performance tuning).
Solid experience with AWS‑based analytics architectures.
Strong experience with data modeling for analytics.
Proficiency in Python or PySpark.
Experience supporting BI, reporting, and analytics teams.
Nice to Have
Experience with Analytics Engineering / ELT patterns.
Familiarity with dbt or similar transformation frameworks.
Experience supporting finance or executive reporting.
Knowledge of data governance, metric catalogs, or data discovery tools.
Experience with streaming data for near‑real‑time analytics.
Exposure to regulated or enterprise analytics environments.
About State Street
Across the globe, institutional investors rely on us to help them manage risk, respond to challenges, and drive performance and profitability. We keep our clients at the heart of everything we do, and smart, engaged employees are essential to our continued success.
We are committed to fostering an environment where every employee feels valued and empowered to reach their full potential. As an essential partner in our shared success, you’ll benefit from inclusive development opportunities, flexible work-life support, paid volunteer days, and vibrant employee networks that keep you connected to what matters most. Join us in shaping the future.
As an Equal Opportunity Employer, we consider all qualified applicants for all positions without regard to race, creed, color, religion, national origin, ancestry, ethnicity, age, disability, genetic information, sex, sexual orientation, gender identity or expression, citizenship, marital status, domestic partnership or civil union status, familial status, military and veteran status, and other characteristics protected by applicable law.
Discover more information on jobs at StateStreet.com/careers
Read our CEO Statement
We are seeking an Application Support Analyst to join our busy and expanding MIS team, to assist with the implementation, configuration and support of new and existing Management Information Systems.
Responsibilities of the role will include:
We are looking for a flexible, pro-active individual who can demonstrate:
This role is based at Uxbridge Campus, with travel to other college sites (Harrow & Richmond) when required. It is anticipated that the postholder will be able to work from home for 1 day per week, subject to business need, after settling into the role.
Please note, we are unable to offer sponsorship for this role.
Notice for Recruitment Agencies:
HRUC operates with a managed service provider (MSP) for recruitment services. We do not accept unsolicited emails, CVs, or contact from recruitment agencies. Any CVs or information sent to HRUC outside of this process will not be considered or acted upon, regardless of the terms and conditions stated by the agency.
Division Education for Industry Group
Hours 37 hours per week, Full-Time, Monday to Friday
Contract Fixed-Term, 2-year contract
Location FRA Academy: Electra House – London, Moorgate EC2M 6SE
About EFI Group
EFI Group has a bold mission to transform lives, careers and industries through pioneering, industry-led education in fashion, beauty and apprenticeships at the Fashion Retail Academy (FRA) and The London College of Beauty Therapy (LCBT). Our vision is to deliver exceptional learning experiences, driven by innovation, inclusion, employability, and excellence.
About the role
Are you passionate about data, analytics, and technology?
We’re looking for a Data Analyst Apprentice to join our IT team and support our work with data to improve the student experience at the EFI Group.
In this role, you’ll work alongside our Senior Business Intelligence Analyst to manage and prepare datasets, maintain dashboards and reports, and ensure data quality across our systems. You’ll gain hands-on experience with modern, cloud-based tools including SQL, Power BI, Python, Azure, and more, while learning how data supports real-world business decisions.
What we’re looking for:
We are seeking someone who is keen to gain hands-on experience, develop practical skills, and achieve the standards required to succeed in this role.
No prior work experience required - just enthusiasm for data, analytics, and technology!
Applicants will need:
5 GCSEs (or equivalent) at grades A-C / 9-4, including Maths, English, and a Science or Technology subject.
A Level 3 qualification (A-levels, apprenticeship or BTEC, etc.) totalling 48 UCAS points.
Eligibility for apprenticeship funding.
What you’ll gain:
On successful completion of the apprenticeship, you will achieve the Level 4 Data Analyst apprenticeship standard, approved by Skills England.
Hands-on experience with data platforms, visualisation tools, automation, APIs, and emerging technologies such as AI and machine learning.
Insight into real business systems and reporting processes, learning how data informs key decisions.
Mentorship and support from experienced professionals to develop your skills and grow your career in data analytics.
We welcome applications from all backgrounds and encourage anyone with an interest in data and technology to apply, including those from underrepresented groups in STEM, and anyone looking to start a rewarding career in analytics.
Why The EFI?
We foster a culture where our team members can lean on each other, recognise each other, and celebrate together! At EFI, we prioritise your growth and wellbeing with a range of fantastic benefits, including:
Funded Professional Qualifications: Support for personal and professional development, including a personal growth allowance and annual CPD.
Generous and Flexible Leave Options: Including an around-the-world trip after five years of service.
Market-Leading Family-Friendly Pay: Including six months of fully paid maternity, adoption, and shared parental leave.
Monthly Wellbeing Allowance: Including a customisable monthly wellbeing allowance, and funded counselling/CBT through Education Support Employee Assistant Programme.
Salary:
£26,650 per annum
How to apply/Next Steps:
Click ‘Apply for this job’ to submit your application.
Closing Date:
8am on Friday 6th March 2026
Interviews/Recruitment Day:
Tuesday 10th March 2026, in-person at FRA Academy: Electra House, Moorgate, EC2M 6SE
More Information/Contact us:
Click here to download a full job description
Click here to download the apprenticeship brochure
For more information about the EFI Group, visit our EFI website and refer to the job description.
Please contact recruitment@efigroup.ac.uk for further information.
The EFI is fully committed to safeguarding and promoting the welfare of young people and vulnerable adults. Candidates offered positions will be required to undergo thorough safeguarding background checks as a condition of the offer.
Thank you for sharing our values and commitment to student safety.
Kent
Competitive Salary
VIQU have partnered with a leading organisation that are seeking a Data Product Manager to shape and deliver high-impact customer data products within a fast-paced, innovative environment. This role will focus on defining product roadmaps, overseeing data engineering foundations, managing partners, and ensuring governance, quality, and measurable business value across customer data initiatives.
Key Responsibilities of the Data Product Manager:
• Own the product roadmap for customer data products, sequencing initiatives that maximise business value and accelerate data maturity.
• Manage delivery across engineering, data science, and external vendors, ensuring deadlines and quality standards are met.
• Oversee ingestion pipelines (Airflow, DBT, Glue) into AWS S3 and Redshift, ensuring reliability, scalability, and compliance.
• Enable Insights, Marketing, and Commercial teams through dashboards, APIs, and curated datasets.
• Collaborate directly with Marketing and Commercial stakeholders to capture requirements and prioritise features effectively.
• Define and track performance metrics, measuring engagement success, commercial uplift, and operational efficiency, publishing regular impact reviews.
• Identify and implement automation opportunities, including AI-driven solutions, to improve productivity across data workflows.
• Enforce governance standards covering privacy, security, and ethical AI usage across all data products.
Key Requirements for a Data Product Manager:
• Proven experience as a Product Manager or Data Product Owner within analytics, data platforms, or customer data domains.
• Hands-on familiarity with AWS technologies (S3, Redshift) and orchestration tools (Airflow, DBT, Glue).
• Strong understanding of data modelling, identity resolution, and data activation use cases.
• Ability to translate business needs into clear technical requirements and prioritise effectively.
• Comfortable working with metrics, experimentation frameworks, and ROI-driven reporting.
• Excellent stakeholder management skills across marketing, commercial, and technical teams.
• Self-motivated, ambitious, and creative, with strong written and verbal communication skills.
• Able to simplify complex technical concepts and operate confidently at all organisational levels.
Apply today to speak to VIQU in confidence or contact Fay Toomey via the VIQU website.
Know someone exceptional for this Data Product Manager position? Refer them and receive up to £1,000 if successful (terms apply). Follow us on LinkedIn @VIQU IT Recruitment for more exciting opportunities.
Senior Data Engineer (AWS Kafka Big Data) London / WFH to £85k
Are you a tech savvy Data Engineer with AWS expertise combined with client facing skills?
You could be joining a global technology consultancy with a range of banking, financial services and insurance clients in a senior, hands-on Data Engineer role.
As a Senior Data Engineer you will design and build end-to-end real-time data pipelines using AWS native tools, Kafka and modern data architectures, applying AWS Well-Architected Principles to ensure scalability, security and resilience. You’ll collaborate directly with clients to analyse requirements, define solutions and deliver production grade systems, leading the development of robust, well tested and fault tolerant data engineering solutions.
Location / WFH:
There’s a hybrid work from home model with two days a week in the London, City office (or at client site in London).
About you:
You are an experienced Data Engineer within financial services or consulting environments
You have expertise with AWS, including Lake Formation and transformation layers
You have strong Python coding skills
You have experience with real-time data streaming using Kafka
You’re collaborative and pragmatic with excellent communication and stakeholder management skills
You’re comfortable taking ownership of projects and working end-to-end
You have a good knowledge of Distributed Systems and DevOps tooling
Ideally you will also have Databricks experience
What’s in it for you:
As a Senior Data Engineer you will earn a highly competitive package:
Salary to £85k
Bonus c15%
Pension (up to 7% employer contribution), Life Assurance, Income Protection
Private medical care for you and your family, including mental health
Travel Insurance
Charitable giving
Gym membership for you and your family
Flexible holiday scheme
Apply now to find out more about this Senior Data Engineer (AWS Kafka Python) opportunity.
At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We’re an equal opportunities employer whose people come from all walks of life and will never discriminate based on race, colour, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. The clients we work with share our values.
Start April, initially until Dec 2026
Central London
Certain Advantage are seeking Engineers skilled in Python, PySpark and Azure infrastructure services to join a platform team supporting a major data and analytics initiative within a globally renowned trading business in London. You must be willing to work 3 days a week onsite in central London.
This Engineer position is critical to enhancing the team's ability to deliver re-usable, high-quality data pipelines, reduce bottlenecks, and accelerate insight availability for business stakeholders. We need Engineers with a solid architectural foundation who can solve problems independently at a fast pace. You'll also need strong, effective communication skills to understand requirements and communicate with technical leaders asynchronously across 5 time zones.
Your initial work will involve abstracting code from our product teams into a shared, common Python library leveraging PySpark/DataFrames. You'll also be building microservices in the form of Python-based Azure Functions. After the initial pre-defined work, you'll serve as an extension of the product teams, building microservices and libraries to solve common needs across the teams.
Required skills:
Python
PySpark
SQL
Azure infrastructure
Understanding of containers, microservices, and functional design patterns
Experience with Agile processes
Experience with Terraform
Experience with unit testing, preferably PyTest
Optional, but recommended skills:
HTML/CSS
React
TypeScript
FastAPI framework
Does this sound like your next career move? Apply today!
Working with Certain Advantage
We go the extra mile to find the best people for the job. If you're hunting for a role where you can make an impact and grow your career, we'll work with you to find it. We work with businesses across the UK to find the best people in Finance, Marketing, IT and Engineering.
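The "abstracting code from product teams into a shared, common Python library" task described above can be sketched in a hedged way. In the real codebase this would operate on PySpark DataFrames; plain lists of dicts stand in here, and the function and column names are illustrative.

```python
# Illustrative shared-library pattern: a transformation each product team had
# duplicated is pulled into one reusable, testable function.
def standardise_columns(rows, mapping):
    """Rename source-specific column names to a shared, agreed schema."""
    return [{mapping.get(k, k): v for k, v in row.items()} for row in rows]

# One team's raw rows, with team-specific column names (made up for the example).
team_a_rows = [{"cust_id": 1, "amt": 9.5}]
shared = standardise_columns(team_a_rows, {"cust_id": "customer_id", "amt": "amount"})
assert shared == [{"customer_id": 1, "amount": 9.5}]
```

With PySpark the same idea would be a function taking and returning a DataFrame (e.g. via `withColumnRenamed`), packaged and versioned so every product team consumes identical logic.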
If this job isn’t for you, head to (url removed) and register for job alerts and career guidance tips
London + 2 or 3 days work from home
Circ £60,000 - £70,000 + Excellent Benefits Package
A fantastic opportunity is available for a Data Engineer who enjoys working in a fast-paced, collaborative, team-oriented environment. Our client has been expanding at a remarkable pace and has transformed its technical landscape with leading-edge solutions. Having implemented a new MS Fabric-based data platform, the need is now to scale up and deliver data-driven insights and strategies across the business globally. The Data Engineer will be joining a close-knit team that is the hub of our client's global data & analytics operation. Previous experience with MS Fabric would be beneficial but is by no means essential. Interested candidates must have experience in a similar role with MS Azure data platforms, Synapse, Databricks or other cloud platforms such as AWS, GCP, Snowflake etc.
Key Responsibilities will include;
* Design, implement, and optimize end-to-end solutions using Fabric components:
  - Data Factory (pipelines, orchestration)
  - Data Engineering (Lakehouse, notebooks, Apache Spark)
  - Data Warehouse (SQL endpoints, schemas, MPP performance tuning)
  - Real-Time Analytics (KQL databases, event ingestion)
* Manage and enhance OneLake architecture, Delta Lake tables, security policies, and data governance within Fabric.
* Build scalable, reusable data assets and engineering patterns that support analytics, reporting, and machine learning workloads.
* Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
* Troubleshoot and resolve data-related issues in a timely manner.
Key Experience, Skills and Knowledge:
* Proven experience (2+ years) as a Data Engineer or in a similar role, with a strong focus on PySpark, SQL and Microsoft Azure data platforms; Power BI an advantage
* Proficiency in development languages suitable for intermediate-level data engineers, such as:
  - Python / PySpark: widely used for data manipulation, analysis, and scripting.
  - SQL: essential for querying and managing relational databases.
* Understanding of D365 F&O Data Structures is highly desirable
* Strong problem-solving skills and attention to detail.
* Excellent communication and collaboration abilities.
This is a hybrid role based in Central / West London with the flexibility to work from home 2 or 3 days per week. Salary will be dependent on experience and expected to be in the region of £60,000 - £70,000 + an attractive benefits package including bonus scheme.
For further information, please send your CV to Wayne Young at Young’s Employment Services Ltd. YES are operating as both a recruitment Agency and Recruitment Business
Onsite Requirements: Remote
Start Date: ASAP
Role Duration: 1 year
Clearance Requirements: Active SC clearance
Inside IR35 - umbrella only
Role Description:
We’re looking for a Data Engineer whose main focus is understanding and documenting existing systems, with the goal of supporting decommissioning activities. The role centres on analysing current solutions built using Java, Node.js, and React, and developing a clear, end-to-end picture of how data flows across the wider programme.
This includes documenting data flows, system dependencies, and underlying data models, ensuring there is a clear record of how data is structured, stored, and used throughout the solution. The role involves investigating how systems are used on a day-to-day basis, clarifying ownership and integration points, and capturing this information in a way that supports risk assessment and decommissioning decisions.
Responsibilities:
Python and PySpark are required as supporting capabilities, used where needed to analyse data pipelines and confirm how data moves and transforms in practice. The role also requires strong experience with testing and data quality management, ensuring that documented data flows and models are accurate and trusted. Experience working in cloud environments such as AWS or Azure is expected, with Databricks considered a nice to have.
Required Skills:
Data Analyst - ETL, Power BI, PACE, Databricks, Sharepoint
Up to £500 per day (Inside IR35 - Umbrella)
My client is an international consultancy who require a Data Analyst with demonstrable ETL and data manipulation skills to play a key role in transforming data across multiple systems, using tools such as Power BI, Databricks and SharePoint as well as PACE.
Key Requirements:
Nice to have:
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)
Data Scientist
Hybrid Working - Edinburgh OR London - 2 days a week on site.
Financial Services
Lorien’s leading banking client are seeking an experienced Data Scientist with strong expertise in graph databases and hierarchical data modelling to join our dynamic team onsite in London or Edinburgh.
The successful candidate will be responsible for designing, developing, and optimising graph database solutions to support complex data relationships and hierarchies in our business applications with knowledge/experience of Neo4j.
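The hierarchical-data aspect of the role above can be illustrated with a hedged sketch. The posting centres on Neo4j, where this would be a Cypher traversal; plain Python with an adjacency mapping shows the same idea, and all node names are invented for the example.

```python
# Illustrative hierarchy traversal: find every node below a given node,
# the kind of relationship query a graph database answers natively.
ORG = {"root": ["europe", "americas"], "europe": ["uk"], "uk": ["london", "edinburgh"]}

def descendants(node, graph):
    """All nodes reachable below `node`, depth-first."""
    found = []
    for child in graph.get(node, []):
        found.append(child)
        found.extend(descendants(child, graph))
    return found

assert descendants("europe", ORG) == ["uk", "london", "edinburgh"]
```

In Neo4j the equivalent would be a variable-length relationship pattern, roughly `MATCH (n {name: 'europe'})-[:PARENT_OF*]->(d) RETURN d` (illustrative Cypher, not from the posting), which the engine evaluates without the recursive joins a relational model would need.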
This role is based in Edinburgh OR London.
This role will be Via Umbrella.
Working in a Hybrid Model of 1 day a week on site.
Key Responsibilities:
Required Skills and Experience:
Preferred Qualifications:
IND_PC3
Guidant, Carbon60, Lorien & SRG - The Impellam Group Portfolio are acting as an Employment Business in relation to this vacancy.
Role: Oracle DBA with AWS Migration Experience
Type: PERM
Location: London, UK
Working Model: Hybrid (3 Days in office per week)
Salary: 50K - 60K GBP/Annum
The Mission
We are looking for an Oracle Database Consultant to join our Enterprise Cloud team. You’ll be the technical lead for complex migrations, moving mission-critical workloads into the AWS Cloud Migration Factory (CMF).
Core Responsibilities
Requirements
This is an urgent vacancy; the hiring manager is shortlisting for interview immediately. Please apply with a copy of your CV or send it to praveen. Com
Randstad Technologies Ltd is a leading specialist recruitment business for the IT & Engineering industries. Please note that due to a high level of applications, we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
Location: Central London - 3–4 days onsite each week
Salary: £ Negotiable + benefits
We are supporting an enterprise-level client who is investing heavily in a modern cloud data platform that will sit at the centre of its data strategy. This programme will enable more advanced analytics, reporting and insight across multiple business functions.
We are looking to appoint three experienced Senior Data Engineers with strong Azure and Snowflake expertise.
The Role
This is a senior, hands-on engineering position within a high-performing data team. You will play a key role in shaping, developing and enhancing a large-scale Azure-based data platform, ensuring it is scalable, reliable and built to enterprise standards.
The position requires regular collaboration with stakeholders and an onsite presence in Central London 3–4 days per week, so this is not a fully remote role.
What You Will Be Doing
* Building and enhancing scalable data pipelines using Azure and Snowflake
* Developing and improving ETL / ELT processes across batch and micro-batch workloads
* Working extensively with Azure Data Factory, Azure SQL, Azure Storage and Azure Functions
* Designing and maintaining data warehouse structures including star and snowflake schemas
* Applying recognised data warehousing approaches such as Kimball and Inmon
* Writing and optimising complex SQL queries to support analytics and reporting
* Ensuring strong data governance, quality, validation and reconciliation processes
* Partnering with BI teams to enable effective reporting solutions
* Contributing to architectural decisions around performance, scalability and infrastructure
* Identifying and resolving issues to improve platform reliability and efficiency
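The "writing and optimising complex SQL queries" responsibility above can be shown with a hedged sketch: a window function ranking each row within its partition, the kind of analytics SQL the role calls for. It runs here on Python's built-in sqlite3; the same query works, with only dialect-level differences, on Azure SQL and Snowflake. Table and column names are illustrative.

```python
import sqlite3

# Illustrative analytics SQL: rank each sale within its region by amount.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("north", "mon", 100.0), ("north", "tue", 300.0), ("south", "mon", 200.0)])

rows = conn.execute("""
    SELECT region, day, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(rows)  # each region's rows ranked by amount, highest first
```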
What We Are Looking For
* 7+ years in software engineering or development
* 5+ years working within data-focused environments
* At least 2 years hands-on experience with Azure cloud data platforms
* Strong expertise across the Azure Data Platform including Data Factory, SQL, Storage and Functions
* Proven experience in SQL development and data modelling
* Experience building both periodic batch and micro-batch data pipelines
* Solid understanding of enterprise data warehouse design and loading strategies
* A minimum of 1 year hands-on experience with Snowflake
* Experience working with large-scale enterprise datasets
* Strong analytical mindset with a clear focus on data integrity and performance
Desirable Experience
* Advanced Snowflake performance tuning and optimisation
* Python and or Databricks exposure
* Experience designing full end-to-end data platform architectures
* Background supporting enterprise BI ecosystems
* Familiarity with CI/CD pipelines and infrastructure-as-code practices
Additional Details
* Visa candidates will be considered
* Salary is open and negotiable depending on experience
* Immediate requirement
If you are an experienced Senior Data Engineer with strong Azure and Snowflake expertise and are comfortable with a London-based hybrid working model, we’d love to hear from you
Job Title: Contract Data Analyst
Location: Hybrid, Occasional visits to North London Office
Contract Duration: 3 months
Company Overview: A medical equipment services organisation in the UK & Ireland, committed to delivering innovative solutions and exceptional service to our clients. We seek a skilled Data Analyst to join our team on a contract basis to support our ERP implementation project, migrating to Microsoft Dynamics 365.
Job Description:
Role Overview: The Data Analyst will help model and prepare data for migration to the new systems, including the modelling of master data, and will take the lead on data cleansing.
The ideal candidate will have a strong background in data modelling, data cleansing, and de-duplicating data. This role will involve carrying out data migration as part of our ERP and Finance systems projects.
The business operates a medical equipment and consumables operation that includes sales, training, installation, and field service.
The company is growing rapidly and is currently in the (Apply online only) people range.
The project scope is to replace the current Field Service solution with D365 Field Service and implement D365 Business Central for Finance and Operations. Further project phases are under consideration for Commercial, Sales and Training.
Key Responsibilities:
Perform data modelling to structure and organise data effectively.
Cleanse and de-duplicate data to ensure accuracy and consistency.
Execute data migration tasks for ERP and Finance systems.
Map data sets to master data, including cleansing, enrichment, and transformation.
Build and optimise SQL queries for data extraction and manipulation.
Use Excel and Access to manipulate and analyse data.
Understand and work with relational databases.
Use tools to automate data cleansing processes.
Skills and Experience:
Preferred Qualifications:
Salary: £30,400 plus Veolia benefits
Location - N1 9JY Hybrid with office working at least 3 times a week
Hours - Full time, 40 hours per week, Monday to Friday
We are looking for a Financial Planning & Systems Analyst to join our Finance Team, assisting the Senior Finance Planning & Systems Manager in maintaining and improving the forecasting and budgeting tool for the UK&I business. The role holder will also be expected to assist the wider Central Finance team with the automation and standardisation of processes, with a particular focus on managing and controlling “Big Data”.
What we can offer you:
What you’ll do:
The experience you will need:
Essential:
Desirable:
What’s next?
Apply today, so we can make a difference for generations to come.
We’re proud to have been named as one of The Sunday Times Best Places to Work for three consecutive years in 2023, 2024 and 2025. This consistent recognition reflects our commitment to our people, demonstrating that Veolia is not just transforming the environment, we’re also transforming what it means to have a rewarding, purposeful career.
We’re dedicated to supporting you throughout your application journey, offering adjustments where reasonable and appropriate. As a proud Disability Confident Employer, we will offer an interview to applicants with a disability or long-term condition who opt-in to the Disability Confident scheme, and meet the minimum criteria for our roles.
We’re also committed to ensuring that all applicants and colleagues receive fair treatment without discrimination on any grounds, aiming to create a diverse and inclusive workplace where everyone can thrive.
Job Title: GCP FinOps Engineer
Location: Newport, UK (Hybrid)
Contract Duration: 6 Months
IR35 Status: Inside IR35
Role Overview
We are seeking an experienced GCP FinOps Engineer to optimise cloud spend, performance, and operational efficiency across large-scale data analytics and containerised workloads. You will work closely with engineering, data, and product teams to embed cost-efficient architectures, enforce financial governance, and implement best practices across Google Cloud environments.
Responsibilities
Key Skills / Knowledge
Job Title: Azure Cosmos DB Developer
Location: London (Hybrid 1 day per week onsite mandatory)
Contract Type: Contract
Duration: 6 Months
Rate: £(Apply online only)/day (Inside IR35)
Industry: Financial Services
We are seeking an experienced Azure Cosmos DB Developer to join a major global financial services organisation undergoing significant transformation across its application and data engineering landscape.
This is an exciting opportunity to contribute to the modernisation of critical platforms within a highly regulated, enterprise-scale environment.
You will play a key role in delivering high-performance, cloud-native data solutions using Azure services, ensuring scalability, resilience, and operational excellence.
The Role
You will support the transformation of mission-critical systems, driving technical excellence in data engineering and cloud-native development. The role requires strong hands-on expertise in Azure Cosmos DB, Spark, and distributed computing concepts.
Key Responsibilities
Essential Skills & Experience
Desirable
Why Apply?
If you are a Cosmos DB specialist looking for your next contract engagement within financial services, we would love to hear from you.
Location: UK Remote
Salary: Up to £68,000 home based nationally, or up to £74,000 home based for those living within the M25, dependent on experience, plus a £600 per annum home working allowance
Job Ref: J13058
We are looking for a Principal Data Engineer to shape the technical direction of data engineering across a cloud-based Enterprise Data Platform built on Microsoft Azure.
This role suits someone with deep hands-on data engineering experience who can also set standards, guide teams, and influence how data solutions are designed and delivered at scale. You will play a key role in ensuring the platform is engineered to a consistently high standard and can evolve to meet future organisational needs. We value diverse perspectives and are committed to creating an environment where people with different backgrounds and experiences can do their best work.
The environment
The Enterprise Data Platform is built on Microsoft Azure, using Databricks, Microsoft Fabric, and Power BI to deliver trusted, governed data and analytics.
What you will be doing
Setting data engineering standards, patterns, and best practices
Acting as a trusted senior technical authority across data engineering and analytics
Shaping solution design and architectural decisions on Azure
Ensuring data pipelines are scalable, reliable, and production-ready
Championing modern engineering practices including CI/CD and automation
Working in a forward-deployed way with delivery teams to support progress and remove blockers
Managing and developing data engineers, supporting growth and high quality delivery
What we are looking for
Strong experience designing and building data platforms on Microsoft Azure
Hands-on experience with Databricks and Microsoft Fabric
Experience working with analytics and reporting tools such as Power BI
Experience managing and mentoring data engineers
Excellent communication skills with the ability to explain complex technical ideas clearly to both technical and non-technical audiences
A collaborative approach and an interest in raising engineering standards across teams
If you have strong Azure based data engineering experience and want to shape how data engineering is delivered at scale, make an application today to find out more.
Alternatively, you can refer a friend or colleague by taking part in our fantastic referral schemes! If you have a friend or colleague who would be interested in this role, please refer them to us. For each relevant candidate that you introduce to us (there is no limit) and we place, you will be entitled to our general gift/voucher scheme.
Datatech is one of the UK’s leading recruitment agencies in the field of analytics and host of the critically acclaimed event, Women in Data. For more information, visit our website: (url removed)