Role/Job title: Starburst Developer
Work Location: London (2 days per week at MUFG client office)
The Role
Starburst Developer
Your responsibilities:
Design and maintain data pipelines using Starburst and related technologies.
Optimize query performance and resolve data processing bottlenecks.
Manage databases to ensure high availability, reliability and security.
Integrate Starburst with various data sources including cloud services and APIs.
Monitor data pipelines and troubleshoot issues proactively.
Collaborate with business users and stakeholders on data requirements.
Maintain comprehensive and up-to-date documentation for data processes.
Stay current with data engineering advancements and propose innovative solutions.
Implement best practices for data quality assurance and testing.
Your Profile
Essential skills/knowledge/experience:
Minimum 8 years of strong hands-on experience with Starburst Enterprise or the Denodo tool
Advanced SQL skills, including query optimization and performance tuning.
Experience with cloud data platforms (Snowflake, Databricks)
Proficiency in a scripting language (e.g. Python or Scala)
Proficiency in data analytics and data engineering
Desirable skills/knowledge/experience:
Hands-on experience with Starburst or Denodo data virtualization
Data Integration background
Database development experience in any database (Oracle, SQL Server)
Banking domain experience preferred.
We’re seeking an experienced senior Informatica Technical Lead with hands-on expertise in Informatica (preferably IICS) to support data migration efforts on a large-scale SAP S/4HANA On-Cloud implementation. The ideal candidate will have strong stakeholder management skills, adaptability across time zones, and be available to start immediately.
Key Responsibilities & Requirements:
Please only apply if you are an expert in Informatica transformations, data integration, and data analysis.
6-month initial contract
Hybrid
Inside IR35
Harvey Nash’s public sector client is seeking an experienced Power BI Data & Reporting Analyst.
In this role, you’ll design and develop interactive dashboards, executive-level reporting, and a suite of reusable Power BI templates to support future in-house development.
Strong analytical skills, excellent communication, and the ability to work with complex datasets are essential.
Key Skills & Experience (Essential)
Desirable Skills
We are working with a well-established engineering group operating at the heart of the UK water and infrastructure sector, delivering pumping and environmental solutions nationwide. As part of continued investment in technology and data capability, they are now seeking a Contract Data Engineer (BI) to join their team based in Chandlers Ford. This role is pivotal in designing, developing, and optimising a scalable cloud-based data platform that underpins strategic decision-making across the organisation. You will shape data strategy, enhance governance, and drive innovation in business intelligence and analytics.
Responsibilities:
Design and develop robust cloud-based data pipelines and scalable data architectures
Build and optimise data solutions using Databricks, Synapse, Fabric or equivalent cloud technologies
Develop Python-based data processing, automation, and packaging solutions
Design and maintain high-performance data models and warehousing environments
Implement governance frameworks ensuring data quality, security, and accessibility
Engage with senior business and IT stakeholders to gather requirements and translate them into technical solutions
Drive DevOps and CI/CD best practices across the data function
Implement infrastructure as code using tools such as Bicep or Terraform
Solve complex data challenges with a strategic and analytical mindset
Support business intelligence initiatives ensuring data is reliable, accessible, and insight-driven
Skills & Experience:
Strong experience in a data engineering or cloud data architecture role delivering enterprise-grade solutions
Deep expertise in modern cloud data processing platforms such as Databricks, Synapse or Fabric
Advanced Python programming skills for scalable data processing and automation
Extensive SQL experience across relational and non-relational databases
Strong understanding of data modelling, warehousing, and governance principles
Experience with containerisation and orchestration tools such as Docker or Kubernetes
Proven background in DevOps and CI/CD methodologies
Ability to communicate complex data concepts clearly to technical and non-technical stakeholders
Willingness to travel occasionally across UK sites where required
Summary:
Position: Data Engineer (Business Intelligence)
Location: Chandlers Ford
Contract Rate: £450 - £500 per day
Duration: 3-month initial contract
This is a high-impact contract opportunity for a technically strong Data Engineer to influence data strategy and build scalable solutions that directly support business growth and performance. Apply now.
Data Engineer | Outside IR35 | £450 - £500 | 6 months | Hybrid London
We’re supporting a company who are looking for a Data Engineer to build and enhance the data processing capabilities within their Databricks environment. You’ll be responsible for developing the code that drives their data pipelines, using Python, Spark, and Databricks Workflows to deliver new platform functionality and ensure efficient execution.
Key Responsibilities:
Develop reliable Python and PySpark code to support data ingestion, transformation, and end-to-end processing.
Deliver new technical features and components aligned to approved solution designs and business requirements.
Enhance, extend, and tune existing data frameworks to support additional use cases and improved performance.
Create, manage, and optimise Databricks Workflows, including orchestration logic and operational behaviours.
Carry out testing, performance tuning, and provide day-to-day operational support for data pipelines.
Work closely with Solution Designers / Architects and Configuration Analysts to ensure consistent and effective delivery.
If this role suits your skillset, you can work onsite 2 days per month, and you are immediately available, please apply for the job advert directly or reach out to me at (url removed).
Morela is proud to be supporting our client in one of the most innovative and high-impact areas of data and technology.
We’re currently seeking a Data Consultant (Palantir) to join a fast-growing consultancy delivering enterprise-scale Palantir Foundry and Gotham solutions. In this contract role, you’ll work on complex, data-driven projects, helping clients turn raw data into actionable insights while ensuring compliance and governance standards are met.
Location: London / UK (Hybrid/Flexible)
Contract: 6-12 months
Key Skills & Experience:
This is a unique opportunity to work on high-profile projects across both public and private sectors, make an immediate impact, and further develop your Palantir expertise.
Outside IR35 contract
Competitive day rate
This 3-6 month contract for a Data Architect focuses on shaping and managing data architecture to support analytics initiatives. The role requires expertise in designing and implementing scalable data solutions.
Client Details
They are a not-for-profit organisation based in the North West.
Description
Profile
A successful Data Architect should have:
Job Offer
Daily rate of £450 to £600 - Outside IR35
3 month contract, with likely extension
Fully remote role
Power BI Developer - Construction, Rail & Civil Engineering
Department:
Commercial / Project Controls / Digital & Data
Reports To:
Head of Project Controls / Digital Transformation Manager
Location:
Working from home
Employment Type:
Contract - (Outside IR35)
Role Overview
We are seeking an experienced Power BI Developer to support major infrastructure, rail, and civil engineering projects by delivering high-quality business intelligence and data analytics solutions.
The successful candidate will work closely with Project Managers, Commercial Managers, Planners, and Senior Leadership teams to transform complex cost, programme, and operational data into clear, actionable dashboards that support performance improvement, cost control, and strategic decision-making.
Key Responsibilities
Technical Skills & Experience
Qualifications
Key Competencies
Desirable Experience
If you are interested in hearing more, please contact John Baker or Kat Oxlade.
Fusion People are committed to promoting equal opportunities to people regardless of age, gender, religion, belief, race, sexuality or disability. We operate as an employment agency and employment business. You’ll find a wide selection of vacancies on our website.
7-8 Month contract
3 days per week on site in Central London
(Apply online only) per day (Inside IR35)
Job Summary:
We are seeking an experienced Murex consultant based onshore (London) with a strong background in the MX3.1 Datamart module. The ideal candidate will possess a very good technical understanding of the Murex platform and its reporting functionality, and ideally have a sound working knowledge of rates, FX, credit and commodities products.
The candidate will be responsible for working on Murex reporting changes, enhancements and bug fixes for business users of the application. This will cover development, testing and preparation for release to production.
Responsibilities:
Handle the end-to-end design, development and deployment packaging of Murex Datamart solutions, aligned with user requirements
Configure and implement different Datamart objects with an optimised solution, per Murex best practices
Work on reporting-related bugs and issues, and collaborate with users directly to understand the issues and requirements
Implement the agreed solutions and gather approval from users and downstream stakeholders
Handle the testing of the solution, documentation of the detailed technical specifications and runbook creation.
Reconcile report outputs and explain any differences
Participate in peer reviews of technical specifications and testing documentation
Technical Skills-
Required Qualifications-
Bachelor’s degree in Finance, Computer Science, or a related field
Strong experience with Murex Datamart module
Strong technical skills, including proficiency in SQL, Unix/Linux and other relevant programming languages
Understanding of financial products including rates & commodities
Excellent problem-solving and analytical skills
Strong communication and interpersonal skills, with the ability to work effectively in a team environment
Disclaimer:
This vacancy is being advertised by either Advanced Resource Managers Limited, Advanced Resource Managers IT Limited or Advanced Resource Managers Engineering Limited (“ARM”). ARM is a specialist talent acquisition and management consultancy. We provide technical contingency recruitment and a portfolio of more complex resource solutions. Our specialist recruitment divisions cover the entire technical arena, including some of the most economically and strategically important industries in the UK and the world today. We will never send your CV without your permission. Where the role is marked as Outside IR35 in the advertisement this is subject to receipt of a final Status Determination Statement from the end Client and may be subject to change.
A leading financial services company has an urgent 6-month+ (inside IR35) requirement for a Data Governance & Quality Analyst. The role provides hands-on support in executing data stewardship and governance activities; maintaining data quality, metadata and lineage; and supporting the implementation of governance standards, processes and tools, so the organisation can rely on accurate, well-managed data for regulatory compliance, analytics and operational decision-making, working under the direction of the business.
Key Responsibilities
Support the execution of strategic priorities for developing Data Governance capabilities, ensuring alignment with the data strategy, Data Protection Policy, SII data policy and the enterprise governance framework.
Key Skills / Experience
* Expertise in Data Governance concepts and best practice
* Demonstrable skills in Data Quality Analysis.
* Solid understanding of GDPR and The Data Protection Act 2018
* Experience in Microsoft Purview Data Governance is essential
* Working knowledge of Profisee (MDM) tooling is required
* Understanding of financial regulations and regulatory reporting
* Auditing experience
* Knowledge of or skills in Data warehousing, Data Lake and Big Data solutions (understanding SQL would be useful)
* Knowledge of cloud-based big data frameworks such as data lakes, relational, graph and other NoSQL databases
* Familiar with Cloud and Data Management trends, including open source projects, methodologies (connect and collect, hub and spoke, data fabrics, etc.) and leading commercial vendors that relate to data acquisition, management and the semantic web
* Experience with Microsoft server technologies (Azure, T-SQL, SSIS, SSRS, Power BI) is desirable
* Understanding of Master Data Management technology landscape, processes and design principles
* Operational familiarity with metadata management, data quality, and data stewardship tools and platforms. Experience of Microsoft Purview is desirable.
* Data Lineage knowledge - ability to perform root cause analysis
* Proven track record in operating large Data Governance programs and managing enterprise data assets in a complex organisation
* Creating and implementing Data Governance frameworks and policies
* Experience using Data Governance & Data Quality systems and tools
* Experience querying databases using SQL is essential
* Experience with SQL Server (T-SQL, SSIS, SSRS, MDS) is desirable.
* Experience with Power BI
* Knowledge of data sources, transformation rules, and use of the data for the area of Data Stewardship
* Experience in the use of data catalogues and data quality technologies
* Experience of working within the financial sector
Job Details
Job Title: GCP FinOps Engineer
Location: Newport, UK
Key Responsibilities (individual contributor role)
Key Skills / Knowledge
Experience Required
Job Title: Data Engineer (BD&A - DAPM Live Service Support)
Max Rate: £430 per day inside IR35
Duration: 6 months
Location: Telford (hybrid, 2 days per week onsite)
Active SC security clearance is required for this role.
Job Description:
We are seeking an SC Cleared Live Support & Monitoring Engineer to provide operational support across a suite of data integration and analytics platforms. This role focuses on maintaining stability, enhancing monitoring capability, and improving service visibility through consolidated dashboards and intelligent alerting.
Responsibilities
Live Service Support
Provide ongoing live support across platforms including:
Denodo
Talend
Pentaho Data Integration (PDI)
Git
MySQL
Amazon Redshift
Investigate, diagnose and resolve incidents across data and integration services
Work closely with technical teams to maintain service availability and performance
Grafana Monitoring & Alerting
Design, create and consolidate Grafana dashboards
Transform multiple independent dashboards into a unified Live Service view with drill-down capability by service
Gather monitoring requirements from stakeholders
Configure and implement alerting for legacy services that currently lack monitoring
Deliver fit-for-purpose alert thresholds and notifications aligned to operational needs
Improve visibility, observability and proactive incident management
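The "fit-for-purpose alert thresholds" called for above follow the same pattern Grafana alert rules use: fire only when a metric breaches its limit for a sustained window, not on a single spike. A minimal sketch in Python (the metric names, thresholds, and window length are hypothetical examples, not taken from the advertised platform):

```python
# Minimal threshold-alert evaluator, mirroring the kind of rule Grafana
# applies: fire when a metric breaches its limit for N consecutive samples.
# All metric names and limits here are hypothetical.

from dataclasses import dataclass

@dataclass
class AlertRule:
    metric: str          # e.g. "etl_job_duration_s" (illustrative name)
    threshold: float     # fire when samples exceed this value
    consecutive: int     # require this many breaches in a row

def evaluate(rule: AlertRule, samples: list[float]) -> bool:
    """Return True if the rule should fire on this sample window."""
    streak = 0
    for value in samples:
        streak = streak + 1 if value > rule.threshold else 0
        if streak >= rule.consecutive:
            return True
    return False

rule = AlertRule(metric="etl_job_duration_s", threshold=300.0, consecutive=3)
print(evaluate(rule, [120, 310, 320, 330]))  # three breaches in a row -> True
print(evaluate(rule, [310, 120, 320, 120]))  # breaches never consecutive -> False
```

Requiring consecutive breaches is what keeps noisy legacy services from paging on transient spikes, which is the usual first step when uplifting services that previously had no monitoring.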
Experience & Skills
Essential
Active SC Clearance
Experience supporting live production environments
Exposure to data platforms such as Denodo, Talend, PDI, MySQL or Redshift
Experience creating or maintaining Grafana dashboards
Understanding of monitoring, alerting and service observability principles
Strong troubleshooting and analytical skills
Ability to gather requirements and translate them into monitoring solutions
Desirable
Experience configuring Grafana alerting
Experience working in a client-side environment
Knowledge of legacy system monitoring uplift
Familiarity with Git version control
Onsite Requirements: Remote
Start Date: ASAP
Role Duration: 1 year
Clearance Requirements: Active SC clearance
Inside IR35 - umbrella only
Role Description:
We’re looking for a Data Engineer whose main focus is understanding and documenting existing systems, with the goal of supporting decommissioning activities. The role centres on analysing current solutions built using Java, Node.js, and React, and developing a clear, end-to-end picture of how data flows across the wider programme.
This includes documenting data flows, system dependencies, and underlying data models, ensuring there is a clear record of how data is structured, stored, and used throughout the solution. The role involves investigating how systems are used on a day-to-day basis, clarifying ownership and integration points, and capturing this information in a way that supports risk assessment and decommissioning decisions.
Responsibilities:
Python and PySpark are required as supporting capabilities, used where needed to analyse data pipelines and confirm how data moves and transforms in practice. The role also requires strong experience with testing and data quality management, ensuring that documented data flows and models are accurate and trusted. Experience working in cloud environments such as AWS or Azure is expected, with Databricks considered a nice to have.
Required Skills:
Belmont Recruitment are currently looking for an experienced Civica CX Specialist to join Kirklees Council on an initial 3-6 month temporary contract. This is a full-time role working 37 hours per week, Monday to Friday.
Key Responsibilities
Essential Experience & Skills
Please apply with an up-to-date CV ASAP if this role would be of interest to you.
Data Analyst - ETL, Power BI, PACE, Databricks, SharePoint
Up to £500 per day (Inside IR35 - Umbrella)
My client is an international consultancy who require a Data Analyst with demonstrable ETL and data manipulation skills to play a key role in transforming data across multiple systems, using tools such as Power BI, Databricks and SharePoint as well as PACE.
Key Requirements:
Nice to have:
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)
Azure Data Engineer | Outside IR35 | £400 - £450 | 6 months | Hybrid Liverpool
You’ll be designing, developing, and managing data pipelines across Azure, primarily using Azure Data Factory to integrate multiple data sources and deliver streamlined workflows into Azure SQL. A strong grasp of Python will be essential, as you'll be transforming, cleansing, and validating complex datasets to ensure they’re accurate, efficient, and ready for analytics and product teams to leverage. This position plays a key role in supporting ongoing modernisation efforts by strengthening the way data is collected, processed, and made available across the business.
Responsibilities:
Build, maintain, and improve scalable ETL/ELT pipelines using Azure Data Factory
Model, manage, and optimise datasets within Azure SQL
Use Python and Pandas for data preparation, transformation, and quality checks
Work closely with engineering, product, and analytics teams to understand data requirements and deliver robust solutions
Maintain high standards around data integrity, reliability, and performance
Contribute to modernising data tools, workflows, and overall data infrastructure
What We’re Looking For:
Strong hands-on experience with Azure Data Factory and cloud-based data orchestration
Solid Azure SQL knowledge, including schema design and performance tuning
Proficiency in Python, with practical experience using Pandas for data manipulation
Comfortable delivering in environments where onboarding is minimal and pace is high
Background working with product-led or digitally focused teams is beneficial
Able to start immediately
If this role suits your skillset, you can work onsite 2 days per week, and you are immediately available, please apply for the job advert directly or reach out to me at (url removed).
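The "Python and Pandas for data preparation, transformation, and quality checks" work described above typically combines key validation, type coercion, and range flags. A brief sketch (the column names and the positive-amount rule are invented for illustration, not taken from the role):

```python
# Small Pandas cleansing pass of the kind described: drop rows with a
# missing key, coerce string columns to numeric types, and flag values
# that fail a validation rule. Columns and the rule are illustrative.

import pandas as pd

raw = pd.DataFrame({
    "order_id": [1, 2, None, 4],                 # one record missing its key
    "amount": ["10.5", "20.0", "5.0", "-3.0"],   # amounts arrive as strings
})

clean = raw.dropna(subset=["order_id"]).copy()    # keys must be present
clean["order_id"] = clean["order_id"].astype(int)
clean["amount"] = clean["amount"].astype(float)   # cleanse string amounts
clean["valid_amount"] = clean["amount"] > 0       # quality-check flag

print(len(clean))                      # 3 rows survive the key check
print(clean["valid_amount"].tolist())  # [True, True, False]
```

Keeping failed rows flagged rather than dropped (as with `valid_amount` here) lets downstream teams quantify data quality instead of silently losing records.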
Your new company
This company is a well-respected charity and registered social landlord dedicated to providing high-quality, affordable homes and supporting thriving communities across South Devon. The organisation now manages over 4,000 homes spanning areas from Dartmoor National Park to the urban centres of Teignbridge, the South Hams, West Devon, Mid Devon, East Devon, and Exeter. As a non-profit housing provider, they deliver a full range of housing and tenant support services, including property repairs and maintenance, rent management, financial advice, and neighbourhood development initiatives. Its work is focused on raising service standards, adapting to evolving community needs, and helping residents access opportunities that improve their wellbeing and quality of life. The organisation's vision, "Homes people love, a landlord you can trust", is reflected in its values of being friendly, accessible, collaborative, and committed to continuous improvement. The business works closely with tenants, partners, and local stakeholders to build sustainable, inclusive communities.
Your new role
The purpose of this role is to lead the development and operation of the organisation's data engineering capability, ensuring that data is efficiently extracted from source systems, transformed through robust ETL/ELT processes, and loaded into well-structured data models that support strategic reporting, analytics, and Business Intelligence. The post-holder will develop scalable pipelines, improve data quality, enhance data models, and strengthen data governance and standards across the organisation. Through collaboration, knowledge sharing, and the application of modern data engineering practices, the role will drive improvements in organisational data maturity and contribute to the successful implementation of the data strategy.
To work within the Company's Equality and Diversity Policy, Health and Safety Policy, Customer Service and Performance Policies, ensuring that these are complied with throughout all activities within the scope of this role to ensure the highest standards of customer care. Ensure that all activities undertaken are carried out to the highest standards of integrity and professionalism in accordance with the Company's policies and procedures. Bring your skills, your ideas and your initiative to the role.
What you'll need to succeed
To thrive in this role as a Data Engineer, you'll bring a strong blend of technical expertise, analytical capability, and a commitment to delivering high-quality data solutions that support organisational goals.
Qualifications & Background
You'll have: A relevant degree (or equivalent experience gained through progressively responsible data roles), or a formal professional qualification within Data Engineering.
Technical Experience
You'll need:
Experience as a Data Engineer or in a similar technical data role.
Experience designing, building, and maintaining ETL/ELT pipelines.
Hands-on experience with orchestration tools such as SSIS, Azure Data Factory, or similar.
Practical experience implementing data validation, exception reporting, reconciliation checks, and data quality rules.
Strong ability to transform, cleanse, and manipulate datasets using SQL/T-SQL.
Experience producing clear technical documentation, data flows, and specifications.
Ability to design and implement dimensional data models (e.g. star schema) including Slowly Changing Dimensions (SCD Type 1/2) for accurate historical reporting.
Desirable:
Experience in the Social Housing sector.
Exposure to BI tools such as Power BI or SSRS.
Experience with CI/CD, automated testing, or DataOps practices.
Knowledge, Skills & Abilities
You should demonstrate:
Strong SQL and T-SQL development skills, including performance optimisation.
A solid understanding of relational databases and technologies such as Azure SQL Database.
Strong problem-solving and analytical skills.
Awareness of cloud cost optimisation (compute, storage, pipeline efficiency).
Understanding of data lifecycle management.
Knowledge of RBAC and secure data access management principles.
Ability to gather, interpret, and translate stakeholder requirements into technical solutions.
Strong organisational skills and the ability to prioritise competing tasks.
Understanding of data ethics, privacy, confidentiality, and regulatory frameworks such as GDPR.
The ability to explain complex technical concepts to non-technical stakeholders.
Willingness to support and upskill colleagues in data engineering concepts.
A proactive, self-motivated approach and a commitment to high-quality service delivery.
Ability to identify gaps in your own knowledge and seek opportunities for professional development.
Desirable:
Knowledge of Python for data manipulation, profiling, and data quality tasks.
Core Competencies
Customer Focus: You seek to understand stakeholder needs and ensure outcomes meet expectations.
Communication: You keep others informed, build strong relationships, and are approachable and collaborative.
Critical Thinking: You challenge existing processes and contribute new ideas to improve outcomes.
Flexibility & Adaptability: You adopt practical approaches to deliver results in a changing environment.
Leadership & Ownership: You take responsibility for implementing actions that support organisational aims.
Teamwork: You work effectively with colleagues to achieve shared objectives.
What you'll get in return
Hybrid working model - 1 day a week in the office.
Pension - company contribution of 1.5x your contribution.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV.
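The Slowly Changing Dimension (Type 2) modelling mentioned in the role above preserves history by end-dating the current dimension row and inserting a new current row whenever a tracked attribute changes. A minimal in-memory sketch in Python (the `tenant_id`/`postcode` names are illustrative, not from the role; in practice this logic runs as a SQL MERGE or pipeline step):

```python
# Minimal SCD Type 2 upsert: when a tracked attribute changes, the current
# dimension row is end-dated and a new current row is appended, preserving
# full history for accurate point-in-time reporting.

from datetime import date

def scd2_upsert(dim_rows: list[dict], key: str, incoming: dict, today: date) -> None:
    """Apply one incoming record to an in-memory dimension table in place."""
    current = [r for r in dim_rows
               if r[key] == incoming[key] and r["is_current"]]
    if current:
        row = current[0]
        if row["attrs"] == incoming["attrs"]:
            return                      # no change: nothing to do
        row["is_current"] = False       # close out the old version
        row["valid_to"] = today
    dim_rows.append({key: incoming[key],
                     "attrs": incoming["attrs"],
                     "valid_from": today,
                     "valid_to": None,  # open-ended current row
                     "is_current": True})

dim: list[dict] = []
scd2_upsert(dim, "tenant_id", {"tenant_id": 1, "attrs": {"postcode": "EX1"}}, date(2024, 1, 1))
scd2_upsert(dim, "tenant_id", {"tenant_id": 1, "attrs": {"postcode": "TQ9"}}, date(2024, 6, 1))
print(len(dim))                        # 2 versions retained
print([r["is_current"] for r in dim])  # [False, True]
```

The `valid_from`/`valid_to` pair is what lets a report reconstruct the dimension as it stood on any past date, which is the "accurate historical reporting" the specification asks for.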
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at (url removed)
I am searching for a Data Engineer for an exciting and growing technology-focused business based in Exeter. The role requires you in the office 2 days per week, so you will need to live within a commutable distance of Exeter to be considered for the role, or be in a position to relocate to the area.
In this position you will be following agile methodologies for the design, development and acceptance of the data components for complex software solutions. Working closely with the Product Owner, you will gain a good understanding of customer requirements and knowledge of implementation processes to help solution scoping. You will be responsible for requirements analysis, specification definition, data analysis and project management, as required, to meet the needs of each solution. You will create production code and perform code reviews with the team - you will be equally comfortable working alone or in pairs (pair programming). I am looking to speak with candidates who use design patterns and adopt best practices, candidates who take responsibility for ensuring high-quality coding and development in their work.
To be a success in this role you will need to be skilled in a mixture of the following:
Databricks
Power BI
Python
TSQL
Extract Transform Load (ETL)
Analysis and design
Test Automation
Refactoring
Unit Testing (Mocking)
Agile
Scrum
Any experience working with PowerShell, Azure, AWS, Data Lakes or Zoho is highly desirable but is NOT essential. Experience of using AI environments to enhance productivity and efficiency through intelligent task management is also desirable (e.g. Copilot and ChatGPT). I am looking to speak with good communicators who like to work collaboratively within a diverse range of technical experts - this is a highly effective technology team.
The role comes with a competitive salary and an outstanding benefits package which includes an enhanced pension, medical and healthcare cover, a bonus, a good holiday allowance and much, much more!
Please note, to be considered for this role you MUST have the Right to Work in the UK long-term without company sponsorship. Our customer is not able to sponsor candidates for this opportunity.
KEYWORDS: Data Engineer, Databricks, Power BI, Python, TSQL, Extract Transform Load (ETL), Analysis and design, Test Automation, Refactoring, Unit Testing (Mocking), Agile, Scrum, PowerShell, Azure, AWS, Data Lakes, Zoho
Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. Bowerford Associates Ltd is acting as an Employment Agency in relation to this vacancy.
Job Title: Contract Data Analyst
Location: Hybrid, Occasional visits to North London Office
Contract Duration: 3 months
Company Overview: A medical equipment services organisation in the UK & Ireland, committed to delivering innovative solutions and exceptional service to our clients. We seek a skilled Data Analyst to join our team on a contract basis to support our ERP implementation project, migrating to Microsoft Dynamics 365.
Job Description:
Role Overview: The Data Analyst will help model and prepare data for migration to new systems. This will include the modelling of master data. The role will involve taking the lead with data cleansing.
The ideal candidate will have a strong background in data modelling, data cleansing, and de-duplicating data. This role will involve carrying out data migration as part of our ERP and Finance systems projects.
The business operates a medical equipment and consumables operation that includes sales, training, installation, and field service.
The company is growing rapidly and is currently in the (Apply online only) people range.
The project scope is to replace the current Field Service solution with D365 Field Service and implement D365 Business Central for Finance and Operations. Further project phases are under consideration for Commercial, Sales and Training.
Key Responsibilities:
Perform data modelling to structure and organise data effectively.
Cleanse and de-duplicate data to ensure accuracy and consistency.
Execute data migration tasks for ERP and Finance systems.
Map data sets to master data, including cleansing, enriching and transformation.
Build and optimise SQL queries for data extraction and manipulation.
Use Excel and Access to manipulate and analyse data.
Understand and work with relational databases.
Use tools to automate data cleansing processes.
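Automated de-duplication of the kind listed above usually starts by normalising the matching fields (case, spacing) before comparing records, so that near-identical entries collapse to one. A brief sketch (the customer field names are hypothetical, not from the project's master data):

```python
# Simple de-duplication pass: normalise the matching fields and keep the
# first record seen for each normalised key. Field names are hypothetical.

def normalise(value: str) -> str:
    """Lowercase and collapse internal whitespace for comparison."""
    return " ".join(value.lower().split())

def dedupe(records: list[dict], keys: tuple[str, ...]) -> list[dict]:
    seen: set[tuple[str, ...]] = set()
    kept = []
    for rec in records:
        fingerprint = tuple(normalise(rec[k]) for k in keys)
        if fingerprint not in seen:
            seen.add(fingerprint)
            kept.append(rec)
    return kept

customers = [
    {"name": "Acme  Ltd", "city": "London"},
    {"name": "acme ltd",  "city": "LONDON"},   # duplicate after normalising
    {"name": "Beta Ltd",  "city": "Leeds"},
]
print(len(dedupe(customers, ("name", "city"))))  # 2 unique customers remain
```

Real master-data cleansing typically layers fuzzy matching on top of this exact-after-normalisation pass, but the normalise-then-fingerprint step is the common starting point before an ERP migration.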
Skills and Experience:
Preferred Qualifications:
Job Title: GCP FinOps Engineer
Location: Newport, UK (Hybrid)
Contract Duration: 6 Months
IR35 Status: Inside IR35
Role Overview
We are seeking an experienced GCP FinOps Engineer to optimise cloud spend, performance, and operational efficiency across large-scale data analytics and containerised workloads. You will work closely with engineering, data, and product teams to embed cost-efficient architectures, enforce financial governance, and implement best practices across Google Cloud environments.
Responsibilities
Key Skills / Knowledge