You are an accomplished Python developer: you know your way around APIs, have a strong handle on supporting or configuring SaaS solutions, and you thrive on customer interaction. Are you ready to make your mark on future-proof software?
We are Preservica, and our groundbreaking active digital preservation solutions are at the cutting edge, eliminating the challenge of file obsolescence, data rot and more, and addressing the need for smart digital preservation technology. Our award-winning software is used by leading businesses, archives, libraries, museums and government organisations across the globe.
We are world leaders and proud of our achievements, but to stay ahead we need the brightest and most talented commercial and technical innovators to join our professional services team, and right now we are looking for a solid Technical Success / Integration Engineer.
About the Role
Working as part of the integration team with new customers and their IT teams across a range of commercial and government sectors, your key role will be the successful uptake and integration of their legacy data into Preservica's Active Digital Preservation solution.
You will be hands-on in the upload and ingest of large volumes of digitised and born-digital content, configuring roles and security, and integrating and mapping to catalogs and metadata standards.
Ideally with a background in either technical support or customer success within a records management or SaaS service environment, you will be familiar with real-life best practice workflows, ingest routines and data/metadata mapping, using custom scripts and APIs. Equally, you will have sound customer-centric skills and a positive can-do attitude.
Responsibilities:
Location:
This is a hybrid role, with monthly days in our Abingdon office.
Requirements
What We Look For:
What We Offer:
As our business continues to grow we believe in investing in our people and giving them the support and tools to keep us on track. As well as a competitive salary and benefits package, we offer tangible career development opportunities and dedicated training time to support professional growth.
Preservica is an equal opportunities employer.
Analytics Platform Engineer (Python, Kubernetes) - Secure Gov - Cheltenham - (RL8096)
Job Title - Analytics Platform Engineer (Principal & Senior)
Location - Cheltenham
Salary - Competitive
Benefits - Bonus and commission scheme, comprehensive benefits package including private medical and pension, flexible hybrid working, clear progression with funded training, and enhanced long-term incentives including additional leave and retention bonuses.
Work on analytics platforms that support highly sensitive, mission-critical programmes within a secure environment. This is an opportunity to build and scale modern data platforms while contributing to projects of national significance, alongside some of the strongest engineers in the sector.
The Client - We’re partnering with a leading organisation in the secure government sector to support the growth of a key programme delivering advanced data and analytics capabilities. This is a critical hire within an expanding team, focused on building and scaling platforms that underpin mission-critical solutions.
Operating at the forefront of data, cloud, and AI-driven innovation, they offer an environment where engineers can work on complex, high-impact challenges with real-world significance.
The Candidate - This would suit a candidate with a strong background in data or analytics platform engineering, who is comfortable working across both software development and infrastructure. You’ll enjoy solving complex technical challenges, working in dynamic environments, and collaborating closely with Data Scientists and MLOps teams. A pragmatic, adaptable mindset is key, along with a passion for building scalable, secure systems that enable data-driven outcomes. You should also be comfortable working in secure, highly regulated environments.
The Role - We are seeking Senior and Principal Analytics Platform Engineers to join a growing team delivering high-impact solutions within a secure environment. You will play a key role in designing, building, and evolving a modern analytics platform, supporting the full life cycle from development through to deployment and ongoing optimisation. This is a hands-on role offering exposure to a broad and evolving technology landscape. Due to the nature of the work, you will be operating within a highly secure environment with specific access requirements.
Key Duties:
Requirements:
Desirable:
To apply for this Analytics Platform Engineer permanent job, please click the button below and submit your latest CV.
Curo Services endeavours to respond to all applications, however this may not always be possible during periods of high volume. Thank you for your patience.
Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
Principal Grafana Network Analytics Engineer Edinburgh, UK | Fully Remote (UK-based)
Engineering | Cyber Security
£95,000 & Benefits
Ever wanted your dashboards to actually defend the internet?
This role does exactly that.
I'm recruiting on behalf of a high-growth cyber security technology company that protects some of the world's most critical online services from large-scale DDoS and application attacks. Their software sits right at the heart of customer networks, and when it works (which it must), entire businesses stay online. They're now expanding their world-class engineering team and are looking for a Principal-level Grafana expert to lead the charge on network and security analytics.
The Opportunity
This is not a keep-the-lights-on role. You'll own and lead the design and development of sophisticated, high-performance dashboards used to visualise real-time network and security data. You'll set technical direction, mentor engineers, and build analytics that help customers instantly understand and stop complex cyber attacks. You'll be trusted to work from first principles, influence architecture, and make decisions that genuinely matter.
What You'll Be Doing
What We're Looking For
Essential
Nice to Have
Location & Flexibility
Edinburgh-based engineers: hybrid working (typically 2 days in office)
Fully remote options available for the right candidate
Cutting-edge tech, complex data, and meaningful problems
A culture that values ownership, curiosity, and smart engineering
If you're a senior/principal engineer who loves data, networks, and building things that actually matter, this one's worth a conversation.
Bright Purple is an equal opportunities employer, and we are proud to work with clients who share our values of diversity in our industry.
We're seeking a Data Platform Engineer to design and build a modern enterprise data platform using Microsoft Fabric. You'll play a key role in architecting a secure, scalable and high-performance cloud environment that enables advanced analytics and data-driven decision making. This is a hybrid role with 1 day a week in the Milton Keynes office.
You will -
Essential Skills
An exciting opportunity to shape and deliver a strategic Microsoft Fabric data platform from the ground up.
Full Time, Permanent
Grade 7 (£38,784.49 - £46,048.78)
Abertay is a modern university with a global outlook, rooted in its local and national communities. We have made our mark with high-quality, well-directed teaching and research, and a stimulating and enriching experience for our students.
IT Services is a friendly, vibrant and fast-moving department with a focus on delivering excellent customer service and high-quality digital technology services to our staff and students.
Following a recent expansion, we are seeking to appoint a Data Engineer to join the team. This role will report to the Head of Enterprise Applications and requires significant experience in developing, implementing and supporting enterprise data solutions.
To be successful in this role, you will need:
This role benefits from hybrid working arrangements.
If you believe you have the skills and experience for this exciting and challenging role, please submit your application through our online recruitment system.
Please note that we will only accept applications through our online recruitment system.
Committed to Equal Opportunities
Abertay University is a Scottish Registered Charity, No: SC016040
Key Responsibilities:
• Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
• Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
• Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
• Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
• Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
• Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
• Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
• Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
• Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
• Ensure security, compliance, and best practices for data pipelines and observability platforms.
• Document data flows, schemas, dashboards, and operational runbooks.
Required Skills:
• Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/KSQL/KStream).
• Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
• Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
• Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
• Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
• Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
• Understanding of hybrid cloud and multi-cluster telemetry patterns.
• Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
• Good problem-solving skills and ability to work in a collaborative team environment.
• Strong communication and documentation skills.
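To give a flavour of the transform-and-enrich work described above, here is a minimal sketch of the validation step a Kafka consumer might apply to an OpenShift telemetry event before routing it on to Splunk. All field names, the tenant-routing rule, and the schema-version tag are illustrative assumptions, not details taken from this posting:

```python
import json

# Illustrative contract for an incoming telemetry event.
REQUIRED_FIELDS = {"cluster", "namespace", "metric", "value", "timestamp"}

def validate_and_enrich(raw: bytes) -> dict:
    """Validate a telemetry event and add a multi-tenant routing key.

    Raises ValueError on malformed events so the consumer can send
    them to a dead-letter topic instead of breaking the stream.
    """
    event = json.loads(raw)
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event missing fields: {sorted(missing)}")
    # Hypothetical tenant routing: the namespace prefix decides the tenant.
    event["tenant"] = event["namespace"].split("-", 1)[0]
    # Tag with a schema version to support backward/forward compatibility.
    event.setdefault("schema_version", 1)
    return event

sample = json.dumps({
    "cluster": "prod-eu1", "namespace": "payments-api",
    "metric": "cpu_usage", "value": 0.42, "timestamp": 1700000000,
}).encode()
print(validate_and_enrich(sample)["tenant"])
```

In a real deployment this function would sit behind a Kafka consumer loop and the schema contract would live in a schema registry (Avro/Protobuf) rather than a Python constant.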
Glasgow City Council’s Summer Internship Programme will be available from Monday 8 June 2026 – Friday 28 August 2026, inclusive.
Applicants must be available for the full duration of the placement.
The intern will work 35 hours per week and rate of pay will be the Glasgow Living Wage.
Interns will work for 12 weeks, during which time they will accrue 6 days' leave; payment for this is included in their salary, so it must be taken during their 12-week placement.
Applicants are required to be available between week commencing 23 March 2026 and Thursday 2 April 2026 for interview.
The intern will support the development of enhanced Business Intelligence (BI) reporting to strengthen performance monitoring, governance and audit assurance within the Directorate.
Key Responsibilities
• Review and analyse existing BI dashboards and underlying data sources across Education Services
• Work with officers to define and agree key performance indicators (KPIs) aligned to Directorate priority committee reporting and Internal Audit requirements.
• Design and develop a consolidated BI dashboard or KPI-based performance report
• Test outputs with key stakeholders, incorporating feedback and ensuring data accuracy and usability
• Produce clear documentation and support handover to ensure outputs can be maintained and refreshed beyond the project period.
Eligibility criteria
• Must live within the Glasgow City Council boundary
• Have the right to live and work in the UK
• Be in the year of study specified in the advert
For more information, please see the attached Recruitment Outline and Person Specification, or visit our website https://www.glasgow.gov.uk/summerinternship.
We want everyone to be able to apply. If you need the Application Pack in another format, like Braille, large print, or another language, please call us on 0141 287 1054.
If we need to post it to you, we’ll send it by second-class mail within three working days. Please allow enough time to complete and return your application before the closing date. If you think you might need more time because of accessibility needs, please get in touch and we’ll be happy to help.
There are also a number of Accessibility Tools compatible with the myjobscotland website which may assist you with your application. More information on these can be found at https://myjobscotland.gov.uk/accessibility-statement.
Please note that Glasgow City Council is currently completing a Job Evaluation exercise and introducing a new pay and grading structure which may impact on current salaries quoted in job adverts; see Working for Us\Job Evaluation
For further information about working for us please refer to our website GCC HR Policies
Role Title: Lead Data Engineer
Location: Sheffield/hybrid (3 days on site)
Duration: 9 months
Rate: £430 per day inside IR35
We are seeking a Lead Data Engineering Consultant with proven experience in leading and developing data engineering platforms.
Experience required:
• Extensive enterprise experience with Hadoop, Spark, and Splunk.
• Proficiency in object-oriented and functional scripting, particularly in Python.
• Skilled in handling raw, structured, semi-structured, and unstructured data (SQL and NoSQL).
• Experience integrating large, disparate datasets using modern tools and frameworks.
• Strong background in building and optimizing ETL/ELT data pipelines.
• Familiarity with source control and implementing Continuous Integration, Delivery, and Deployment via CI/CD pipelines.
• Experience supporting and collaborating with BI and Analytics teams in fast-paced environments.
• Ability to pair program and work effectively with other engineers.
• Excellent analytical and problem-solving abilities.
• Knowledge of agile methodologies such as Scrum or Kanban is a plus.
• Comfortable representing the team in standups and problem-solving sessions.
• Capable of driving the creation of technical test plans and maintaining records, including unit and integration tests, within automated test environments to ensure high code quality.
• Promote SRE (Site Reliability Engineering) culture by addressing challenges through data engineering.
• Ensure service resilience, sustainability, and adherence to recovery time objectives for all delivered software solutions.
Soft Skills (Consultant):
• Demonstrated ability and enthusiasm for enhancing team performance.
• Strong active listening and effective communication skills.
• Self-mastery, with a focus on positive mindsets and professional behaviours.
• Maintains up-to-date expertise in current tools, technologies, and key areas such as cybersecurity, data privacy, consent, and data residency regulations.
• Engages with industry groups and external vendors to represent and advance HSBC's interests and influence.
• Takes accountability for ensuring control and compliance throughout the engineering process.
• Champions innovation and the adoption of advanced technologies and best practices within the domain.
If you are interested in this role or wish to apply, please feel free to submit your CV.
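The posting above asks for pipelines backed by unit and integration tests in automated test environments. A minimal sketch of what that looks like in practice is keeping transforms as pure functions so they can be exercised without a cluster; the column names and the dedupe rule here are illustrative assumptions, not details from the role:

```python
import csv
import io

def transform(rows):
    """Cast raw CSV rows and drop duplicate 'id's, keeping the first seen.

    A pure function like this is trivial to cover with unit tests,
    independently of the Hadoop/Spark runtime it would feed in production.
    """
    seen, out = set(), []
    for row in rows:
        key = row["id"]
        if key in seen:
            continue  # duplicate record: skip
        seen.add(key)
        out.append({"id": key, "amount": float(row["amount"])})
    return out

# Hypothetical raw extract with one duplicate row.
raw = io.StringIO("id,amount\n1,10.5\n2,3.0\n1,10.5\n")
result = transform(csv.DictReader(raw))
print(result)
```

The same dedupe-and-cast logic would translate directly into a Spark DataFrame expression; factoring it out first is what makes a CI/CD pipeline's automated tests cheap to run.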
Role: AI Developer
Location: London - up to 3 days per week on-site
Duration: 3 Months
Day rate: £540 - £585 Umbrella Only
Minimum of 5 years UK residency required
Role Description:
We’re looking for an experienced AI developer with the skills below. They’ll be building out our client’s AI needs and wants using a Microsoft Azure and Copilot stack.
Essential skills and experience
Data Analyst - Renewable Energy & Asset Optimisation
Noriker Power develops and optimises rapid response power systems. Through innovation and a strong motivation to protect the environment, Noriker has grown into a vertically integrated developer and service provider in this increasingly important market.
We are looking for a Data Analyst with a strong foundation in Mathematics or Physics to help us optimise the performance of our renewable energy portfolio. As we scale our BESS assets, we need a practical analyst who can distil domain data into commercial results.
Key Responsibilities
Requirements
Why Join Noriker Power?
PLM Data Analyst Opportunity
Are you an experienced PLM Data Analyst with a background in aerospace and defence? Join our client’s team on a contract basis to participate in advanced projects at the forefront of the industry. This exciting opportunity involves working with innovative tools and technologies, helping to shape the future by leveraging your expertise in PLM systems.
Role Overview
As a PLM Data Analyst, you will play a key role in analysing existing CATIA V5 PLM data, such as CAD, metadata, and structures. You’ll support data mapping activities from CATIA V5 to the 3DEXPERIENCE (3DX) data model and contribute to the seamless integration of PLM object models. This role is especially suited to someone with a strong understanding of parts, products, documents, and BOMs within the ENOVIA ecosystem.
Key Skills and Responsibilities
Join a dynamic sector and contribute to a leading client’s innovative projects. If you’re looking for a challenging and rewarding role, apply today to bring your skills to our client’s esteemed team.
Please visit our website to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.
To find out more about Computer Futures please visit our website
Computer Futures, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy | Registered office | 8 Bishopsgate, London, EC2N 4BQ, United Kingdom | Partnership Number | OC387148 England and Wales
£400 per day - inside IR35
Are you an expert in Natural Language Processing who thrives on building scalable, real-world AI solutions? We are seeking a hands-on Data Scientist to join a premier global credit ratings and financial information firm. You will be a key player in launching a brand-new, from-scratch analytics platform designed for elite institutional clients including corporate banks and asset managers.
The Opportunity
In this role, you will go beyond conventional boundaries to design, build, and deploy quantitative models that power advanced insights. You will collaborate with a cross-domain team of economists, political scientists, and developers to transform proprietary risk data into actionable strategic assets.
Your Impact
Your Experience
TM1 Developer - Warton/Samlesbury (Hybrid - 1 day p/w onsite) - Competitive Salary + Bonus & Overtime
My client, a multinational Defence organisation, are looking for a TM1 Developer to join either their Warton or Samlesbury site, working on a hybrid basis 1 day per week onsite.
What you’ll be doing:
Your skills and experiences:
To apply for this role, please send your CV to Peter Bibby on the email address below
Data Engineer - Telford (2 days onsite) - £393 per day inside IR35 - 6 months
We are looking for an ideally SC Cleared Data Engineer, or one who is eligible for clearance.
This developer role will be primarily working on Talend and Oracle RDS systems, within our existing Talend framework and patterns. Experience of ETL tooling will be needed, preferably Talend, but Pentaho/Informatica experience will be transferable. Experience working in Oracle RDS databases will also be required.
Must have:
• Data ETL product experience - Talend preferred
• Oracle RDS
Nice to have:
• SQL
• AWS
• GenAI
*Damia Group Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept our Data Protection Policy which can be found on our website.*
*Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job.*
*Should the role require it, the successful candidate must undergo and be eligible for UK Security Vetting. Clearance sponsorship will be provided where required. Due to the nature of the work, candidates should meet the relevant residency requirements. If applicable, Reserved Post nationality restrictions will be confirmed by the client. Damia is committed to inclusive recruitment and welcomes applicants from all backgrounds.*
*Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.*
Role: Senior Data Engineer
Background: Leveraging data analytics to provide insights and recommendations to drive strategic decision-making, collaborating with cross-functional teams, including Finance, Accounting, Operations, HR, and others, to deliver accurate and timely financial reporting, dashboards, analytics, and data-driven insights.
Key Accountabilities
A Senior Data Engineer (Production Support) will be responsible for monitoring, maintaining, and supporting ETL processes, data pipelines, and data warehouse environments. The ideal candidate should have strong troubleshooting skills, hands-on experience with ETL tools, and the ability to quickly resolve production issues to ensure data availability, accuracy, and reliability.
• Monitor and support daily ETL processes, data pipelines, and batch jobs to ensure timely and accurate data delivery.
• Troubleshoot and resolve production issues, job failures, and performance bottlenecks across ETL and data warehouse systems.
• Work closely with the Data Platform team to resolve data load issues.
• Perform root cause analysis of recurring issues and implement permanent fixes.
• Collaborate with development teams to transition projects smoothly into production and ensure operational readiness.
• Implement and maintain monitoring, alerting, and logging solutions for proactive issue detection.
• Ensure data quality, consistency, and availability through ongoing validation and health checks.
• Apply best practices for production support, including incident management, change management, and problem management.
• Work closely with business users, data analysts, and other stakeholders to resolve data-related queries.
• Document runbooks, support procedures, and knowledge base articles to streamline production operations.
• Continuously optimize processes for reliability, performance, and scalability in production environments.
• Ensure compliance with data security, access controls, and audit requirements in production systems.
Day-to-Day Tasks - Senior Data Engineer (Production Support)
Production Support:
• Check system dashboards, logs, and alerts for failures or anomalies.
• Verify data quality and integrity checks (row counts, duplicates, missing data, schema changes).
• Review ETL/ELT job runs, data pipeline executions, and batch processes.
• Validate data loads into staging, warehouse, and downstream systems for critical tables.
• Monitor real-time and scheduled jobs to ensure SLAs are met.
• Investigate and resolve production issues (job failures, data inconsistencies, performance delays).
• Collaborate with business users to resolve data access or reporting issues.
• Coordinate with development/engineering teams for fixes, hot patches, or re-runs of failed jobs.
• Track and document incidents, resolutions, and preventive measures in ticketing systems (e.g., ServiceNow, Jira).
• Participate in daily/weekly operations meetings to report status and highlight issues.
• Hand over critical ongoing issues to on-call/offshore support (if applicable).
Minor Works/Maintenance:
• Enhance existing models with the addition of fields as per the requirements.
• Help with deployments and initial loads during go-live.
• Perform root cause analysis for recurring or high-severity incidents.
Proactive/Preventive Work:
• Fine-tune ETL workflows and SQL queries to improve performance.
• Implement monitoring scripts and automation to reduce manual intervention.
• Restructure the load plans to improve efficiency.
• Review security and access controls to ensure compliance.
• Update documentation (runbooks, troubleshooting guides, SOPs) for operational continuity.
Skills and Capability requirements:
• 6+ years of experience with ETL, data pipelines, and data warehouse production environments.
• Strong expertise in troubleshooting ETL/ELT processes using tools such as Matillion, Informatica, ODI, or SSIS.
• Experience in cloud-based data platforms like Snowflake.
• Proven ability to analyse job failures, perform root cause analysis, and implement permanent fixes.
• Hands-on experience with monitoring, alerting, and logging tools.
• Familiarity with incident, problem, and change management processes in ITIL-based environments.
• Strong SQL programming and debugging skills with relational and cloud databases.
• Experience with traditional and non-traditional forms of analytical data design (Kimball, Inmon, etc.)
• Excellent communication skills to interact with business users, analysts, and cross-functional technical teams.
Nice to Have
• Domain knowledge in the area of finance data is preferred.
• Experience with SAP systems and databases
• Knowledge of data visualization tools, such as Power BI or Tableau
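The daily checks listed above (row counts, duplicates, missing data) are easy to picture as a small validation step run after each load. This is a minimal, tool-agnostic sketch in plain Python; the field names and thresholds are illustrative assumptions, and a real Matillion/Informatica/Snowflake setup would run the equivalent as SQL checks or a monitoring script:

```python
def quality_report(rows, key, required):
    """Daily-check style report: row count, duplicate keys, missing fields."""
    seen, dupes, missing = set(), 0, 0
    for row in rows:
        k = row.get(key)
        if k in seen:
            dupes += 1  # duplicate business key detected
        seen.add(k)
        # A row "misses fields" if any required field is None or empty.
        if any(row.get(f) in (None, "") for f in required):
            missing += 1
    return {
        "rows": len(rows),
        "duplicate_keys": dupes,
        "rows_missing_fields": missing,
    }

# Hypothetical batch with one duplicate id and one empty name.
batch = [
    {"id": 1, "name": "a", "amount": 10},
    {"id": 2, "name": "", "amount": 5},
    {"id": 1, "name": "c", "amount": 7},
]
print(quality_report(batch, key="id", required=["name", "amount"]))
```

In production the report would be pushed to the monitoring/alerting layer so SLA breaches page someone rather than sitting in a log.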
BI Developer (Power BI | Azure | SQL)
Hybrid - Wolves based office 3 days per week
Are you a data-driven problem solver who loves turning complex data into clear insights? We’re looking for a skilled BI Developer to join our client’s growing team.
What you’ll do:
What we’re looking for:
BI Developer - apply ASAP if interested. GleeIT
At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a disability confident employer.
By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.
Job Title: Data Engineer (AEOI)
Rate: £430 per day inside IR35
Duration: 6 months
Location: Telford/hybrid (2 days onsite)
SC security clearance is required for this role.
We're hiring an ETL Developer to support a major government AEOI programme covering Pillar R7, ETR Exchange, NTJ Exchange and CRS Outbound Exchange. Due to growing demand, new teams are being stood up and existing teams expanded to deliver critical data exchange services.
Job Description:
Project - AEOI Projects - Pillar R7/ETR Exchange/NTJ Exchange/CRS Outbound Exchange
Demand in the AEOI programme space is expected to increase, necessitating the stand-up of an additional team and the expansion of existing teams to support. This developer role will be primarily working on Talend and Oracle RDS systems, within my client's existing Talend framework and patterns.
Experience required:
• Experience of ETL tooling will be needed, preferably Talend, but Pentaho/Informatica experience will be transferable.
• Experience working in Oracle RDS databases will also be required.
• Data ETL product experience - Talend preferred
• Oracle RDS
Nice to have:
• SQL
• AWS
• GenAI
If you are interested in this role, please feel free to submit your CV.
Candidates must be based in UK and eligible to work in UK.
Remote-Friendly Software Development Opportunity in Agricultural Technology
Join an award-winning agricultural analytics software company at the forefront of sustainable technology. We’re seeking exceptional Back-End Developers who can transform data into impactful solutions that enhance food production, farm economics, and environmental resilience.
Position Highlights:
Candidate Assessment Process:
Ideal Candidate Profile:
Academic and Professional Requirements:
Technical Expertise:
Preferred Background:
Personal Attributes:
What We Offer:
Exeter, Devon (Hybrid 2 days per week in office)
About Us
At FDB (First Databank), we create and deliver the world's most trusted drug knowledge, enabling healthcare professionals to make critical decisions that improve patient safety, efficiency, and outcomes. Our solutions are embedded across hospitals, GP practices, pharmacies, and wider healthcare systems, supporting millions of patients every day.
Our values guide everything we do: Better Together, Clear Expectations, Constantly Curious, and Health at the Heart. If these resonate with you, you'll feel right at home with us.
The Opportunity
We are now looking for an experienced Data Engineer to join us on a full-time, permanent basis.
Working within Agile teams and collaborating with a range of experts, you'll have the chance to utilise your skills and build solutions that genuinely make a difference.
What's more, with hybrid working, a strong focus on wellbeing, an annual bonus scheme and a comprehensive benefits package, you'll have the flexibility, recognition and backing to do your best work while continuing to develop your expertise.
So, if you want to be part of building innovative solutions that support millions every day, read on and apply today!
The Role
As a Data Engineer, you will design and develop high-quality data solutions that support innovative software products aimed at improving health and environmental outcomes.
Working within Agile methodologies, you will collaborate closely with the Product Owner and a wide range of technical and subject matter experts to understand customer requirements and shape effective, scalable data components.
You will undertake requirements analysis, solution scoping, specification definition and data analysis, while challenging assumptions and defining appropriate acceptance criteria to mitigate risk.
Through the creation of production code and participation in code reviews, you will apply established design patterns and best practices to ensure performance, data quality, security, robust error handling, monitoring and logging.
Additionally, you will:
About You
To be considered as a Data Engineer, you will need experience using AI environments and good verbal and written communication skills, including presentation skills, as well as experience with the following:
You will also need some experience with Azure / AWS, PowerShell, Data lakes and Zoho Creator / Analytics.
The Benefits
You will be joining a very supportive team where you will have the opportunity to grow and develop new skills. In addition, FDB offers:
Other organisations may call this role Software Engineer, Data Module Developer, BI Engineer, Business Intelligence Engineer, Power BI Engineer, Python Developer, Python Programmer, R Developer, Python Engineer, or IT Data Engineer.
Data Manager - Birmingham (hybrid) - £70,000 PA
Opportunity for a Data Manager to join a well-known organisation undergoing significant technology transformation: a reputable, complex organisation with numerous sites, providing services to hundreds of thousands across the country. You'll be joining at a particularly exciting time for the business.
Reporting directly to the Head of IT, you'll be responsible for establishing and leading an enterprise-wide data management capability within a regulated, operationally complex environment. This is a key role responsible for ensuring organisational data is accurate, trusted, secure and fit for operational, regulatory and strategic decision-making, spanning data strategy, governance, architecture, engineering, reporting and analytics.
Key Responsibilities:
• Build and deliver an enterprise data strategy, aligned to business objectives and measurable outcomes
• Establish robust data governance, ownership, standards, quality controls and prioritisation
• Lead the development of target data architecture, including warehousing, modelling, integrations and pipelines
• Oversee data integrity, security, availability and compliance (including GDPR / Data Protection)
• Manage delivery through internal teams and external partners, including procurement and supplier management
• Recruit and lead a small team (up to 3 data engineers / BI analysts) over time
• Work closely with stakeholders to deliver timely, accurate reporting and actionable insights
• Drive continuous improvement through data quality metrics, audits and process optimisation
Skills & Experience:
• Strong experience in enterprise data management, governance and architecture
• Excellent knowledge of Microsoft data platforms (Power Platform, Microsoft Fabric, Azure data technologies)
• Confident communicator able to translate complex data concepts for senior/non-technical stakeholders
• Experience in regulated, asset-intensive or safety-critical sectors
Salary up to £70,000 PA. The role offers excellent benefits, including free/heavily discounted public transport travel, 25 days holiday (+ bank holidays) and an excellent pension scheme.