Infrastructure Data Engineer - Kafka Focus
Fully Remote | Outside IR35
Working pattern: Sunday to Thursday, supporting a team based in Israel

Exalto Consulting is supporting a client looking to appoint an experienced Infrastructure Data Engineer with strong Apache Kafka expertise. This is a fully remote contract role for someone who has hands-on experience running and improving Kafka in live production environments. The focus is not on initial setup alone: it is on keeping a high-volume streaming platform reliable, scalable, and well tuned over time.

You would be joining a team responsible for maintaining and developing core data infrastructure that supports a range of business-critical use cases. The work includes improving platform performance, strengthening reliability, and helping internal teams make effective use of streaming capabilities.

What the role involves
You will take ownership of the day-to-day operation and improvement of Kafka infrastructure in production. That includes identifying performance issues, tuning configurations, resolving incidents, and building out the surrounding platform components needed to support a dependable streaming environment. The role also involves close collaboration with engineering, data, and analytics teams, so it is important that you are comfortable working across functions and helping others use the platform effectively.

Key responsibilities
Operate, maintain, and optimise Kafka clusters in production environments
Design and improve Kafka infrastructure to support a range of streaming use cases
Tune Kafka settings including partitions, replication, retention, and throughput
Monitor platform performance and address bottlenecks, instability, and capacity issues
Build and support related components including Kafka Connect, Schema Registry, and ELK
Develop internal tooling and microservices to support self-service platform use
Improve monitoring, alerting, and observability using Prometheus and Grafana
Investigate production incidents, carry out root cause analysis, and make preventative improvements
Work with engineering, data, and analytics teams to ensure reliable data delivery

What we are looking for
Proven experience running Kafka in live production environments
Strong understanding of Kafka internals and the trade-offs involved in scaling and performance tuning
Experience with Kafka Connect and Schema Registry
Good scripting and automation skills, ideally using Python
Experience with Elasticsearch and Kibana
Strong knowledge of Linux environments, shell scripting, and system performance tuning
Experience with Docker and Kubernetes, or similar container and orchestration tooling
Experience with CI/CD, Git, and infrastructure-as-code tools such as Terraform or Ansible
A solid understanding of distributed systems and streaming architectures
Experience supporting platforms where availability, resilience, and scale are important

Desirable experience
Exposure to stream processing tools such as Apache Flink
Experience with cloud platforms including AWS, Azure, or GCP
Knowledge of hybrid environments
Experience with RBAC, multi-tenant systems, or usage metering
Experience with MSSQL or other relational databases

Working arrangement
This role is fully remote and sits outside IR35. The team is based in Israel, so the working pattern is Sunday to Thursday. There is a two-hour time difference from the UK, and candidates should be comfortable working in line with that team structure.

Important note
This role requires genuine hands-on experience operating and optimising Kafka in production. Candidates whose experience is mainly limited to implementation, provisioning, or setup without ongoing production ownership are unlikely to be the right fit.

To find out more, please get in touch with Exalto Consulting with a copy of your latest CV.
Do you have a strong background in software development? Would you like to become part of a growing community of data and digital professionals, champion excellence in AI, data, and digital, and help grow data and digital skills across HM Treasury? If so, we would love to hear from you!

About the Team
The Chief Secretary to the Treasury (CST) has outlined the government's ambition to rewire the state – see the Institute for Government speech. Central to this vision is more collaboration and transparency between departments and the centre of government on spending, requiring a greater level of sharing and harmonising of key data sets (finance, outcome and performance data). To meet the spending challenges of the future, HM Treasury is committed to developing an integrated data solution which will enable a single version of the truth, providing real-time, standardised data on finance, outcomes and performance. This will allow for greater autonomy for departments, more open conversations between departments and HMT, and more effective, data-driven decision making, ultimately leading to better outcomes for the public.

The Finance and Performance Data Integration Service (FPDIS) is a key part of the Government's ambition to rewire the state. The new compact between departments and the centre requires more and better data, and this programme is the means by which the Treasury will get that data. The team is building, and every role will bring vital perspectives and insight to the programme. We are currently developing our approach, business case and early thinking about what the future could look like. You would be joining us at the start of an exciting journey.

About the Job
The key responsibilities of the post holders will be:

Technical Leadership
Lead the end-to-end technical design, development, and implementation of AI solutions. This will involve development and maintenance of analytic products in our preferred tech stack (Python, Plotly Dash and Azure) and experimentation with and use of other applications.
Provide technical guidance and mentoring to data engineers, analysts and non-technical staff working on the broader FPDIS programme.
Document AI architectures, models, and agent behaviours to ensure transparency, governance, and continuous improvement.

Solution Design & Delivery
Lead technical delivery of an experimental Agile project to extract finance and performance data from PDFs and other documents.
Identify opportunities to apply AI to optimise public spending business processes, improve user experiences and deliver public value.
Integrate AI capabilities with enterprise platforms and services, including low-code environments, APIs, data pipelines, and cloud-based data integration platforms.

Technology Evaluation
Assess and select appropriate AI models, platforms, and tools (e.g. OpenAI, Copilot Studio).
Stay current with emerging AI technologies, particularly developments in agent-based systems, and evaluate their applicability to FPDIS.

Collaboration & Partner Engagement
Work closely with partners to translate business needs into AI-enabled solutions, incorporating agent-based architectures where appropriate.
Support the training and upskilling of HMT staff in AI literacy, responsible use of intelligent systems, and adoption of AI-enabled tools.

Governance & Compliance
Ensure all AI solutions are ethical, secure, and aligned with HMT's strategic objectives, regulatory obligations, and wider DSIT guidance.

About You
Technical leadership of applied AI projects
Design of AI solutions to business problems
Technical application of LLMs in a digital product
Working effectively as part of a team

Some of the Benefits our people love!
25 days annual leave (rising to 30 after 5 years), plus 8 public holidays and the King's birthday (unless you have a legacy arrangement as an existing Civil Servant). Additionally, we operate flexitime systems, allowing employees to take up to an additional 2 days off each month.
Flexible working patterns (part-time, job-share, condensed hours)
Generous parental and adoption leave packages
Access to a generous Defined Benefit pension scheme with employer contributions of 28.97%
Access to a cycle-to-work salary sacrifice scheme and season ticket advances
A range of active staff networks, based around interests (e.g. analysts, music society, sports and social club) and diversity

For more information about the role and how to apply, please follow the apply link. If you need any reasonable adjustments to take part in the selection process, please tell us about this in your online application form, or speak to the recruitment team.
Bristol
Who We Are
ACH is a social enterprise with a clear vision, dedicated to empowering refugees and migrants in the UK to lead self-sufficient and ambitious lives. We bring together a diverse team of strategists and researchers, driven by their own lived experiences, to provide tailored integration services.
Why We Do What We Do
Our mission goes beyond individual support; we actively challenge and disrupt the systems that perpetuate inequalities in our society. With a proven track record of delivering effective support services, in 2024, we helped house over 1,400 individuals and provided support to over 500 of our service users, helping them to achieve their personal goals and lead fulfilling lives in their new country.
We are now looking for a Data and Systems Analyst to join us on a full-time basis, for an 18-month fixed-term contract.
Our Commitment to You
This is a purpose-driven opportunity for a data and systems professional with strong Power BI and reporting expertise to join our mission-focused organisation.
You'll get the chance to see your work make a tangible difference, shaping data across our organisation so leaders and teams can make better decisions that directly support refugees and migrants to thrive.
Alongside meaningful work, you'll benefit from an employment package designed to give you the flexibility, security and support to do your best work while being part of a genuinely people-centred organisation.
What You'll Be Doing
As a Data and Systems Analyst, you will turn organisational data into clear insight while shaping and supporting the IT systems that help us deliver our objectives.
Working closely with managers, team leaders and the IT team, you will develop and maintain key performance reports and data visualisations to support performance management and decision-making.
You'll analyse business processes, capture new system requirements, and help design reporting, forms and dashboards that improve how data is recorded, understood and used across the organisation.
You will also act as a champion for data integrity and effective system use, supporting staff at all levels to build confidence with reporting tools and ensuring we meet regulatory and governance expectations.
Additionally, you will:
What We're Looking For
To be considered as a Data and Systems Analyst, you will need:
The closing date for this role is 5th April 2026.
Other organisations may call this role Data Analyst, Systems Analyst, Business Intelligence Analyst, BI Analyst, Reporting Analyst, Data and Reporting Analyst, Performance Analyst, Data Insights Analyst, or Business Systems Analyst.
Webrecruit and ACH are equal opportunities employers, value diversity and are strongly committed to providing equal employment opportunities for all employees and all applicants for employment. Equal opportunities are the only acceptable way to conduct business and we believe that the more inclusive our environments are, the better our work will be.
So, if you want to grow your career while contributing to life-changing work as a Data and Systems Analyst, please apply via the button shown. This vacancy is being advertised by Webrecruit. The services advertised by Webrecruit are those of an Employment Agency.
AWS Architect (£800-£900 per day - 12-month contract)

We're partnering with a leading global organisation in the financial data space to find a hands-on AWS Architect. This is a high-impact role focused on building a cutting-edge, cloud-native research environment for advanced analytics, quantitative research, and AI-driven innovation. If you thrive at the intersection of cloud architecture, data science platforms, and AI services, this is an opportunity to shape a next-generation research ecosystem used by top-tier data scientists and researchers.

The Role
As the AWS Architect, you will lead the end-to-end design and delivery of a scalable, secure, and highly connected research platform on AWS. This environment will empower teams to work with large, complex datasets and accelerate experimentation and insights.

Key Responsibilities
Architect and deliver a cloud-native research platform on AWS
Design environments supporting Jupyter Notebooks and Amazon SageMaker
Integrate Amazon Bedrock and emerging AI services to enhance research workflows
Build scalable data access and connectivity frameworks for structured and unstructured datasets
Develop Model Context Protocol (MCP) services to enable seamless data and notebook integration
Implement semantic search and discovery capabilities (e.g. OpenSearch)
Define best practices across architecture, security, and scalability
Collaborate closely with engineering, data, and research stakeholders

What We're Looking For
Deep, hands-on AWS expertise (architect-level with strong delivery experience)
Proven track record building data science platforms or research environments
Strong grounding in data engineering principles (pipelines, governance, data access)
Experience integrating AI/ML services, including Amazon Bedrock
Ability to design and expose APIs/services for data and notebook interoperability
Experience working in enterprise or regulated environments

Nice to Have
Exposure to emerging AWS AI tools (e.g. Amazon Q, Trainium, Bedrock Agents)
Experience with agentic AI or MCP-compatible services
Knowledge of OpenSearch or similar discovery tools
Background in financial services, capital markets, or data-heavy industries

Why Apply?
Work on a greenfield, high-impact platform
Collaborate with leading data scientists and researchers
Shape the future of AI-enabled research environments
Competitive compensation and flexible working
We’re partnering with a leading hospitality and members-club group to hire an experienced Power BI Developer on a 3-month contract. This is a great opportunity to deliver high-impact reporting and shape how the business uses data across membership, F&B, and club operations.
The role sits within the FP&A function, working closely with the Interim CFO, and requires someone who is both technically strong and commercially sharp.
The Opportunity
You’ll be responsible for building a full suite of Power BI dashboards that will become core to the business’s decision-making. This includes reporting across:
The business is moving towards a more modern data environment, so you’ll be developing reports directly from Microsoft Fabric rather than Excel, with plenty of autonomy to shape the approach.
What You’ll Be Doing
About You
We’re looking for someone with:
Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates
Experience
12+ years of experience in Data Engineering, Data Warehousing, Cloud Data Platforms, and Enterprise Analytics solutions, with strong expertise in modern cloud data architectures.
Job Summary
We are seeking an experienced Data Architect with strong expertise in Snowflake on Amazon Web Services and dbt to design, architect, and optimise scalable enterprise data platforms.
The role involves defining data platform architecture, governance standards, and scalable data transformation frameworks, while ensuring high performance, security, and cost efficiency. The architect will provide technical leadership to data engineering and analytics teams and ensure the platform supports enterprise reporting, advanced analytics, and AI/ML initiatives.
The ideal candidate should also have exposure to AI/ML data platforms and experience in the hospitality domain, supporting systems such as reservations, guest management, and operational analytics.
Key Responsibilities
Required Technical Skills
Data Platform
Cloud Platform
Strong experience with Amazon Web Services, including:
Data Transformation
Programming / Query
Data Engineering
AI / Data Science Exposure
Preferred Skills
Education
Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, Data Science, or a related field.
This is a chance for a well-qualified and experienced DBA to use their wider experience to step into a key role in our core data team. In a fast-moving and intellectually challenging environment, our goal is to improve overall user satisfaction through product design and delivery.
The system operates on a complex set of SQL Server databases across many different customer environments (including AWS and Azure). Working closely with the Head of Data Services, you will be responsible for a wide scope of database support functions including design, support, performance, and reporting.
The role requires:
A leading Financial Services organisation undergoing a large-scale data transformation is looking to hire an experienced Data Quality Manager on a permanent basis. The role offers a salary of £95,000 plus a strong benefits package and flexible working.

This role will suit a technically credible Data Quality leader with a genuine passion for data quality, accuracy and trust, and a commercial mindset. You will work closely with data engineers and platform teams to embed pragmatic governance and quality controls into delivery, while influencing stakeholders across the business.

This is a hands-on technical leadership role, combining data quality and governance ownership with practical engineering input. You will lead a small team and partner with data engineers and operational SMEs to embed best practice across data quality, governance and data management.

Role remit
Experience required

If you are an experienced Data Quality Manager with the required background, please respond with an up-to-date CV for review.
We’re looking for an experienced Analytics Engineer to help shape our data foundations and deliver high-quality, trusted insights across the business. You’ll design and maintain scalable analytics models, ensure data quality, and work closely with teams across Finance, Commercial, Operations, and IT to turn complex requirements into intuitive, reliable datasets.
This role sits at the heart of our analytics function, balancing immediate reporting needs with long-term data architecture, and ensuring our organisation can make confident, data-driven decisions.
Key Responsibilities
What We Are Looking For
We’re seeking someone with strong analytical and technical capability, who enjoys solving complex data problems and can confidently partner with stakeholders across the business. You’ll bring the ability to design robust data models, communicate clearly, and ensure outputs are accurate, consistent, and actionable.
Why Join Us
We understand there’s no one-size-fits-all approach. We’re proud to offer an inclusive workplace where every colleague feels valued, supported, and empowered to be their true self. If you require any reasonable adjustments throughout the recruitment process, please let us know and we’ll be happy to accommodate.
Everyone who applies will receive a response.
Contract Details
Role Title
Incorta Developer/Data Engineer
Overview
We are looking for an experienced Incorta Developer/Data Engineer to support reporting and analytics delivery for an organisation using Oracle E-Business Suite.
The role will focus on extracting, modelling, and visualising data within Incorta, enabling real-time insights across finance, HR, and operational datasets.
Key Responsibilities
Core Skills & Experience
Essential
Desirable
Tech Environment
We're not just your average health company; we're aiming to revolutionise access to healthcare in the UK by offering innovative health and wellbeing solutions that are affordable, accessible, and effective. From preventive care to comprehensive medical support, we aim to empower individuals to take charge of their health, inspiring them to make the most of their wellbeing. Added to that, we're the first health insurer in the UK to be awarded B-Corp status in recognition of our significant achievements in sustainability, in addition to our ambitious environmental and social responsibility goals.
As a Salesforce Data Engineer, you'll be at the heart of our data-driven transformation. You will help enhance our Salesforce ecosystem by ensuring high-quality data flows into and across the platform, enabling richer insights, personalised customer engagement and more intelligent use of Salesforce capabilities, including emerging AI features.
Reporting to our Data Delivery & Governance Manager, you'll work closely with CRM, Product, Tech, and Data colleagues. You'll design and build data integrations, strengthen governance frameworks, and support the development of audience and segmentation capabilities that underpin personalised communications at scale.
Key responsibilities:
Tasked with the integration of data into the Salesforce platform, including but not limited to Marketing Cloud, in accordance with the broader data strategy of the organisation.
Responsible for the high-quality standard of data imported into the Salesforce platform.
Charged with the implementation of robust governance measures for data within the Salesforce platform.
Committed to collaborating with business stakeholders to ascertain their data needs specific to Salesforce.
Engaged in partnership with the Data Architect to contribute to the development of robust architectural designs and principles.
Utilising insights, behavioural and personal data to create and maintain prospect and customer audiences, to facilitate the delivery of personalised comms and messaging.
Identifying, scoping, and working in partnership with CRM, Product and Tech colleagues to prioritise and deploy MarTech and data enhancements.
Evolving our use and adoption of the Salesforce Platform including new channels and AI capability, together with broader marketing technology and data maturity, to enable personalisation at scale.
Providing support, mentoring and advice to colleagues in other areas of the business on best practice in the Marketing Cloud platform.
Maintain, build and own native Salesforce business processes
Perform simple and complex daily administration tasks such as data manipulation, data loading, merging of duplicate records, managing custom fields, objects, layouts, list views, security configuration, complex workflows/Process/Flows, and overall system configuration
Proactively identifying opportunities where marketing data and/or technology can improve our CRM communications, reduce manual tasks, and improve the customer experience
BI / Data Warehouse Lead / Manager - Home-Based / Prestigious Client

Prestigious in their sector, my client has an excellent market image and continues to make a significant impact. They are heavily involved in an ambitious business/systems transformation, and we have an excellent opportunity for an established Data Warehouse Lead to join their team. The successful candidate will play a lead role in the technical development and management of their global Data Warehouse and supporting team, and will have a proven background and experience in full-cycle development of a corporate, Microsoft-based Data Warehouse.

Desired skills: Design, Build, Test, Azure, Fabric Lakehouse

We are looking to recruit a high-calibre resource, so the client is happy to consider candidates from both contract and permanent backgrounds. Please forward your most recent CV to be considered for telephone screening.

SQL / BI / ETL / DATAWAREHOUSE / AZURE / FABRIC / SPARK / PYSPARK / LEAD / MANAGER / HOME / REMOTE / MIDLANDS / BIRMINGHAM
About The Role
This is a brand-new role with a big remit and even bigger opportunity.
As our AI, Automation and Integration Engineer, you'll be at the forefront of how Salvation Army Homes uses technology to work smarter, faster and more effectively. This isn't about maintaining the status quo - it's about designing the future.
You'll lead the charge on automation, system integration and the responsible adoption of AI, helping us move towards streamlined processes, joined-up data, and genuinely intelligent digital services. From low-code automation and cloud integration to exploring practical AI use cases, you'll have the space and backing to experiment, innovate and deliver real impact.
Working within our Digital, Data and ICT team, you'll collaborate closely with colleagues across the organisation to turn ideas into working solutions. You'll help create a single version of the truth across our systems, reduce duplication and manual effort, and enable better decision-making through clean, connected data.
Because this role is new, you'll play a key part in shaping how it operates - setting standards, defining approaches, and influencing how we use emerging technologies across the organisation. If you're excited by greenfield work, modern platforms and meaningful outcomes, this is a rare chance to make a role your own.
About The Candidate
You're a hands-on technologist with a strong track record of delivering automation, integration and modern digital solutions in real-world environments - and you're ready to step into a role where you can shape both the technology and the approach.
You'll bring proven experience of designing and delivering process automation, ideally using the Microsoft Power Platform (Power Automate, Power Apps and Power BI), alongside experience building and supporting integrations between business-critical systems. You're comfortable working across data, workflows and APIs to reduce manual effort and create seamless, joined-up services.
You'll have a strong technical foundation, including:
You don't just build solutions - you think about how they're used, governed and scaled. You understand the importance of security, compliance and responsible data use, and you can balance innovation with control. Experience working in complex or multi-system environments is important, as is the ability to document, standardise and improve what you deliver.
You're also excited by what's next. You may already have experience applying AI concepts in a business context, or you may be keen to develop this further - but either way, you're motivated to explore how AI and emerging technologies can be applied practically, ethically and at scale to improve services and decision-making.
Just as importantly, you're a strong collaborator and communicator. You can translate complex technical ideas into plain English, influence stakeholders at all levels, and work closely with analysts, data specialists and business teams to turn ideas into delivered outcomes. You're organised, proactive, and comfortable managing multiple priorities in a fast-moving, evolving environment.
Above all, you're motivated by the opportunity to build something new, take ownership of a greenfield role, and play a leading part in an organisation's journey towards automation, integration and AI-enabled services - while staying aligned with strong values and a clear social purpose.
The benefits on offer
In return for helping to transform lives, we'll give you access to some great benefits. These include:
About The Company
A registered social landlord and one of the leading providers of supported housing in the UK, Salvation Army Homes is dedicated to transforming lives by providing accommodation and support for some of the most vulnerable members of society - mainly people with complex needs and/or experiencing homelessness.
Our aim is to work with individuals to build on their strengths, creating person-centred, individualised strategies and plans that transform lives, support recovery and enable positive behaviour. In order to succeed, however, we need the right people in place. Our workforce is one of our greatest assets, but only by recruiting the very best can we continue to deliver comprehensive, good-quality housing services, support and resettlement services to our residents. That's where you come in.
As an equal opportunities employer, Salvation Army Homes is committed to the equal treatment of all current and prospective employees and does not condone discrimination on the basis of age, disability, sex, sexual orientation, pregnancy and maternity, race or ethnicity, religion or belief, gender identity, or marriage and civil partnership. We invite and welcome applications to apply for Salvation Army Homes opportunities without concern of bias or discrimination.
We reserve the right to close this vacancy early if we receive sufficient applications for the role. Therefore, if you are interested, please submit your application as early as possible.
We are looking for a hands-on Lead Data Platform Engineer to join a high-impact programme delivering scalable, real-time data solutions. This role offers the opportunity to work on complex, mission-critical systems using modern technologies across streaming, cloud, and distributed data platforms.
The Role
As a Lead Data Platform Engineer, you will design and deliver batch and real-time data pipelines that are highly available, low-latency, and scalable.
You will collaborate with cross-functional teams to shape solutions, influence architecture, and ensure data is transformed into reliable, consumable outputs.
Key Responsibilities
Required Experience
Desirable Skills
If you are interested, please apply or get in touch to discuss further.
We’re looking for a Machine Learning Engineer to design, build, and deploy data-driven models that solve complex problems and enhance real-world performance. You’ll work across the full ML life cycle, collaborating with data, engineering, and product teams to turn ideas into robust, scalable solutions.
Responsibilities:
Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.
If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.
SQL Developer - Contract - £250 per day - Outside of IR35
I am currently looking for a strong SQL Developer. My London based client is looking to get someone started immediately.
As a SQL Server/ETL Developer you will have strong SQL scripting skills and the ability to write stored procedures from scratch. The client is in the process of uploading/importing over 400 files on a daily basis with no delays.
Location: Remote
Length: 6 months with strong view to extend
Day Rate: £250 per day, dependent on experience
IR35 Status: Outside of IR35.
Required experience will include:
If you are interested in this SQL Server/ ETL Developer role please apply with your most recent CV. Alternatively email me on Jordan co . uk.
SQL Developer - Contract - £250 per day - Outside of IR35
Randstad Technologies is acting as an Employment Business in relation to this vacancy.
We’re hiring a Data Engineer / Consultant Data Engineer to design and deliver scalable data pipelines and big data solutions for clients.
This is a client-facing role, combining hands-on engineering with stakeholder engagement and solution design.
Key Responsibilities
Skills & Experience
Nice to Have
Lead Power BI Developer (12-month Fixed-Term Contract)
Location: Remote
Overview
We are seeking a Lead Power BI Developer to deliver high-quality, scalable reporting solutions within a project-focused environment. This role combines hands-on development, leadership, and strong stakeholder engagement, including interaction with senior leadership and C-suite.
A key focus of this role is to establish and embed Power BI best practices across the organisation, coaching and enabling existing staff to improve capability, consistency, and long-term sustainability of BI solutions.
Key Responsibilities
Lead the design and delivery of Power BI dashboards, reports, and data models
Translate business requirements into scalable, user-friendly BI solutions
Drive best practice in data modelling, DAX optimisation, and report design
Own Power BI Service (workspaces, apps, deployment, governance)
Implement and manage access control (RLS, Azure AD security)
Establish governance, standards, and lifecycle management processes
Collaborate with data engineers on pipelines and data model design
Lead stakeholder engagement, including presenting to senior leadership and C-suite
Support testing, deployment, documentation, and user adoption
Mentor team members and provide technical leadership
Embed Power BI best practices across the organisation through coaching and enablement
Essential Skills & Experience
Proven experience delivering end-to-end Power BI solutions in a lead capacity
Strong expertise in Power BI Service, governance, and deployment
Deep understanding of access control (RLS, Azure AD groups)
Advanced DAX and strong SQL skills
Strong data modelling experience (star schema, performance optimisation)
Experience with Azure-based platforms and Databricks
Strong stakeholder management, including C-suite engagement
Excellent communication skills across technical and non-technical audiences
Desirable Experience
Databricks experience
Data warehousing and architecture design exposure
Experience building large-scale enterprise semantic models
Microsoft Dynamics Business Central or Navision
DevOps practices
If you are interested in the role, please apply or get in touch.
Are you looking for a role you can make your own, with the autonomy to decide what you do and how you do it?
As we have grown, we have accumulated a diverse database infrastructure, including PostgreSQL, MariaDB, InfluxDB, and MongoDB systems.
The time is ripe for an experienced Administrator to take ownership: to mature, upgrade, and manage our database servers while supporting development teams and business operations.
Key Accountability & Responsibilities
Work with teams across the Technology department to install, configure, maintain, and upgrade our database servers across development, testing, and production environments.
Monitor database health and perform routine maintenance tasks including index optimisation, table maintenance, and schema modifications.
Work with Development and Data Engineering teams to optimise the performance and cost of data pipelines.
Manage database capacity planning and storage allocation to ensure adequate resources for current and future needs.
Proactive management of databases, ensuring application and operational performance needs are met.
Implement and maintain high availability solutions including replication, clustering, and failover configurations.
Document database architectures, configurations, procedures and policies.
Implement appropriate security controls to ensure databases and data are protected.
Ensure database backup, validation and disaster recovery capabilities are in place and rehearsed.
Provide insight and recommendation on the adoption and consolidation of database related technologies.
Be part of the 24x7 on-call rota to provide out-of-hours support for our critical systems.
Knowledge & Skills
Proven expertise managing PostgreSQL and MariaDB/MySQL databases
Experience with NoSQL databases.
Experience with cloud database services (AWS).
Deep understanding of relational database concepts, normalisation, and SQL optimisation.
Proficiency in SQL and query optimisation across multiple database platforms.
Experience with database replication, clustering, and high availability configurations.
Familiarity with backup and recovery tools specific to each database platform.
Understanding of database security principles and access control mechanisms.
Experience with monitoring tools and performance analysis techniques.
Knowledge of version control systems (Git) for managing database code and scripts.
Experience with database automation, CI/CD pipelines and tooling.
Gigaclear is a growing Fibre Broadband (FTTP/FTTH) company, developing our fibre-to-the-premises broadband infrastructure in some of the most difficult-to-reach areas of the UK, empowering those communities with broadband to rival any city.
Staff rewards, benefits and opportunities
We foster a collaborative, engaging culture that empowers staff to grow and maximise their skills. We want to challenge our people in a fair environment where hard work is rewarded and a path for progression is open to all.
Our approach is to work guided by our mission, vision and values.
Our Mission - Empowering communities with brilliant broadband
Our Vision - Connected Communities
Our Values - Own it, Find the Right Way, Work Together, Win Together
SAS Data Engineer - MUST HAVE SC CLEARANCE - Remote and Telford or Hove - 6 months+ / RATE: £459 per day inside IR35
One of our Blue Chip Clients is urgently looking for a SAS Data Engineer.
Hybrid role: requires attendance for occasional workshops (typically a couple of days per month) at one of our sites - Telford or Hove
Please find some details below:
Clearance Required: Active SC with a governing body
SAS Data Engineer to support Live service within CONNECT ACE. CONNECT is a strategic risking tool that cross-matches one and a half billion internal and third-party data items, enabling the customer to capture up to £25 million per day in recovered tax revenue.
SC clearance is required for this role. We are looking for an experienced engineer with SAS, Oracle SQL, and Unix skills to join the Blue ACE team, working in a fast-paced environment to support Live Services in resolving incidents and problems, and to support development work on projects.
Must have
SAS 9.4 Programming skills
Unix/Linux Skills
Excellent interpersonal skills
Good planning and scheduling capabilities
SC Clearance
Good people management skills
Good understanding of delivery
Team Player
Customer facing skills
Resilience
Agile (Scrum and Kanban)
SAS Viya programming and GitLab knowledge would be advantageous.
Please send CV for full details and immediate interviews. We are a preferred supplier to the client.
Your new company
Working for a renowned financial services organisation.
Your new role
Seeking a Data Engineer to help design and maintain scalable batch and near-real-time ingestion pipelines, modernising legacy ETL/ELT processes into Azure and Snowflake, and implementing best-practice patterns such as CDC, incremental loading, schema evolution, and automated ingestion frameworks. The team builds cloud-native solutions using Azure Data Factory/Synapse, Databricks/Spark, ADLS Gen2, and Snowflake capabilities including stages, file formats, COPY INTO, and Streams/Tasks to support raw-to-curated data modelling. The role involves creating reusable components and Python libraries to accelerate delivery across teams, enforcing data quality through validation, observability, and robust pipeline design, and ensuring strong security, governance, and documentation standards. Collaboration within agile workflows, including CI/CD, code reviews, and iterative planning, is also key to delivering consistent, reliable, and secure data solutions.
What you'll need to succeed
Strong hands-on data engineering experience, with a strong focus on data ingestion
Experience building production pipelines using Azure Data Factory, Databricks, and Synapse
Solid SQL skills and experience with modern cloud data warehouses, ideally Snowflake
Proficiency in Python for data processing, automation, and pipeline utilities
Good understanding of data lake/lakehouse concepts and ingestion patterns
Infrastructure-as-code exposure (Terraform) and CI/CD (Azure DevOps)
Able to prototype quickly while adhering to Group standards and controls
Communicates clearly with business stakeholders and technical teams
Familiarity with orchestration frameworks (Dagster) - desirable
Energy commodity trading experience is a real advantage
What you'll get in return
Flexible working options available.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and as an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed).