Job Ad: Analytics Engineer (6-Month Initial Contract)
Role: Analytics Engineer
Location: Hybrid London
Contract Length: 6-Month Initial Contract (likely extension)
Client: A large insurance company
Industry Preference: Financial Services or Insurance experience strongly preferred
About the Role
We are supporting a large insurance company in hiring a skilled Analytics Engineer for a 6-month initial contract. You'll join a data function operating with modern engineering practices, helping to build robust, well-tested, scalable analytics models that power key decision-making across the business.
What You'll Be Doing
What We're Looking For
Industry Preference
Because of the nature and regulatory environment of the client, we are especially interested in candidates from:
Such backgrounds tend to align strongly with the systems, data models, and governance structures used by the client.
Nice to Have
Why Join Us?
Job Title: Data Analytics Engineer
Location: London, UK
Job Type: Contract
Trading as TEKsystems. Allegis Group Limited, Maxis 2, Western Road, Bracknell, RG12 1RT, United Kingdom. No. 2876353. Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as “Allegis Group”). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands. If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice available at https://www.allegisgroup.com/en-gb/privacy-notices.
To access our Online Privacy Notice, which explains what information we may collect, use, share, and store about you, and describes your rights and choices about this, please go to https://www.allegisgroup.com/en-gb/privacy-notices.
We are part of a global network of companies and as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland and European Economic Area subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland and the USA. If you would like to exercise your privacy rights, please visit the “Contacting Us” section of our Online Privacy Notice at https://www.allegisgroup.com/en-gb/privacy-notices for details on how to contact us. To protect your privacy and security, we may take steps to verify your identity, such as a password and user ID if there is an account associated with your request, or identifying information such as your address or date of birth, before proceeding with your request. If you are resident in the UK, EEA or Switzerland, we will process any access request you make in accordance with our commitments under the UK Data Protection Act, EU-U.S. Privacy Shield or the Swiss-U.S. Privacy Shield.
Location: London / Hybrid
Salary: Competitive
The Role
We are looking for a Python Developer with strong GIS experience to join a growing engineering team building geospatial data platforms used by global organisations.
This role sits within the Client Solutions team, responsible for developing tools and data pipelines that deliver risk analytics and intelligence in a geospatial context.
You'll work in a modern engineering environment using Python, Django, AWS and geospatial data tooling, building scalable systems that process large volumes of spatial data and power mapping solutions used by clients worldwide.
This is a strong opportunity for an engineer who enjoys building real data platforms rather than scripts, and wants to work on meaningful problems around global risk, sustainability and data intelligence.
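As a purely illustrative sketch of the kind of geospatial processing described above (assuming GeoPandas/Shapely; the asset points, risk zone, and column names are invented, not the client's data):

```python
# Illustrative only: a tiny spatial-join operation of the kind a geospatial
# risk platform might perform. All data here is made up.
import geopandas as gpd
from shapely.geometry import Point, Polygon

# Hypothetical asset locations and a single risk zone
assets = gpd.GeoDataFrame(
    {"asset_id": ["A1", "A2"]},
    geometry=[Point(-0.12, 51.50), Point(-2.58, 51.45)],
    crs="EPSG:4326",
)
risk_zone = gpd.GeoDataFrame(
    {"zone": ["flood"]},
    geometry=[Polygon([(-0.2, 51.4), (-0.2, 51.6), (0.0, 51.6), (0.0, 51.4)])],
    crs="EPSG:4326",
)

# Spatial join: which assets fall inside the risk zone?
exposed = gpd.sjoin(assets, risk_zone, predicate="within")
print(exposed[["asset_id", "zone"]])
```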
Responsibilities
Required Skills
Nice to Have
What You’ll Work With
Why Join
How to Apply
If you're an Applied ML contractor looking for an exciting new opportunity focused on real operational decision systems, get in touch.
Please apply if interested and we'll aim to respond within 24 hours.
DV Data Scientist
Location: RAF Wyton, Huntingdon (4-5 days onsite)
Clearance: Active DV Essential
Contract: 12 months Inside IR35 (extension likely up to 3 years) or Permanent role
Rate: Negotiable
Programme Overview
Arqtech Search is supporting a defence delivery partner on a secure data transformation programme within an RAF operational environment. The programme focuses on designing, sustaining and evolving secure data platforms that underpin intelligence, surveillance and operational decision-support capabilities. The Data Scientist will play a critical role in delivering resilient, maintainable, and scalable AI solutions.
Key Responsibilities
Required Skills
Due to the nature of this work, active DV clearance is essential, and 5 days per week are required on-site at RAF Wyton on a 9-day condensed working week (every other Friday off). We cannot consider candidates who do not currently hold an active DV clearance.
Description
Role Summary
The Principal Engineer is the senior technical authority responsible for setting the engineering direction, ensuring platform reliability, and driving innovation across a critical enterprise technology domain. This role provides deep technical leadership, shapes long-term strategy, and ensures that engineering teams deliver secure, scalable, and resilient services that underpin the organisation's digital ecosystem. The Principal Engineer acts as the highest-level hands-on expert, partnering with architects, product owners, and engineering squads to define standards, modernise platforms, and embed automation and DevOps practices across all services.
Technical Leadership
Serve as the domain's foremost technical expert, accountable for engineering excellence and long-term platform strategy. Define and maintain technical standards, patterns, and best practices across the domain. Lead complex design decisions, ensuring solutions are secure, scalable, and aligned with enterprise architecture. Drive adoption of automation, DevOps tooling, and platform-as-a-product principles across all engineering teams.
Platform Ownership
Oversee the health, performance, and lifecycle of core platforms within the domain. Ensure capacity planning, resilience, backup, recovery, and disaster readiness are embedded into all services. Champion observability, monitoring, and proactive incident prevention.
Innovation & Modernisation
Identify opportunities to modernise legacy platforms, reduce technical debt, and introduce new technologies. Evaluate emerging tools, frameworks, and architectures relevant to the domain. Lead proofs of concept and guide engineering teams through adoption.
Collaboration & Influence
Partner with cross-domain Principal Engineers to ensure cohesive enterprise-wide engineering standards. Work closely with product, security, architecture, and operations teams to deliver integrated solutions. Mentor senior engineers and uplift engineering capability across the organisation.
Governance & Risk Management
Ensure compliance with security, regulatory, and operational standards. Provide technical oversight for major changes, upgrades, and transformation initiatives. Act as an escalation point for critical incidents and complex technical challenges.
Job Description: Database Engineering (Oracle, ExaCC, PostgreSQL, MongoDB)
Role Summary
We're looking for a Principal Database Engineer with deep expertise across enterprise-grade and open-source database platforms, including Oracle, Exadata Cloud@Customer (ExaCC), PostgreSQL, and MongoDB. You'll design, operate, and optimise mission-critical data services that support high-volume, high-availability applications. This role blends hands-on engineering with architectural thinking, ensuring our data platforms are secure, scalable, and performant.
Key Responsibilities
Oracle & ExaCC Engineering
Administer and optimise Oracle databases running on Exadata and Exadata Cloud@Customer platforms. Perform capacity planning, performance tuning, and storage optimisation for ExaCC environments. Manage Oracle RAC, Data Guard, RMAN, and advanced Oracle features for high availability and disaster recovery. Support patching, upgrades, and lifecycle management across Oracle estates.
PostgreSQL Administration
Deploy, configure, and maintain PostgreSQL clusters in production environments. Implement replication, backup strategies, and failover mechanisms. Optimise query performance, indexing strategies, and storage utilisation. Develop automation for provisioning, monitoring, and patching PostgreSQL instances (a minimal monitoring sketch appears below).
MongoDB Operations
Manage MongoDB clusters, including sharding, replication, and scaling strategies. Monitor and tune performance for document-based workloads. Implement backup, restore, and disaster recovery processes for MongoDB environments. Ensure data integrity, schema design best practices, and operational resilience.
Cross-Platform Database Engineering
Build automation for database provisioning, configuration, and compliance using IaC and scripting. Implement robust monitoring and observability for all database platforms. Collaborate with application teams to design schemas, optimise queries, and troubleshoot performance issues. Ensure security hardening, auditing, and adherence to data governance standards. Participate in on-call rotations supporting critical database services.
Skills & Experience
Essential
Strong hands-on experience with Oracle Database and Exadata/ExaCC platforms. Solid PostgreSQL administration skills in production environments. Practical experience managing MongoDB clusters at scale. Proficiency in SQL, PL/SQL, and scripting languages (Python, Bash, PowerShell). Understanding of replication, clustering, high availability, and disaster recovery patterns. Familiarity with database performance tuning and query optimisation. Experience with automation tools (Terraform, Ansible, Liquibase, Flyway, etc.).
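As a minimal, purely illustrative sketch of the kind of PostgreSQL monitoring automation mentioned above (connection details are placeholders and the psycopg2 driver is assumed; this is not the client's tooling):

```python
# Illustrative only: a small PostgreSQL health check that could feed a
# monitoring/automation pipeline. Host, database and credentials are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="pg-primary.example.internal", dbname="appdb",
    user="monitor", password="changeme",  # placeholder credentials
)
try:
    with conn.cursor() as cur:
        # Is this node a replica currently in recovery?
        cur.execute("SELECT pg_is_in_recovery();")
        in_recovery, = cur.fetchone()

        # Current connection count vs. the configured limit
        cur.execute("SELECT count(*) FROM pg_stat_activity;")
        active, = cur.fetchone()
        cur.execute("SHOW max_connections;")
        max_conn, = cur.fetchone()

    print(f"in_recovery={in_recovery} connections={active}/{max_conn}")
finally:
    conn.close()
```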
Skills
Job Title: Principal Engineer - Databases
Location: Manchester, UK
Rate/Salary: 500.00 - 600.00 GBP Daily
Job Type: Contract
Technical Data Engineer / Analyst | Contract | Azure | 6 months | Hybrid | Outside IR35 | £425 - £475 | Nottingham
We're supporting a major data transformation programme and looking for a Technical Data Analyst to help migrate legacy systems into a modern Azure data platform. This is a hybrid role with 3 days a week in the Nottingham office, which is non-negotiable. This is an ideal role for someone who is looking for hands-on experience with Azure analytics tools and thrives in a data-heavy environment.
What you'll be doing:
Analysing, understanding, and documenting legacy data structures
Supporting data model mapping from old systems into the new Azure platform
Validating migrated datasets for accuracy and completeness
Creating reporting assets and dashboards using Power BI
Working with engineers to identify data issues, test pipeline outputs, and improve overall data quality
Key Skills
Strong SQL skills and experience working with Azure‑hosted datasets
Power BI reporting/dashboard creation
Python experience
Exposure to Azure Data Lake, Databricks, or Data Factory is beneficial
If this is a role that suits your skill set, you are immediately available and can work 3 days a week in the office near Nottingham, then please apply for the attached job or send your CV to (url removed).
Technical Data Engineer / Analyst | Contract | Azure | 6 months | Hybrid | Outside IR35 | £425 - £475 | Nottingham
Mid-Level Engineer - Technology & Innovation
London, UK | Hybrid 3 days a week in office.
We’re hiring a Mid-Level Engineer to join a growing Technology & Innovation team within one of the UK and Ireland’s leading Corporate Finance advisory firms.
Reporting to the CTO, you’ll help deliver and manage our cloud-agnostic data & AI platform, scale client digital products, and work across cutting-edge technologies including AI, cloud platforms, and data engineering. This role sits at the heart of our strategy to unlock value from data across real assets, sustainability, and essential services.
What you’ll do
Build and enhance our cloud platforms and data infrastructure
Deploy AI/ML and GenAI solutions into production
Drive automation and operational efficiency using AI and RPA
Support development of client-facing digital platforms and dashboards
Modernise systems into cloud-native, API-first architectures
Key skills
Cloud engineering (Azure, Kubernetes, serverless, DevOps)
Python and relational databases (SQL)
Data platforms, ETL/ELT, and analytics
AI tools and LLM integrations (e.g. GPT, Claude, Gemini)
Security, governance, and regulated environment experience
Why this team?
Work in a high-performing, cooperative team
Exposure to advanced AI and data technologies
Opportunity to shape technology strategy in corporate finance
A values-driven B-Corp organisation focused on Finance with a Purpose
If you’re an ambitious technologist who enjoys solving complex problems and building impactful platforms, please apply below.
*This role does not offer Visa sponsorship*
McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
One of our Blue Chip Clients is urgently looking for a Data Management Support - (Aircraft Maintenance CAMO Data Support). This role is fully remote, but candidates need to be within a 45-minute commute of Brize Norton, as per the client's request, due to the on-call element of the role.
CONTRACTOR MUST BE ELIGIBLE FOR BPSS
MUST BE PAYE THROUGH UMBRELLA
- Aerospace manufacturing
- Manufacturing process capture and document creation
- Technical areas:
Aerospace Primary Structure Composite (carbon fibre)
Composite wing manufacture
Composite primary structure - curing/co-bonding
Inspection methods
Manufacturing methods
Metallic treatments
Inspection methods
Mechanical assembly and drilling (metallic/composite)
Assembly
Machining
Structural adhesives
Paints
Sealants
Shift Pattern
Early shift: 07:00 to 15:00
Central shift: 09:00 to 17:00
Late shift: 12:30 to 20:00
Primary shift: 20:00 to 07:00. Only need to go to the office if an A/C lands.
Weekend shift: Currently from 07:00 Saturday morning to 07:00 Monday morning
Please send CV for full details and immediate interviews. We are a preferred supplier to the client.
Job Title: RPA & Data Automation Developer
Contract Type: 6 month contract
Inside IR35 - £550-£700 per day (umbrella rate)
Location: Hybrid working - Surrey
Are you ready to take your career to the next level in the world of Robotic Process Automation (RPA) and Data Automation? We’re on the lookout for a passionate RPA & Data Automation Developer to drive efficiency, improve data quality and deliver actionable insights. If you thrive in a collaborative environment and enjoy working with cutting-edge technologies, we want to hear from you!
Key Responsibilities:
What We’re Looking For:
If you’re excited about the possibility of making a difference through RPA and Data Automation, we’d love to hear from you! Apply now and become a vital part of our mission to enhance public services through data-driven insights.
Let’s innovate together!
Adecco is a disability-confident employer. It is important to us that we run an inclusive and accessible recruitment process to support candidates of all backgrounds and all abilities to apply. Adecco is committed to building a supportive environment for you to explore the next steps in your career. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.
Exalto Consulting is currently recruiting for a contract Data Scientist. The role is inside IR35, paying £500 per day, and requires on-site working three days a week in Hounslow.
Skills/capabilities
• Strong knowledge of machine learning and optimisation techniques, including supervised learning (regression, tree methods, etc.), unsupervised learning (clustering), and operations research (linear and mixed-integer programming, heuristics)
• Fluent in Python (required) and other programming languages (preferred), with strong skills in applying DS, ML, and OR packages (scikit-learn, pandas, numpy, gurobi, etc.) to solve real-life problems and visualise the outcomes (e.g. seaborn); a minimal illustration appears after this list
• Proficient in working with cloud platforms (AWS preferred), code versioning (Git), experiment tracking (e.g. MLflow)
• Experience with cloud-based ML tools (e.g. SageMaker), data and model versioning (e.g. DVC), CI/CD (e.g. GitHub Actions), workflow orchestration (e.g. Airflow/Dagster) and containerised solutions (e.g. Docker, ECS) is nice to have
• Experience in code testing (unit, integration, end-to-end tests)
• Strong data engineering skills in SQL and Python
• Proficient in the use of Microsoft Office, including advanced Excel and PowerPoint skills
• Advanced analytical skills, including the ability to apply a range of data science and analytic techniques to quickly generate accurate business insights
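As a purely illustrative sketch of the kind of supervised and unsupervised modelling listed above (the dataset, column names, and model choices are hypothetical, not the client's actual stack):

```python
# Illustrative only: a minimal supervised + unsupervised workflow with scikit-learn.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "demand_lag_1": rng.normal(100, 20, 500),
    "price": rng.uniform(5, 15, 500),
})
df["demand"] = 0.8 * df["demand_lag_1"] - 3.0 * df["price"] + rng.normal(0, 5, 500)

# Supervised: tree-based regression on held-out data
X_train, X_test, y_train, y_test = train_test_split(
    df[["demand_lag_1", "price"]], df["demand"], random_state=0
)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))

# Unsupervised: simple clustering of the same features
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(df[["demand_lag_1", "price"]])
print("Cluster sizes:", np.bincount(labels))
```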
If you have the above experience and are looking for a new contract role please send your CV for immediate consideration as our client are looking to hire ASAP
contract data scientist - Hounslow x3 days a week - £500 inside IR35
A successful FTSE-listed organisation with a friendly, close-knit culture is looking for a Senior Data Engineer to help shape and deliver its evolving data platform. This is a hands-on, high-impact role in a smaller organisation where you will act as both a senior engineer and strategic data partner, designing solutions that support both operational and long-term business goals.
You will play an important role in the company’s migration to Microsoft Fabric, while continuing to build and optimise its existing SQL Server-based data warehouse environment. While primarily a backend engineering role, you will also support reporting via SSRS/Microsoft Power BI, with an increasing focus on AI-driven capabilities over time. As the Senior Data Engineer, you will work closely with the Solutions Architect and collaborate with teams across the business, with occasional travel to France and Germany. This opportunity suits someone with in-depth SQL experience who is looking to work with modern tools like Microsoft Fabric.
Essential Skills
Strong SQL Server experience (T-SQL, SSIS, SSRS, stored procedures, functions, triggers)
Data warehouse architecture, build and maintenance
Excellent communication skills and ability to gather requirements from non-technical stakeholders at all levels
Degree in Computer Science or a STEM subject
Comfortable working 4 days per week onsite in Vauxhall
Desirable
Microsoft Fabric, Azure Synapse, Azure Data Factory, Snowflake, Databricks
Python or any other relevant language
Experience with tools such as Power BI, Qlik or Tableau
The role offers a package of £64k+, plus a bonus of up to 15%, along with a generous pension, 28 days' holiday, private medical insurance, permanent health insurance, study support, and a modern office with an onsite gym.
If your experience aligns, apply with an up-to-date CV as soon as possible, as this is expected to be a popular opportunity.
Azure Data Engineer | £400 - £500 Outside IR35 | Bristol | Hybrid | 6‑Month Initial Term |
A large‑scale data transformation programme is underway, and our client is looking for an experienced Azure Data Engineer to support the rebuild of their cloud data platform. This role is hands‑on and delivery‑focused — you’ll be designing and developing Azure‑native data pipelines, working extensively with Databricks, and shaping scalable data models across the Microsoft ecosystem. The role would require you to be on site in Bristol 4 days per week, please only apply for this position if you are local enough to do this without relocating.
What you’ll be doing
Build, enhance and maintain data pipelines using Azure Databricks, Data Factory, and Delta Lake
Develop and optimise Lakehouse components and cloud‑based data flows
Create robust data models to support analytics, MI and downstream reporting
Assist in migrating legacy warehouse assets into a modern Azure environment
Contribute to cloud architecture decisions, data standards and best‑practice engineering patterns
Develop reliable Python and PySpark code to support data ingestion, transformation, and end‑to‑end processing (an illustrative sketch follows this list)
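A minimal sketch of what such a pipeline step can look like, assuming Azure Databricks with Delta Lake; the storage path, table, and column names are invented for illustration only:

```python
# Illustrative only: a bronze-to-silver step in a medallion-style pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw landing data ingested as-is (hypothetical path)
raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/policies/")

# Silver: cleaned, conformed records
silver = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("effective_date", F.to_date("effective_date"))
       .filter(F.col("premium").isNotNull())
)

# Write as a Delta table for downstream gold/star-schema models
silver.write.format("delta").mode("overwrite").saveAsTable("silver.policies")
```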
What you’ll bring
Strong hands‑on experience across Azure Data Services (ADF, ADLS, Synapse, Databricks)
Excellent SQL skills, with experience in performance tuning and optimisation
Solid understanding of data modelling (star schema, medallion, ETL frameworks)
Ability to work with complex, inconsistent or legacy data sources
Experience building scalable, production‑ready pipelines in a cloud environment
Azure Data Engineer | £400 - £500 Outside IR35 | Bristol | Hybrid | 6‑Month Initial Term
PySpark Engineer Lead
As the Technical Lead, you will drive the high-stakes migration of legacy SAS analytics to a modern, cloud-native PySpark ecosystem on AWS. This isn't just a lift and shift: you will refactor complex procedural logic into scalable, production-ready distributed pipelines for a Tier-1 financial services environment. A minimal illustration of this kind of refactor appears below.
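As a hedged illustration of the kind of refactor involved (not the actual codebase), a SAS-style grouped summary, roughly a PROC MEANS with a CLASS variable, might be re-expressed as a distributed PySpark aggregation; paths and column names here are hypothetical:

```python
# Illustrative only: procedural SAS-style aggregation rewritten as a PySpark job.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sas_refactor_example").getOrCreate()

txns = spark.read.parquet("s3://bucket/landing/transactions/")  # hypothetical path

# Equivalent of: PROC MEANS DATA=txns SUM MEAN N; CLASS account_id; VAR amount;
summary = (
    txns.groupBy("account_id")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.avg("amount").alias("avg_amount"),
            F.count("*").alias("txn_count"),
        )
)
summary.write.mode("overwrite").parquet("s3://bucket/curated/account_summary/")
```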
Core Responsibilities
Technical Stack
Randstad Technologies is acting as an Employment Business in relation to this vacancy.
Your new company
One of the most influential Central Government Organisations in the current economic climate
Your new role
Data Scientist - SC Cleared - SQL, Python & R
What you’ll need to succeed
My client is looking for an Analytical Data Scientist, leading/working alongside a team of data scientists to deliver key outputs for commissioned projects (use cases).
You will also support the development of GSCIP through developing tools, data visualisations, and data available for analysis.
You will have the opportunity to work on bespoke data science projects to improve understanding and interpretation of the data, and enhance use case delivery capability.
This role can only be offered to candidates with Active and Existing SC or DV Clearance.
Essential Criteria:
Desirable Criteria:
This is a hybrid role (40%), and Monday is a mandatory team day. Successful candidates will join as soon as possible; interviews are to commence from 16/3/26 onwards.
What you’ll get in return
This is an excellent role to join the GSCI Programme as an experienced Data Scientist, ensuring existing delivery and data standards are maintained and services scaled up!
What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call us now.
If this job isn’t quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C’s, Privacy Policy and Disclaimers which can be found at (url removed)
A Data Engineer is needed for a contract where your work will directly shape how a business trusts, structures, and uses its data. If you enjoy building reliable pipelines, improving models, and turning messy data into dependable assets, this is the kind of project where your impact is felt quickly. This role focuses on practical delivery. You’ll be strengthening the foundations of analytics and reporting by building dependable solutions that teams across the organisation rely on every day.
What’s in it for you
£500 per day contract with immediate impact on a growing environment
Hybrid working with a balanced onsite and remote setup
A delivery-focused project where practical engineering skills are valued
The opportunity to improve and shape core assets used across the business
A collaborative environment working closely with technical teams and stakeholders
Real ownership over the reliability and structure of pipelines and models
What you’ll be getting stuck into as a Data Engineer
Building and maintaining scalable pipelines that support analytics, reporting, and operational data use
Developing and refining warehouse models that align with real business requirements
Writing and optimising SQL for transformation, integration, and performance improvements
Strengthening quality through validation, governance, and structured data workflows
Delivering reliable, accessible datasets for reporting and decision-making
Supporting monitoring, testing, and continuous improvement across data processes
What you’ll bring to the table as a Data Engineer
Strong hands-on experience delivering practical solutions
Strong SQL capability for transformation, modelling, and optimisation
Previous experience designing and working with data warehouse models
Experience building and maintaining production pipelines
Exposure to platforms such as Databricks, Synapse, or Microsoft Fabric
If you're a Data Engineer ready to step into a contract where you can quickly add value by building dependable pipelines and models, apply now to learn more.
Candidate Source Ltd is an advertising agency. Once you have submitted your application it will be passed to the third-party recruiter who is responsible for processing your application. This will include holding and sharing your personal data; our legal basis for this is legitimate interest, subject to your declared interest in a job. Our privacy policy can be found on our website and we can be contacted to confirm who your application has been forwarded to.
Contract Data Engineer - Azure / Databricks
Location: London (2 days onsite)
Rate: £550 - £600 per day (Inside IR35)
Contract: 6 months
A leading UK financial institution is seeking an experienced Data Engineer to support the development and enhancement of a modern cloud-based data platform. This role will focus on building scalable data pipelines and supporting the evolution of a cloud-first data architecture.
Key Responsibilities
Key Skills & Experience
Contract Details
If you’re an experienced Data Engineer with strong Azure and Databricks expertise and are available for a new contract, please apply or get in touch to discuss further.
Our client, a prominent entity in the Defence & Security sector, is seeking a meticulous Data Scientist with a strong understanding of Linux, Data Science, and AWS to join their team. This is a contract position located in London for a duration of 12 months, requiring a UKIC DV clearance to undertake sensitive and impactful work.
Key Responsibilities:
Job Requirements:
Additional Details:
If you are a dedicated Data Scientist with the necessary clearances and skills, and are eager to contribute to mission-critical projects in the realm of national security, we want to hear from you. Apply now to take the next step in your career with our client.
Job Title: TM1 Planning Developer
Location: Remote - Inside IR35.
Start Date: April
Job Type: Contract
We’re looking for a TM1/IBM Planning Analytics Developer to support the development and optimisation of enterprise planning solutions within a complex data environment.
You will be responsible for designing and maintaining TM1 models and cubes, developing business rules and processes, and supporting financial and operational planning workflows. The role involves working closely with finance and business stakeholders to deliver scalable, high-performance planning and reporting solutions.
This is a 3 month initial contract, remote and Inside IR35.
Key requirements:
If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.
Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.
Rate: up to £1000 a day inside IR35
Location: 3 days at London Office
We are working with a leading global financial institution on a senior hire within their real-time market data engineering team. This role is focused on building and operating low-latency, high-performance KDB+ platforms that support mission-critical trading, analytics and monitoring use cases.
What You’ll Be Doing
What We’re Looking For
Extensive hands-on experience with KDB+/q in a production environment
Proven experience designing or operating real-time tick data systems
Strong knowledge of:
Experience building low-latency systems where performance matters
Strong Linux/Unix skills, including debugging running processes
Power BI Developer
Manchester
6 month contract
Inside ir35
Purpose of the Role
The role is responsible for designing high-quality Power BI datasets, developing scalable data models, and delivering insightful dashboards that support critical business processes. The position also leads the establishment of a resilient BI data foundation, integration with ServiceNow, and the uplift of AI-ready reporting capabilities across the organisation.
Key Objectives
Key Responsibilities
Report & Dashboard Development
DAX & Data Transformation
Data Integration & Gen BI
Power BI Service Administration
Stakeholder Engagement & Documentation
Essential Skills & Experience
Desirable Skills
Behaviours & Mindset
If you believe you have the experience required, please apply with your CV now for instant consideration!
TO APPLY - PLEASE APPLY WITH AN UP-TO-DATE CV
Candidates will ideally show evidence of the above in their CV in order to be considered.
Please be advised if you haven’t heard from us within 48 hours then unfortunately your application has not been successful on this occasion, we may however keep your details on file for any suitable future vacancies and contact you accordingly.
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone’s chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.
We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
Role Title: Confluent Engineer
Location: London
Duration: until 27/05/2026
Days on site: 2-3
Rate: £402.75 per day (Inside IR35)
MUST BE THROUGH UMBRELLA
Role Description:
* Role Title: Senior Software Engineer - Confluent Streaming Platform
Role Purpose
* We are seeking a Senior Software Engineer with strong hands-on experience in Confluent Platform, Apache Kafka, and Apache Flink to support the introduction and evolution of Intact’s enterprise streaming capabilities. This role sits within the Integration function, responsible for enabling real-time data, event-driven architecture, and high-performance integrations across the organisation.
* The ideal candidate will contribute to the design, development, and scaling of Intact’s Confluent-based streaming platform, supporting teams across the organisation in adopting event-driven approaches.
* This opportunity sits within a significant cloud-modernisation programme, leveraging Agile and DevOps practices to continuously deliver business value.
Key Accountabilities
* Deliver engineering tasks across the Confluent Streaming Platform, including the design, development, testing, and deployment of event-driven services and data pipelines.
* Develop Kafka topics, schemas, and streaming applications using Kafka, Kafka Connect, Schema Registry, and Flink (an illustrative producer sketch appears after this list).
* Collaborate with architects and platform teams to shape the event streaming roadmap.
* Provide subject-matter expertise in distributed streaming and event-driven architecture.
* Review streaming applications produced by other engineers, ensuring quality and best practices.
* Troubleshoot production streaming issues and conduct root-cause analysis.
* Promote platform standards, governance models, and reusable patterns.
* Collaborate with vendor teams and Confluent professional services.
* Actively participate in Agile ceremonies and technical discussions.
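As a purely illustrative sketch (not Intact's actual code, topics, or schemas), the following shows the shape of a minimal event producer against a Confluent/Kafka cluster, assuming the confluent-kafka Python client; the broker address, topic name, and payload are hypothetical:

```python
# Illustrative only: publish a single JSON event and confirm delivery.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface errors
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

event = {"policy_id": "P-1001", "event_type": "PolicyRenewed", "premium": 420.50}
producer.produce(
    "policy-events",
    key="P-1001",
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()  # block until all queued messages are delivered
```

In practice the value would typically be serialised with Avro or Protobuf against the Schema Registry rather than raw JSON, as the accountabilities above note.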
Customer Conduct Framework
* Understand how FCA Conduct Rules apply to this role and consistently demonstrate behaviours that support positive customer outcomes and safe data handling.
Functional/Technical Skills
* 6+ years of software engineering experience, including 3+ years hands-on with Confluent/Apache Kafka.
* Experience designing and building distributed streaming applications using Confluent Platform components.
* Strong understanding of event-driven architecture concepts (event streaming, event sourcing, Pub/Sub, stream processing).
* Experience with Avro/JSON/Protobuf, schema evolution, and Schema Registry.
* Integration experience with Back End systems such as SQL/NoSQL databases, APIs, and cloud data platforms.
* Familiarity with API design, data modelling, and microservice integration patterns.
* Proficient with Git, Jira, Azure DevOps, Docker, Kubernetes.
* Working knowledge of AWS and/or Azure.
* Strong understanding of clean code, reusable component design, and Agile/DevOps practices.
Decision-Making Authority
* Makes decisions on design and implementation of streaming components within agreed architecture and standards.
* Provides expert guidance impacting platform reliability, performance, and integration quality.
* Escalates risks, issues, and architectural concerns appropriately.
Please send latest CV
LA International is a HMG approved ICT Recruitment and Project Solutions Consultancy, operating globally from the largest single site in the UK as an IT Consultancy or as an Employment Business & Agency depending upon the precise nature of the work, for security cleared jobs or non-clearance vacancies, LA International welcome applications from all sections of the community and from people with diverse experience and backgrounds.
Award Winning LA International, winner of the Recruiter Awards for Excellence, Best IT Recruitment Company, Best Public Sector Recruitment Company and overall Gold Award winner, has now secured the most prestigious business award that any business can receive, The Queens Award for Enterprise: International Trade, for the second consecutive period.
Up to £475/day Outside IR35 | London | 2 days per week in office
We are seeking a highly skilled AI Engineer with deep expertise in Agentic AI, Large Language Models, NLP, GenAI pipelines, cloud ML platforms, and vector-based retrieval systems. This is an opportunity to join an advanced AI team building next-generation intelligent systems, multi-agent applications, and high-scale GenAI microservices. You will design, deploy, and optimise production-grade AI/ML systems powering millions of customer interactions. You will work across Python, cloud-native architectures, vector search, RAG frameworks, orchestration engines, and multi-agent systems, shaping AI capabilities that transform how organisations interact, automate, and understand their customers.
Key Responsibilities
AI / LLM / Agentic Engineering
Design, build, and optimise agentic AI systems using frameworks such as LangChain, LangGraph, Vertex AI Agent Builder, Bedrock Agents, AgentKit, CrewAI, and custom orchestration.
Build LLM-powered applications using models including GPT-4o/5, Llama3, Claude, Gemini 2.5 Pro, Bard, and enterprise-grade LLM deployments.
Implement RAG and CAG architectures using Pinecone, OpenSearch, Google GenAI Search, and custom vector stores (an illustrative retrieval sketch appears at the end of this ad).
Engineer domain-tuned embeddings using ADA-002, Gecko, Word2Vec, BERT, Sentence Encoder, and topic modelling.
AI/ML Pipelines & MLOps
Develop scalable AI/ML microservices using Docker, Kubernetes (EKS/GKE), and CI/CD-driven automation.
Build and enhance pipelines for model evaluation, bias/drift detection, real-time inference, and monitoring.
Optimise inference latency for high-volume, near-real-time applications such as transcript and behavioural analysis.
NLP & Applied Machine Learning
Apply text clustering, N-gram analytics, sentiment modelling, intent classification, and summarisation for insight extraction.
Refine conversational intent taxonomies and behavioural models for more accurate AI assistant interactions.
Data Engineering & Cloud Integration
Use cloud services including SageMaker, Azure ML Studio, and Vertex AI for training, deployment, and monitoring.
Manage datasets using GCP Cloud Storage and implement secure, compliant data workflows.
AI Governance & Quality Assurance
Establish guardrails, safety layers, automated evaluation frameworks, and prompt governance patterns.
Ensure all AI systems meet stringent data governance, privacy, and financial-sector compliance requirements.
Technical Skills
Languages & Development: Python, Java, SQL, Shell Scripting, Node.js, Streamlit
IDE experience: PyCharm, VS Code, JupyterLab, Eclipse, Notepad++, SageMaker Studio, Azure ML Studio, Vertex AI Workbench
Python Libraries: NumPy, Pandas, Matplotlib, Scikit-learn, TensorFlow, Keras, PyTorch, PySpark, SpaCy, SciPy, NLTK, Statsmodels, Boto3, Azure SDK
NLP & LLMs: BERT, Word2Vec, Universal Sentence Encoder, NLTK, embeddings, fuzzy matching, topic modelling
LLM experience: GPT-3.5/4o/5, Llama2/3, Claude, Gemini, Bedrock models, SQuAD fine-tuning, custom RAG agents
AI Search & Vector Innovations: Pinecone, OpenSearch, LangChain/LangGraph, LangIndex, Vertex AI Search, Vector DBs, RAG pipelines
What We're Looking For
Proven experience developing production-grade LLM, GenAI, NLP, or agent-based AI systems.
Strong engineering foundation across Python, cloud platforms, APIs, and vector search.
Experience with complex multi-agent AI orchestration.
Ability to deliver high-scale, low-latency AI solutions in demanding environments.
Strong collaboration, architectural thinking, and a passion for cutting-edge AI innovation.
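Purely as an illustration of the retrieval step in a RAG pipeline (not this client's implementation): a real system would use dense embeddings and a vector store such as Pinecone or OpenSearch, but TF-IDF stands in here so the sketch runs with no external services, and all documents and the query are invented:

```python
# Illustrative only: retrieve the most relevant document and build an LLM prompt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Customers can update their contact details in the self-service portal.",
    "Refunds are processed within five working days of approval.",
    "Premium accounts include priority support and extended reporting.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)  # "index" the corpus

query = "How long do refunds take?"
query_vector = vectorizer.transform([query])

# Rank documents by similarity and keep the best match as retrieval context
scores = cosine_similarity(query_vector, doc_vectors)[0]
context = documents[scores.argmax()]

# The retrieved context would then be passed to an LLM alongside the query
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```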