Role: Data Migration Specialist - Defence Sector
Location: London (On-site)
Start Date: ASAP
Contract: Freelance
Duration: 12 Months+
A leading organisation within the defence sector is seeking a Data Migration & Data Synchronisation Specialist to support a large-scale data transformation initiative. This role requires a contractor experienced in designing and delivering secure data migration frameworks across complex enterprise environments, integrating both on-premise and cloud platforms.
Candidates must be freelance contractors and must either hold Security Clearance or be eligible for it, due to the nature of the programme.
The successful consultant will play a key role in building scalable architectures and automated data pipelines to enable accurate, secure, and high-performance data flows across multiple systems.
Key Responsibilities
Required Technical Expertise
Additional Requirements
Role Details:
Position: Spark/Scala Developer
Location: London, UK (Hybrid)
Work Mode: Hybrid (3 days a week from the office)
Duration: Initial contract of 12 months; long-term project
Rate: Inside IR35, up to GBP 450/day
Visa Note: Open to candidates with British Citizenship, ILR (Indefinite Leave to Remain), or a UK Settlement Visa. We do not provide visa sponsorship.
Key Requirements:
Must-have skills: Spark & Scala
Nice-to-have skills: Spark Streaming, Hadoop, Hive, SQL, Sqoop, Impala
Detailed Job Description:
- At least 8 years' experience and strong knowledge of the Scala programming language; able to write clean, maintainable, and efficient Scala code following best practices.
- Good knowledge of fundamental data structures and their usage.
- At least 8 years' experience designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies, with expertise in Spark Core, Spark SQL, and Spark Streaming.
- Experience with Hadoop, HDFS, Hive, and other big data technologies.
- Familiarity with data warehousing and ETL concepts and techniques.
- Expertise in database concepts and SQL/NoSQL operations.
- UNIX shell scripting is an added advantage for scheduling and running application jobs.
- At least 8 years' experience in project development life cycle activities and maintenance/support projects.
- Comfortable working in an Agile environment, participating in daily stand-ups, sprint planning, reviews, and retrospectives.
- Able to understand project requirements and translate them into technical solutions that meet project quality standards.
- Ability to work in diverse, multi-stakeholder teams and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues.
- Strong problem-solving and analytical skills.
- Excellent verbal and written communication skills.
- Experience of, and desire to work in, a global delivery environment.
- Stays up to date with new technologies and industry trends in development.
Data Engineer - Highly Competitive Salary
About the Role: We're partnering with a leading technology consultancy that helps organisations harness the power of data to modernise platforms and drive business outcomes. As a Data Engineer, you'll be at the forefront of designing and delivering cloud-native solutions on Google Cloud, turning complex datasets into actionable insights. In this role, you'll work on diverse projects, from batch and streaming pipelines to data warehouses, data lakes, and AI-powered analytics platforms. This is a hands-on role where your expertise will guide delivery, shape best practices, and mentor other team members.
Key Responsibilities:
- Lead the design, development, and deployment of scalable data pipelines using BigQuery, Dataflow, Dataproc, and Pub/Sub
- Automate ETL/ELT workflows and orchestrate pipelines with tools such as Cloud Composer
- Contribute to architecture and end-to-end solution design for complex data platforms
- Set engineering standards and ensure high-quality code, deployment, and documentation practices
- Collaborate with clients and internal teams, translating business requirements into practical solutions
- Mentor and coach junior engineers to grow their skills and adopt best practices
What They're Looking For: They're looking for a Data Engineer who can take ownership of complex data solutions while remaining hands-on. You should have:
- Proven experience building production-ready solutions on Google Cloud
- Expertise with batch and streaming frameworks such as Apache Spark or Beam
- A strong understanding of data storage, pipeline patterns, and event-driven architectures
- Experience with CI/CD, version control, automated testing, and Agile delivery
- The ability to communicate clearly with both technical and non-technical stakeholders
- Mentoring or coaching experience
Bonus skills: Kafka, enterprise data platform migrations, RDBMS experience (Postgres, MySQL, Oracle, SQL Server), and exposure to ML pipelines.
Security Eligibility: Candidates must be eligible for UK Security Clearance (SC or DV) if required.
Why This Role? This is a chance to work on high-impact, cloud-native projects as a Data Engineer, taking ownership of technical decisions, shaping delivery practices, and developing your career. You'll join a supportive environment where mentoring and learning are highly valued, and your work will directly contribute to the success of complex data programmes.
OK, I'm in - what's next? Please apply with your latest CV.
Job Title: Lead Data Engineer
Location: Leeds, 2x per week
Salary: Up to £80,000 per annum
Why Apply?
This is an exciting opportunity to work as a Lead Data Engineer delivering scalable, high-quality data solutions for a leading client in the technology sector. The position offers professional growth, challenging projects, and access to cutting-edge cloud data technologies.
Lead Data Engineer Responsibilities:
Lead Data Engineer Requirements:
We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
Senior Data Architect - Data & AI | Remote (occasional travel to Glasgow or Reading) | Salary: £115,000 (package)
Are you a Senior Data Architect who thrives on designing elegant, scalable cloud data solutions? Do you enjoy helping organisations become truly data-guided through modern Data & AI platforms? This is an exciting opportunity to join a growing consultancy environment where technical excellence and collaboration are key.
The Role: We are seeking a Senior Architect to join an expanding Data & AI team. You will work closely with architects and consultants to design and deliver high-quality, cloud-based data solutions using Microsoft Azure technologies. You will be accountable for the technical delivery of projects and will engage across the full project lifecycle, from presales through to operational handover.
What You'll Be Doing
About You: You will have extensive experience designing and delivering modern data platforms, with:
Desirable Experience
Bright Purple is an equal opportunities employer: we are proud to work with clients who share our values of diversity and inclusion in our industry.
Up to £95,000
Hybrid working
Location: Bristol, Gloucester, Cardiff, Corsham or Cheltenham (South West, United Kingdom) Type: Permanent
Principal GCP Data Engineer
Join an award-winning innovation and transformation consultancy recognised for its cutting-edge work in data engineering, cloud solutions, and enterprise transformation. This organisation is known for bringing ingenuity to life, helping clients turn complexity into opportunity, and fostering a culture where technical specialists thrive and grow.
An opportunity has arisen for a Principal GCP Data Engineer to join the London-based data and analytics practice. This Principal GCP Data Engineer role offers the chance to lead the design and delivery of end-to-end data solutions on Google Cloud Platform for high-profile clients, shaping data strategy and driving technical excellence across complex programmes.
With a reputation for combining breakthrough technologies with pragmatic delivery, the organisation empowers senior data engineers to influence architecture, mentor teams, and deliver production-ready solutions that create lasting impact.
The Role - Principal GCP Data Engineer
The Principal GCP Data Engineer is a senior technical role responsible for leading data engineering solutions, guiding teams, and acting as a subject matter expert in Google Cloud Platform. As a Principal GCP Data Engineer, you will define end-to-end solution architectures, implement best practices, and lead the development of robust, scalable data pipelines.
This role combines hands-on technical leadership with coaching, mentorship, and client engagement, making it ideal for a Principal GCP Data Engineer who enjoys delivering complex solutions while shaping the capabilities of their team and influencing enterprise-wide data strategy.
What You’ll Be Doing as a Principal GCP Data Engineer
As a Principal GCP Data Engineer, you will:
Key Responsibilities
Key Requirements
The successful Principal GCP Data Engineer will bring deep technical expertise, client-facing experience, and leadership skills. You will have:
Why Join
Reference: AMC/AON/PGCPDataEnginer
#aaon
Software Engineer II
Hybrid: Manchester (1 day a week)
Inside IR35: £500
Duration: 6 months
Tech Stack: Java / Spring Boot, SQL Server, AWS, Kubernetes, CI/CD. Node.js/Python and SSIS experience is a bonus.
A technology-driven team is seeking an experienced Software Engineer II to support the design, build and operation of backend services within a financial data environment. This role is ideal for someone who values strong engineering practices, autonomy, and cross-functional collaboration.
You will help develop and operate backend systems that deliver accurate, timely financial data to critical enterprise processes. The role includes hands-on development, production support, and ensuring compliance with key regulatory frameworks such as SOX and GDPR/PII.
Java / Spring Boot
Python & Node.js
Microsoft SQL Server, SQL optimisation
SSIS (bonus)
AWS
Kubernetes, Docker
CI/CD tooling
Infrastructure as Code (TypeScript, AWS CDK)
Exposure to Apache Spark is a plus
If this sounds like you, please apply.
Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.
To find out more about Computer Futures please visit
Computer Futures, a trading division of SThree Partnership LLP, is acting as an Employment Business in relation to this vacancy. Registered office: 8 Bishopsgate, London, EC2N 4BQ, United Kingdom. Partnership Number: OC(phone number removed), England and Wales.
London + 2 or 3 days work from home
Circa £60,000 - £70,000 + Excellent Benefits Package
A fantastic opportunity is available for a Data Engineer who enjoys working in a fast-paced, collaborative, team-playing environment. Our client has been expanding at a remarkable pace and has transformed its technical landscape with leading-edge solutions. Having implemented a new MS Fabric-based data platform, the need is now to scale up and deliver data-driven insights and strategies across the business globally. The Data Engineer will join a close-knit team that is the hub of our client's global data & analytics operation. Previous experience with MS Fabric would be beneficial but is by no means essential. Interested candidates must have experience in a similar role with MS Azure data platforms, Synapse, Databricks, or other cloud platforms such as AWS, GCP, or Snowflake.
Key Responsibilities will include;
* Design, implement, and optimize end-to-end solutions using Fabric components:
  o Data Factory (pipelines, orchestration)
  o Data Engineering (Lakehouse, notebooks, Apache Spark)
  o Data Warehouse (SQL endpoints, schemas, MPP performance tuning)
  o Real-Time Analytics (KQL databases, event ingestion)
* Manage and enhance OneLake architecture, Delta Lake tables, security policies, and data governance within Fabric.
* Build scalable, reusable data assets and engineering patterns that support analytics, reporting, and machine learning workloads.
* Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
* Troubleshoot and resolve data-related issues in a timely manner.
Key Experience, Skills and Knowledge:
* Proven experience (2+ years) as a Data Engineer or in a similar role, with a strong focus on PySpark, SQL, and Microsoft Azure data platforms; Power BI an advantage
* Proficiency in development languages suitable for intermediate-level data engineers, such as:
* Python / PySpark: Widely used for data manipulation, analysis, and scripting.
* SQL: Essential for querying and managing relational databases.
* Understanding of D365 F&O Data Structures is highly desirable
* Strong problem-solving skills and attention to detail.
* Excellent communication and collaboration abilities.
This is a hybrid role based in Central / West London with the flexibility to work from home 2 or 3 days per week. Salary will be dependent on experience and expected to be in the region of £60,000 - £70,000 + an attractive benefits package including bonus scheme.
For further information, please send your CV to Wayne Young at Young’s Employment Services Ltd. YES are operating as both a recruitment Agency and Recruitment Business
Lead AWS Data Engineer | Birmingham | Finance | AWS | Java | Outside IR35 | Contract | 12 Months
Opus is partnered with a financial services client to deliver a major programme of work. You'll join an established engineering group, working alongside internal teams to build a new reporting workflow, upgrade an existing pipeline, and lead a full data-sourcing uplift across multiple reporting workflows. The team will also be responsible for helping upgrade the framework for two critical internal workflows. This role requires you to be on site in either the London or Birmingham office 5 days per week; please only apply if you meet this criterion.
Required Experience:
- Strong background in data engineering within distributed data environments
- Hands-on expertise with AWS, Spark, Glue, and Snowflake
- Experience building and optimising data pipelines and reporting workflows
- Ability to work closely with internal engineering and controls teams
- Experience upgrading or modernising existing workflows and frameworks
- For lead level: prior experience leading engineering teams or workstreams
Tech Stack: AWS (Glue, S3, Lambda, Step Functions), Apache Spark, Snowflake, Java
If you are interested in this role, please apply here or email me your most recent, up-to-date CV along with your availability to (url removed).
Data Platform Engineer – London
(AWS, Apache Spark, AWS Glue, Iceberg, S3, RDS, Redshift, Kafka/MSK, Python, Terraform, Ansible, CI/CD, Jenkins, GitLab, Snowflake, Databricks)
We are working with an established FinTech client in London who is looking for a Data Platform Engineer to play a key role in defining, building, and evolving their enterprise Data Lakehouse platform during an exciting period of growth. You'll work closely with Platform Engineering and Application Engineering teams, taking ownership of the infrastructure, patterns, standards, and tooling used to build and operate data products across the business.
The role focuses on ensuring the data platform is resilient, secure, reliable, and cost-effective within an AWS environment. You'll be responsible for how the platform is operated, maintained, monitored, and extended, with a strong emphasis on observability, fault prevention, and early fault detection across AWS data services.
Automation is central to the way this team works. You’ll design and maintain Infrastructure as Code and Configuration as Code solutions, supported by CI/CD pipelines, to ensure consistent, repeatable deployments and strong governance. You’ll also enhance data lake integration testing, security measures, monitoring, SLAs, and operational metrics.
You'll be working for a tech-driven organisation in a collaborative environment that values engineering best practices. The client is offering this role on a hybrid basis, with the expectation of being in the office a few times per month.
For more information, please get in touch
Prospect is looking for someone who is equally passionate about football and analytics and is excited by the possibilities at the intersection of the two. The ideal candidate will have experience as a problem solver, data engineer, and communicator, preferably with a degree in a quantitative field (such as computer science, engineering, physics, statistics, or applied mathematics). You'll work as part of cross-functional teams to help solve challenges and aid decision-makers across the sporting landscape, from elite professional teams to leagues and broadcasters, applying advanced analytics and modelling techniques.
Requirements:
- A passion for sport, with an understanding of our clients' sporting disciplines or an eagerness to learn about them.
- Strong programming proficiency in Python and SQL querying; experience with relational database platforms.
- Knowledge of cloud technologies; experience with AWS is an advantage (but not a requirement).
- Excellent collaboration and communication skills.
- 4+ years of experience in big-data-related software development, including data modelling, design patterns, and building highly scalable and secure solutions.
- Practical knowledge of software engineering concepts and best practices, such as testing frameworks, packaging, API design, DevOps, DataOps, and MLOps.
- The right to work in the United Kingdom.