Processing Jobs in London
Hedge Fund - Python Developer (Equities) - Trade life cycle - PnL - Kafka - Contract
Scope AT Limited
London
In office
Senior
Private salary

Our hedge fund client is looking for a Python Developer/Engineer for a contract role.

This team is responsible for the firm's equity transaction data platform, including trade life cycle event processing, enrichment, and PnL calculations. The role is ideal for an engineer who enjoys building robust, high-throughput services and data pipelines in a fast-paced, delivery-focused environment.

Principal Responsibilities
Design and develop solutions for trade life cycle event processing, including corporate actions, expiries, and other post-trade events.
Build and operate Python-based services that perform large-scale data transformations and calculations.
Publish and distribute transaction and PnL data using Kafka, including AVRO-based schemas and streaming patterns.
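The lifecycle-and-PnL responsibilities above can be sketched with a minimal average-cost PnL model. This is a pure-Python illustration only; `Fill` and `realized_pnl` are invented names, not the client's actual API, and a production system would handle corporate actions, expiries, and flipped positions far more carefully.

```python
from dataclasses import dataclass

@dataclass
class Fill:
    symbol: str
    qty: int      # positive = buy, negative = sell
    price: float

def realized_pnl(fills):
    """Average-cost realized PnL per symbol over a stream of fills."""
    pos = {}  # symbol -> (net_qty, avg_cost)
    pnl = {}  # symbol -> realized PnL so far
    for f in fills:
        qty, avg = pos.get(f.symbol, (0, 0.0))
        pnl.setdefault(f.symbol, 0.0)
        if qty == 0 or (qty > 0) == (f.qty > 0):
            # Opening or adding to a position: update the weighted-average cost.
            avg = (avg * qty + f.price * f.qty) / (qty + f.qty)
            qty += f.qty
        else:
            # Reducing, closing, or flipping: realize PnL on the closed quantity.
            closed = min(abs(qty), abs(f.qty))
            side = 1 if qty > 0 else -1
            pnl[f.symbol] += side * closed * (f.price - avg)
            qty += f.qty
            if qty == 0:
                avg = 0.0
            elif qty * side < 0:  # flipped through flat
                avg = f.price
        pos[f.symbol] = (qty, avg)
    return pnl
```

In a Kafka-based design, each `Fill` would arrive as an AVRO-encoded event and the resulting PnL updates would be published back to a downstream topic.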

Required Skills
At least six years of professional Python development experience, ideally in capital markets or at a fintech firm.
Experience in finance: understanding of common financial asset classes; knowledge of equities corporate action processing, trade life cycle concepts, and/or P&L calculations is a strong plus.
Experience with Kafka (or equivalent streaming/messaging platforms) and schema-based event publishing (eg, AVRO).
Strong experience performing large-scale data calculations in Python using libraries like pandas, polars, and NumPy.
Experience building REST services using frameworks such as FastAPI and/or Flask.
Strong SQL skills and experience working with relational databases in production environments.
Hands-on experience with containerized deployments and modern infrastructure tooling (Docker, Kubernetes) and familiarity with cloud platforms.
Understanding of modern SDLC practices (testing strategy, CI/CD, release management, observability, and operational ownership).
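As a minimal illustration of the SQL and relational-database requirement, here is a self-contained sketch using Python's built-in sqlite3 module. The table and column names are invented for the example and do not reflect the client's schema.

```python
import sqlite3

# In-memory sketch: store fills in a relational table and compute
# per-symbol net position and gross notional in SQL.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE fills (
        symbol TEXT    NOT NULL,
        qty    INTEGER NOT NULL,   -- positive = buy, negative = sell
        price  REAL    NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO fills (symbol, qty, price) VALUES (?, ?, ?)",
    [("ABC", 100, 10.0), ("ABC", -40, 12.0), ("XYZ", 50, 20.0)],
)

rows = conn.execute("""
    SELECT symbol,
           SUM(qty)              AS net_qty,
           SUM(ABS(qty) * price) AS gross_notional
    FROM fills
    GROUP BY symbol
    ORDER BY symbol
""").fetchall()
```

The same aggregation pattern scales to production relational databases; only the connection and dialect details change.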

Office-based in London, 5 days per week.

Contract role inside IR35

By applying to this job you are sending us your CV, which may contain personal information. Please refer to our Privacy Notice to understand how we process this information. In short, in order to supply you with work-finding services, we will hold and process your personal data, and only with your express permission will we share this personal data with a client (or a third party working on behalf of the client) by email or by upload to the client's or third party's vendor management system. By giving us permission to send your CV to a client, you grant permission to share the personal data necessary to consider your application, interview you (phone/video/face to face) and, if successful, hire you.
Scope AT acts as an employment agency for Permanent Recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the Terms and Conditions, Data Protection Policy, Privacy Notice and Disclaimers which can be found at our website.

Lead Data Platform Engineer - Databricks - IAC - Terraform - Azure Data Factory - Data Lakehouse
Nexere Consulting Limited
London
Remote or hybrid
Senior
£80,000 - £85,000

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities

  • Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
  • Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
  • Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
  • Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
  • Develop and maintain infrastructure-as-code (eg, Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
  • Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
  • Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.
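The infrastructure-as-code responsibility above might look roughly like the following Terraform sketch, assuming the `azurerm` and `databricks` providers. Resource names, the region, and the catalog/schema names are purely illustrative, not the firm's actual configuration.

```hcl
terraform {
  required_providers {
    azurerm    = { source = "hashicorp/azurerm" }
    databricks = { source = "databricks/databricks" }
  }
}

provider "azurerm" {
  features {}
}

# Resource group and Data Factory for orchestration pipelines.
resource "azurerm_resource_group" "data" {
  name     = "rg-data-platform"   # illustrative name
  location = "uksouth"
}

resource "azurerm_data_factory" "adf" {
  name                = "adf-data-platform"
  location            = azurerm_resource_group.data.location
  resource_group_name = azurerm_resource_group.data.name
}

# Unity Catalog objects managed as code.
resource "databricks_catalog" "main" {
  name    = "analytics"
  comment = "Managed via Terraform"
}

resource "databricks_schema" "bronze" {
  catalog_name = databricks_catalog.main.name
  name         = "bronze"
}
```

Keeping ADF pipelines, Databricks assets, and Unity Catalog components in code like this is what enables the repeatable, governed releases the CI/CD responsibility describes.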

Knowledge, Skills & Experience

  • Degree in Computer Science, Data Engineering, or a related field.
  • Proven experience designing and building cloud-based data platforms, ideally within Azure.
  • Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
  • Solid understanding of Data Lakehouse architecture and modern data platform design.
  • Proficiency in Python for data engineering, automation, and data processing.
  • Experience developing and integrating REST APIs for data services.
  • Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
  • Experience with Infrastructure as Code tools such as Terraform or ARM templates.
  • Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
  • Familiarity with monitoring, logging, and alerting tools (eg, Azure Monitor).

Desirable

  • Experience with additional Azure services (eg, Fabric, Azure Functions, Logic Apps).
  • Knowledge of cloud cost optimisation for data platforms.
  • Understanding of data governance and regulatory compliance (eg, GDPR).
  • Experience working in regulated or professional services environments.
Frequently asked questions

What types of Processing jobs are available in London?
Haystack features a variety of Processing-related roles in London, including data processing, payment processing, workflow automation, and transaction processing positions across industries such as finance, retail, and IT services.

Do Processing jobs require certifications?
While some Processing jobs may require certifications such as Six Sigma, ITIL, or specific software knowledge, many positions value practical experience and relevant technical skills. Job listings on Haystack detail any required certifications.

Are remote Processing jobs available?
Yes, Haystack lists both on-site and remote Processing job opportunities in London. You can filter your search to find remote positions that fit your preferences.

How often are new Processing jobs posted?
New Processing jobs are posted regularly on Haystack, often daily. To stay up-to-date, you can set up job alerts specific to Processing roles in London.

What salaries do Processing jobs in London pay?
Salaries for Processing jobs in London vary based on role, experience, and company size. Entry-level positions typically start around £25,000, while senior roles can exceed £70,000 annually. Specific salary details are provided in each job posting.