
Remote Data Engineer Jobs

Overview

Find top remote Data Engineer jobs with Haystack, your go-to IT job board for flexible, work-from-anywhere opportunities. Explore the latest openings in data engineering, build scalable data pipelines, and work with cutting-edge technologies—all from the comfort of your home. Start your remote Data Engineer career today!
Filters applied: Remote, Data Engineer
Business Systems Developer
RPS Group Plc
Abingdon
Fully remote
Junior - Mid
Private salary
RECENTLY POSTED
javascript
csharp
java
sql
vault
microsoft-azure
We’re looking for a Business Systems Developer to join our EMEA Business Systems team at Tetra Tech. You’ll be part of a collaborative, international team that values initiative, quality, and clear communication across cultures and time zones.

About the role
The role encompasses a broad range of technologies, and it’s not expected that you’ll have expertise in everything. This role will suit a self-starter who is keen to learn and improve. Although this is a remote-first role, the position is based in the United Kingdom, with the main office in Abingdon, Oxfordshire. Occasional travel to the Abingdon office or other UK offices may be required for team meetings and collaboration sessions.

About You
As a Business Systems Developer, you’ll be responsible for designing, developing and supporting integrations between our global Oracle EBS system and various internal business systems and external services. You’ll support a cross-platform suite of services relying on both the Oracle and Microsoft Azure ecosystems, utilizing technologies including Data Factory, SQL, Java and C#. You’ll work on new initiatives, including application development using Oracle APEX and data analysis in Oracle Analytics Cloud and AutoML, as well as supporting existing systems.

Your Responsibilities
Assist in developing and maintaining integrations between Oracle EBS and internal/external systems using Java and Azure Data Factory
Assist in the development of business applications primarily using Oracle APEX as well as JavaScript frameworks
Support existing applications, services and infrastructure
Write and maintain high-quality technical documentation
Collaborate closely with stakeholders (business leads, IT functional teams) across multiple countries
Provide 2nd and 3rd line support for incidents and service requests
Skills & Experience
Solid experience with SQL and software development processes using Java
Experience with Microsoft Azure services, in particular Data Factory, SQL Server, Logic Apps, Storage Accounts and Key Vault
Experience or working knowledge of JavaScript and knowledge of any JavaScript framework
Experience with Oracle Database
Solid experience with Microsoft DevOps and/or other source control systems
Strong analytical and communication skills
Keen desire to learn and develop skills
Comfortable working in a remote environment with distributed teams
Knowledge of other technologies including C#.Net, Oracle APEX, Oracle Analytics Cloud and SQL Server Integration Services would be advantageous
RPS, a Tetra Tech company
RPS, part of Tetra Tech since January 2023, is a global firm that defines, designs, and manages projects in urbanisation, natural resources, and sustainability. As part of Tetra Tech’s 28,000-strong team across 550 offices in over 120 countries, we deliver solutions that create lasting value in an increasingly urbanised and resource-scarce world. By leveraging our global expertise, we enable our clients to develop winning solutions for their clients and communities.

As a Tetra Tech company, RPS is proud to provide market-leading development and project opportunities for our people, supporting their growth while addressing the challenges that matter. Our people drive our success, and this is where you come to build a career.

What happens next?
If we feel you are a good match, you will be invited to attend a competency-based interview. All applications will be considered. Ready to apply? Please have your CV ready and continue with your application online.
Sr. Data Engineer
Cognizant
London
Remote or hybrid
Senior
Private salary
RECENTLY POSTED
aws
terraform
github
grafana
kafka
python
We are hiring a senior Data Engineer to lead the development of intelligent, scalable data platforms for Industry 4.0 initiatives. This role will drive integration across OT/IT systems, enable real-time analytics, and ensure robust data governance and quality frameworks. The engineer will collaborate with cross-functional teams to support AI/ML, GenAI, and IIoT use cases in manufacturing and industrial environments.

Key Responsibilities
Architect and implement cloud-native data pipelines on AWS or Azure for ingesting, transforming, and storing industrial data.
Integrate data coming from OT systems (SCADA, PLC, MES, Historian) and IT systems (ERP, CRM, LIMS) using protocols like OPC UA, MQTT and REST.
Design and manage data lakes, warehouses, and streaming platforms for predictive analytics, digital twins, and operational intelligence.
Define and maintain asset hierarchies, semantic models, and metadata frameworks for contextualized industrial data.
Implement CI/CD pipelines for data workflows and ensure lineage, observability, and compliance across environments.
Collaborate with AI/ML teams to support model training, deployment, and monitoring using MLOps frameworks.
Establish and enforce data governance policies, stewardship models, and metadata management practices.
Monitor and improve data quality using rule-based profiling, anomaly detection, and GenAI-powered automation.
Support GenAI initiatives through data readiness, synthetic data generation, and prompt engineering.

Mandatory Skills
Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse).
Programming & Scripting: Proficiency in Python, SQL, PySpark etc.
ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge etc.
Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian systems.
Information Modeling: Experience in defining semantic layers, asset hierarchies, and contextual models.
Data Governance: Hands-on experience.
Data Quality: Ability to implement profiling, cleansing, standardization, and anomaly detection frameworks.
Security & Compliance: Knowledge of data privacy, access control, and secure data exchange protocols.
MLOps: Defining and creating MLOps pipelines.

Good to Have Skills
GenAI Exposure: Experience with LLMs, LangChain, HuggingFace, synthetic data generation, and prompt engineering.
Digital Twin Integration: Familiarity with NVIDIA Omniverse, AWS TwinMaker, Azure Digital Twin or similar platforms and concepts.
Visualization Tools: Power BI, Grafana, or custom dashboards for operational insights.
DevOps & Automation: CI/CD tools (Jenkins, GitHub Actions), infrastructure-as-code (Terraform, CloudFormation).
Industry Standards: ISA-95, Unified Namespace (UNS), FAIR data principles, and DataOps methodologies.
Data Engineer
BIOMETRIC TALENT
Preston
Remote or hybrid
Mid - Senior
£70,000
RECENTLY POSTED
airflow
sql
snowflake
dbt
About the Client
Our client is a long-established, service-focused business delivering intelligent, data-driven solutions that help organisations increase efficiency, reduce operational risk, and streamline complex logistical processes. With a strong reputation for reliability and innovation, they serve a diverse portfolio of national and regional clients, many of whom rely on their services as a critical part of day-to-day operations. They pride themselves on combining technology with exceptional service delivery, offering bespoke solutions tailored to the evolving needs of their customers. The organisation continues to invest in digital transformation and strategic partnerships to remain at the forefront of operational excellence. Known for a collaborative and pragmatic culture, they value long-term relationships and continuous improvement.

How you’ll spend your day
You’ll play a key role in building, maintaining, and optimising data pipelines and transformation workflows. Your focus will be on ensuring data integrity, reliability, and performance across the organisation’s cloud-based analytics environment.
Develop and maintain automated data ingestion pipelines using Fivetran.
Implement and manage dbt models for scalable data transformations.
Monitor and optimise Snowflake performance and costs.
Ensure version control and CI/CD best practices for dbt projects.
Set up orchestration and monitor pipeline health.
Troubleshoot and resolve issues to maintain smooth data operations.
Collaborate with BI Analysts and Data Stewards to deliver trusted, compliant datasets.
Support business teams with data availability and workflow optimisation.

What you’ll bring to this role
We’re looking for a technically strong Data Engineer with a proactive, problem-solving approach and solid experience in modern data tools and practices.
Proven experience with SQL and Snowflake performance tuning.
Hands-on expertise with Fivetran and dbt.
Good understanding of data modelling, governance, and security best practices.
Familiarity with orchestration tools such as Airflow or Prefect (advantageous).
Experience working in Azure (or AWS/GCP).
Strong analytical and collaboration skills, with great attention to detail.
A degree in Computer Science, Data Engineering, or relevant experience.

Perks & Benefits:
Death in service cover of 2x basic annual salary
8% pension (5% employee)
25 days holiday plus bank holidays

What happens next?
One of our Recruitment Consultants will be in touch and inform you whether you’ve been successful to the next stage of the process, which is a qualification call where we will tell you more about the role and the client, and understand more about you, your experience and career aspirations. Should we both wish to proceed, we will submit your details to the client and be in touch regarding the outcome and any further steps. The interview process for this client consists of:
Stage 1: Remote technical discussion
Stage 2: Remote competency and culture interview with the CTO

Equal Opportunities
We are committed to providing equal opportunities for all candidates and welcome applications from individuals regardless of age, disability, gender identity, marital status, race, religion or belief, sexual orientation, or any other characteristic protected by law. As an employment agency for permanent and contract hires, we are dedicated to promoting a diverse and inclusive workforce, and we encourage applications from underrepresented groups to drive innovation and equality within the workplace. Should you require any reasonable adjustments, please let us know so we can accommodate any interactions with us at Biometric Talent, and also inform the client to ensure reasonable adjustments are made to allow for a fair and equitable process.
Lead Data Engineer
TPXImpact Holdings Plc
London
Remote or hybrid
Senior
£65,000
RECENTLY POSTED
fabric
aws
git
airflow
sql
dimensions
About The Role
Job level: 10
We’re looking for a Lead Data Engineer to join our Data Engineering and Analytics practice. In this role, you will: lead the design, development, management and optimisation of data pipelines to ensure efficient data flows, recognising and sharing opportunities to reuse data flows where possible; coordinate teams and set best practices and standards when it comes to data engineering principles; and champion data engineering across projects and clients.

Responsibilities
Lead by example, holding responsibility for team culture and for how projects deliver the most impact and value to our clients. Be accountable for the strategic direction, delivery and growth of our work. Lead teams, strands of work and outcomes, owning commercial responsibilities. Hold and manage uncertainty and ambiguity on behalf of clients and our teams. Ensure teams and projects are inclusive through how you lead and manage others. Effectively own and hold the story of our work, ensuring we measure progress against client goals and our DT missions. Work with our teams to influence and own how we deliver more value to clients, working within time and budget constraints. Strategically plan the overall project and apply methods and approaches. Demonstrably share work with wider audiences. Elevate ideas through how you write, speak and present.

Dimensions
Headcount: Typically leads a multidisciplinary team or multiple workstreams (team size 5-15).
Resource complexity: Provides leadership across multiple workstreams or technical domains within a project or programme. Responsible for delivery coordination, prioritisation, and quality, often overseeing more junior leads or specialists.
Problem-solving responsibility: Solves highly complex problems, balancing technical, user, business, and operational needs. Applies expert judgement to make decisions, manage risks, and guide teams through ambiguity.
Change management requirements: Leads or co-leads significant change initiatives.
Responsible for managing stakeholder expectations, supporting adoption, and embedding sustainable ways of working.
Internal/External interactions: Acts as a trusted partner to client and internal stakeholders at multiple levels. Leads workshops, presentations, and stakeholder engagement to ensure buy-in, alignment, and delivery clarity.
Strategic timeframe: Works across mid- to long-term delivery cycles (6-12 months), ensuring that near-term work supports broader programme and client objectives.

About You
Professional knowledge and experience
Essential
Proven experience in data engineering, data integration and data modelling
Expertise with cloud platforms (e.g. AWS, Azure, GCP)
Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks)
Expertise with multiple data analytics tools (e.g. Power BI)
Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling
Proficiency in advanced programming languages (Python/PySpark, SQL)
Experience in data pipeline orchestration (e.g. Airflow, Data Factory)
Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc.)
Ability to communicate technical concepts to both technical and non-technical audiences
Proven experience in delivery of complex projects in a fast-paced environment with tight deadlines
Desirable
Advanced knowledge of data governance, data standards and best practices
Experience in a consultancy environment, demonstrating flexibility and adaptability to client needs
Experience defining and enforcing data engineering standards, patterns, and reusable frameworks
Professional certifications in relevant technologies (e.g. Microsoft Azure Data Engineer, AWS Data Analytics, Databricks Certified Professional Data Engineer)

Skills
Data Development Process
Design, build and test data products that are complex or large scale
Build and lead teams to deliver data integration services and reusable pipelines that meet performance, quality and scalability standards
Collaborate with architects to align solutions with enterprise data strategy and target architectures
Data Engineering and Manipulation
Work with data analysts, engineers and data science and AI specialists to design and deliver products into the organisation effectively
Understand the reasons for cleansing and preparing data before including it in data products, and put reusable processes and checks in place
Access and use a range of architectures (including cloud and on-premise) and data manipulation and transformation tools deployed within the organisation
Optimise data pipelines and queries for performance and cost efficiency in distributed environments
Testing (Data)
Review requirements and specifications, and define system integration testing conditions for complex data products, supporting others to do the same
Identify and manage issues and risks associated with complex data products, supporting others to do the same
Analyse and report system test activities and results for complex data products, supporting others to do the same
Other Skills
Proficiency in developing and maintaining complex data models (conceptual, logical and physical)
Strong skills in data governance and metadata management
Experience with data integration design and implementation
Ability to write efficient, maintainable code for large-scale data systems
Experience with CI/CD pipelines, version control, and infrastructure-as-code (e.g. Git, Azure DevOps)
Strong stakeholder communication skills, with the ability to translate technical concepts into business terms
Ability to mentor junior engineers, foster collaboration, and build a high-performing data engineering culture.

Behaviours and PACT values
Purpose: Be values-driven, recognising that our client’s needs are paramount. Approach client engagements with professionalism and creativity, balancing commercial and operational needs.
Accountability: Be accountable for delivering your part of a project on time and under budget, and for working well with other leaders. Lead by example, promoting a culture where quality and client experience are foremost.
Craft: Balance multiple priorities while leading high-performing teams. Navigate ambiguity and set the technical direction and approach to support positive outcomes.
Togetherness: Collaborate effectively with others across TPXimpact. Build strong relationships with colleagues and clients.

About Us
People-Powered Transformation
We’re a purpose-driven organisation, supporting organisations to build a better future for people, places and the planet. Combining vast experience in the public, private and third sectors with expertise in human-centred design, data, experience and technology, we’re creating sustainable solutions ready for an ever-evolving world. At the heart of TPXimpact, we’re collaborative and empathetic. We’re a team of passionate people who care deeply about the work we do and the impact we have in the world. We know that change happens through people, with people and for people. That’s why we believe in people-powered transformation. Working in close collaboration with our clients, we seek to understand their unique challenges, questioning assumptions and building in their teams the capabilities and confidence to continue learning, iterating and adapting.
Benefits include:
30 days holiday + bank holidays
2 volunteer days for causes that you are passionate about
Maternity/paternity: 6 months maternity leave, 3 months paternity leave
Life assurance
Employer pension contribution of 5%
Health cash plan
Personal learning and development budget
Employee Assistance Programme
Access to equity in the business through a Share Incentive Plan
Green incentive programmes including Electric Vehicle Leasing and the Cycle to Work Scheme
Financial advice
Health assessments

About TPXimpact - Digital Transformation
We drive fundamental change in approaches to product and service development, delivery and technology. Our agile, multidisciplinary teams use technology, design and data to deliver better results, improving outcomes for individuals, organisations and communities. By working in the open, in partnership with our clients, we not only transform their systems and services but also build the capability of their teams, so work can continue without us in the longer term. Our focus is sustainable change, always delivered with positive impact. We’re an inclusive employer, and we care about diversity in our teams. Let us know in your application if you have accessibility requirements during the interview.
Senior Manager - Network Modelling & Optimisation
Robert Walters
East Midlands
Remote or hybrid
Senior
Private salary
RECENTLY POSTED
TECH-AGNOSTIC ROLE
Senior Manager, Network Modelling and Optimisation
Salary: Competitive and based on experience
Location: Melbourne (relocation support provided)

A blue-chip logistics provider is seeking a Senior Manager, Network Modelling and Optimisation to transform one of Australia’s largest logistics networks. This role offers an exciting opportunity to shape the future of a complex logistics system by leveraging advanced mathematical modelling and optimisation techniques. As a key leader, you will drive strategic scenario planning, develop innovative solutions, and deliver actionable insights that enhance operational efficiency and cost-effectiveness. Working within a collaborative and inclusive environment, you’ll have access to cutting-edge tools, professional development opportunities, and the chance to make a tangible impact on both business outcomes and community service.

Key Responsibilities
Develop innovative network models aligned with long-term strategic objectives.
Evaluate network options using advanced optimisation tools and methodologies.
Collaborate with cross-functional teams to ensure practical, sustainable solutions.
Lead automated data processes for site-level volume analysis across multiple aggregation levels.
Construct detailed cost-benefit analyses to support regional strategies and business case development.
Analyse volume flows and cost drivers to identify opportunities for improvement.
Foster a culture focused on safety, wellbeing, customer-centricity, and continuous improvement.

About You
To excel in this role, you will bring:
Proven expertise in quantitative methods within large-scale logistics or consulting environments (PhD preferred).
Strong commercial acumen with hands-on experience in supply chain optimisation or network modelling.
Exceptional leadership skills with experience managing teams toward shared goals while nurturing professional growth.
Advanced proficiency in mathematical modelling tools (e.g., linear programming) and expert use of MS Excel or similar platforms.
Demonstrated ability to build automated processes for large-scale volume modelling at various aggregation levels.
Experience creating detailed cost-benefit analyses for scenario evaluation, including sensitivity testing for capital planning purposes.
Relevant qualifications such as a Master’s or PhD in Data Science, Mathematics, or related fields are highly desirable.

Why Join Us?
This organisation fosters an inclusive workplace built on trust, collaboration, and innovation:
Flexible working arrangements support work-life balance.
Ongoing training ensures career growth tailored to your aspirations.
A strong focus on wellbeing prioritises mental health alongside professional success.
By joining this forward-thinking organisation, you’ll play a pivotal role in driving operational excellence while contributing to positive social impact, connecting people nationwide through innovation.

Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates.
Data Modeller (Finance)
DGH Recruitment Ltd
Multiple locations
Remote or hybrid
Senior - Leader
£70,000 - £76,000
dot-net
python
sql
Data Modeller (Finance) / Data Manager - Modelling

Summary: An exciting opportunity for a qualified, self-motivated, and highly organised individual to lead financial data modelling activities. The role involves managing a small team, driving improvements in financial data modelling processes, and supporting strategic decision-making.

Key Responsibilities
Team Leadership: Supervise and develop a finance team to deliver accurate, high-quality models and reports.
Model Development & Maintenance: Design, build, and maintain financial data models for planning, forecasting, and performance evaluation.
Technology Integration: Collaborate with IT and security teams to adopt modern tools and platforms.
Process Improvement: Streamline modelling processes, enhance automation, and ensure data accuracy.
Governance & Compliance: Maintain documentation, version control, and audit readiness.
Stakeholder Engagement: Provide financial insights and reports to support business decisions.

What You’ll Bring
Technical Skills: Advanced Excel and experience with BI tools; knowledge of VBA, SQL, Python, .NET, and related technologies.
Leadership: Proven ability to manage and motivate teams.
Analytical & Commercial Acumen: Strong financial analysis and strategic thinking.
Communication: Ability to present complex information clearly to senior stakeholders.
Project Management: Experience delivering projects from planning to implementation.

In accordance with the Employment Agencies and Employment Businesses Regulations 2003, this position is advertised based upon DGH Recruitment Limited having first sought approval of its client to find candidates for this position. DGH Recruitment Limited acts as both an Employment Agency and Employment Business.
Power BI Developer - Nottingham / Remote £65k
Akkodis
Nottingham
Fully remote
Mid - Senior
£55,000 - £65,000
fabric
dax
My prestigious client is looking for a Power BI enthusiast to come in and play a key role in shaping their ambitious data strategy. They are incredibly well-known within their sector, with a flawless reputation and an enviable portfolio of clients. If you’re looking to join a company that’s investing heavily in its technology transformation and strategic growth agenda, look no further!

As a business, they are in a great position, with income and growth continually rising year on year. Growth has been both organic, through good placing and hard work in their market, and through some acquisitions too. They have an enviable portfolio of clients, including some huge corporate and public sector clients, and are recognised as a leader in their market on both a local and national scale.

There is currently a huge focus on their technology strategy, and with an ambitious roadmap in place for 2026, they’re now looking for their first dedicated Power BI Developer to take the reins on the design and ongoing development of a range of best-in-class BI solutions.

You will be at the top of your game with a proven track record in using enterprise-level Power BI in a professional services environment. There is also scope for you to get involved in high-level design and complex architecture to help truly shape their strategy. Essentially, you will take the lead in designing their BI solutions moving forward - think plenty of data modelling, DAX and striving to deliver high-quality reporting solutions across various departments. You’ll be the go-to person for all things BI and reporting, upskilling the existing team and inspiring better ways of working!

Naturally you’ll have solid knowledge across Power BI Desktop and Power BI Service, Power Query and DAX. This is key, as you’ll join as the sole Power BI expert in the team and your remit will be to help upskill the wider team too.
Any Microsoft Fabric exposure for scalable datasets would be hugely desirable, as they have a vision to implement Fabric into the business very soon.

What I really like about this role is that it is a newly created, autonomous position and very much a blank canvas: a role you can make your own, and one where you can inspire others whilst shaping the company’s long-term data strategy. This is your chance to join and work for an awesome manager who has a great vision for the company’s data journey, and you’ll play a key role alongside him in shaping the way in which the company utilises data. It’s the type of environment where your voice will be both heard and valued too; they have a great reputation for treating their staff incredibly well. It’s, essentially, a lovely place to work: a close-knit and collaborative team where you’ll be truly supported from day one.

We are flexible on ways of working, but you must be open to visiting their Nottingham-based HQ 1-2 times a month (or whenever needed!) and you can work the rest from home.

Salary up to £65k depending on experience, plus an awesome benefits package including a bonus scheme, a great pension and much more!

I’m looking to shortlist this role ASAP, so if you’re interested, please apply today or contact me directly on (phone number removed) or laura. removed) for immediate consideration.

Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world.
For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
Oracle Batch Developer
Experis
London
Remote or hybrid
Mid - Senior
£510/day
processing-js
gitlab
Role: Oracle Batch Developer
Location: UK remote, with occasional client site visits in either Newcastle or Manchester
Duration: 6 months
Day rate: £510, umbrella only
Candidates are required to have been a UK resident for a minimum of 5 years.

About the Role
We are seeking an experienced Oracle Batch Developer to support a major Government department operating a highly integrated, mission-critical Oracle 19c OLTP environment. This role supports backend workloads that underpin essential business processes across multiple government departments and local authorities. You will play a key role in maintaining, optimising, and modernising the batch processing estate, ensuring the performance, reliability, and stability of a system that supports thousands of users and vital public services.

Key Responsibilities
Develop, maintain, and optimise Pro*C, PL/SQL, and shell script-based batch processes.
Support a high-volume OLTP environment across custom Oracle applications and Oracle E-Business Suite.
Analyse and modernise legacy code, working within multi-disciplinary teams to understand requirements, propose technical options, and prototype solution approaches.
Deliver robust, performant code and conduct performance testing in high-volume, transaction-driven environments.
Contribute to the ongoing enhancement of a mission-critical platform used across numerous government functions.
Essential Skills & Experience
Strong hands-on experience with:
Pro*C
PL/SQL
Shell scripting (bash/ksh)
Experience developing and supporting batch workloads and high-volume OLTP systems within Oracle custom or Oracle Apps/EBS environments.
Proven analytical capability, including:
Legacy code analysis
Working with multidisciplinary teams
Providing technical input into solution design
Rapid prototyping of solution approaches
Ability to deliver highly performant, resilient, and maintainable code.
Experience with performance testing in demanding, high-transaction environments.
Desirable Skills
Experience with Oracle encryption, key management, and DBMS_CRYPTO.
Exposure to containerising workloads and migrating services to AWS EKS.
Experience building and maintaining CI/CD pipelines in GitLab.
Knowledge of AWS services.
Operational experience with:
Autosys (job scheduling)
Dynatrace (observability)
Prometheus (monitoring)
AI Engineer
Haystack - Partnerships
United Kingdom
Fully remote
Mid - Senior
£90,000 - £110,000
python
tensorflow
pytorch
pandas
react
typescript
Haystack is working with a direct employer on this opportunity!
📍 Fully remote, UK

Description
We’re looking for an AI Engineer to join a product-led team building intelligent, data-driven systems that power real-world applications. You’ll play a key role in designing, developing, and deploying AI models and pipelines that deliver meaningful insights and automation at scale. You’ll collaborate closely with data engineers, product teams, and software developers to translate business requirements into production-grade machine learning systems. This is a hands-on role where you’ll contribute to architecture, experimentation, and optimisation — with a focus on delivering robust, explainable, and maintainable AI solutions.

What you’ll do
Design and implement machine learning and AI models to solve practical problems
Build and maintain data pipelines for model training, evaluation, and deployment
Conduct model experimentation, hyperparameter tuning, and performance tracking
Integrate AI models into production systems and APIs
Collaborate with cross-functional teams to identify opportunities for automation and predictive analytics
Stay up to date with emerging trends in ML, AI frameworks, and LLM applications
Core stack & skills
Python (NumPy, Pandas, Scikit-learn, PyTorch or TensorFlow)
Experience building and deploying models via APIs or microservices
Familiarity with cloud ML platforms (AWS Sagemaker, Vertex AI, or Azure ML)
Strong understanding of data preprocessing, feature engineering, and model evaluation
Knowledge of version control (Git) and containerisation (Docker)
Familiarity with prompt engineering, embeddings, and LLM APIs
Nice to have
Experience with vector databases (Pinecone, Weaviate, FAISS)
Exposure to MLOps and CI/CD for model lifecycle management
Knowledge of natural language processing (Transformers, Hugging Face)
Experience fine-tuning or integrating large language models
Experience with full stack development using React and TypeScript
Benefits
Fully remote working within the UK
Flexible hours and autonomy
Annual learning & development budget
Generous holiday allowance and wellbeing support

Frequently asked questions

What types of remote Data Engineer jobs are available on Haystack?
Haystack features a wide range of remote Data Engineer positions, including roles focused on data pipeline development, ETL processes, data warehousing, cloud data engineering, and real-time data processing across various industries.
How can I apply for remote Data Engineer jobs through Haystack?
You can browse remote Data Engineer job listings on Haystack, create a profile, upload your resume, and apply directly through the platform to employers offering remote positions.
Are remote Data Engineer roles on Haystack full-time, part-time, or freelance?
Haystack offers a variety of remote Data Engineer roles including full-time, part-time, contract, and freelance opportunities. You can filter job listings by employment type to find the best fit.
What skills are commonly required for remote Data Engineer positions listed on Haystack?
Common skills for remote Data Engineer jobs include proficiency in SQL, Python or Java, experience with cloud platforms like AWS or Azure, knowledge of big data tools such as Hadoop or Spark, and expertise in data modeling and ETL frameworks.
Does Haystack provide resources to help me prepare for remote Data Engineer interviews?
Yes, Haystack offers interview tips, sample questions, and career advice specifically geared towards Data Engineers seeking remote positions to help you prepare effectively.