AWS Jobs in Milton Keynes
Trending AWS jobs in Milton Keynes
Trainee AI Engineer Placement Programme
ITOL Recruit
Multiple locations
Fully remote
Graduate
£30,000 - £45,000

Trainee AI Engineer – No Experience Needed

Future-proof your career in Artificial Intelligence – starting today. Looking for a career change? Currently employed but want something better? Or maybe you're between jobs and ready for a fresh start? ITOL Recruit's AI Traineeship is designed to get you into one of the fastest-growing industries with zero experience required. Train online at your own pace and land your first AI Engineer role in 1-3 months.

Please note this is a training course and fees apply. Job guaranteed – complete the programme and get a job or get your money back. Our candidates earn £30,000-£45,000.

Why AI?

AI is reshaping every industry you can think of. Healthcare, finance, retail, and manufacturing – they're all scrambling for skilled professionals. The demand far outstrips supply, which means excellent salaries, flexible working arrangements, and genuine job security.

How It Works

  • Step 1 – AI Engineering Fundamentals: Start with the basics of AI, including neural networks and large language models, to build a solid foundation in AI engineering.
  • Step 2 – Data Fundamentals: Understand the data workflow, from collection to cleaning, and learn how to prepare data for AI applications.
  • Step 3 – Notebooks & IDEs: Get hands-on with industry-standard tools like Jupyter Notebooks and VS Code to develop AI systems.
  • Step 4 – Python Programming: Master Python, covering everything from the basics to object-oriented programming (OOP).
  • Step 5 – Python Streamlit Project: Apply your Python skills by building a car price prediction app using Python and Streamlit.
  • Step 6 – Python for Data: Learn essential Python libraries like NumPy, Pandas, and Matplotlib for data manipulation and visualisation.
  • Step 7 – AI Sentiment Analysis Project: Work with Hugging Face to build a sentiment analysis classifier using real-world AI techniques.
  • Step 8 – AI Prompt Engineering: Master prompt engineering, learning how to craft effective prompts for controlling AI outputs.
  • Step 9 – Retrieval-Augmented Generation (RAG): Learn how to integrate external knowledge into AI systems using RAG techniques and vector databases.
  • Step 10 – AI Specialised Customer Service Chatbot Project: Combine prompt engineering and RAG to build an AI-powered customer service chatbot, delivering intelligent responses using vector databases and knowledge bases.
  • Step 11 – Machine Learning Fundamentals: Understand machine learning principles and algorithms, and how to train and test models using scikit-learn.
  • Step 12 – Machine Learning Project: Put your machine learning knowledge into practice with a hands-on project.
  • Step 13 – AI & Data Ethics: Study the ethical considerations in AI, including issues of bias, fairness, and data privacy.
  • Step 14 – Oral Exam: Complete a virtual oral exam to assess your understanding and ability to apply your learning.
  • Step 15 – AWS Certified Cloud Practitioner: Finish with the AWS Certified Cloud Practitioner course and exam to gain essential cloud computing knowledge.

What You Get

  • 100% online, self-paced training
  • Microsoft AI-900 certification included
  • 1-to-1 tutor and recruitment support
  • Real-world project experience
  • Job guarantee – get a job or your money back
  • Starting salary of £30,000–£45,000

We Get You Hired!

We're not new to this. ITOL Recruit has 15+ years of experience and has placed over 5,000 people into new roles. Our job programmes include certified tutors, UK-accredited qualifications, and one-on-one support from a recruitment adviser focused on placing you. We don't believe in empty promises. Complete our programme, follow the process, and if you don't land a job, you get your money back.

"Five months from complete beginner to AI engineer. Best decision I ever made." – Jamie W., now working as a Junior AI Engineer in London

Ready to Start?

If you're motivated, curious, and excited about technology, we'll help you turn that into a career you can be proud of.
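The retrieval half of Step 9 can be sketched in miniature. The snippet below is an illustrative sketch, not ITOL course material: it uses bag-of-words vectors and cosine similarity in place of a real embedding model and vector database, and the "knowledge base" sentences are made up for the example.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-frequency vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, knowledge_base: list[str]) -> str:
    # The "R" in RAG: pick the document most similar to the query.
    q = embed(query)
    return max(knowledge_base, key=lambda doc: cosine(q, embed(doc)))

knowledge_base = [
    "Refunds are processed within 5 working days.",
    "Our support line is open 9am to 5pm on weekdays.",
    "Orders ship from the Milton Keynes warehouse.",
]

context = retrieve("when is the support line open", knowledge_base)
# In a full RAG pipeline this context would be spliced into the LLM prompt.
prompt = f"Answer using this context: {context}"
print(context)
```

A production system would replace `embed` with a learned embedding model and `retrieve` with an approximate nearest-neighbour search over a vector database, but the augmentation step (stuffing retrieved context into the prompt) is the same shape.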
Apply now, and one of our expert Career Advisors will be in touch within 4 working hours to guide you through your next steps.

Senior Data Platform Engineer
easyJet
Luton
Hybrid
Senior - Leader
Private salary

With a big investment in Databricks, and a large amount of interesting data, this is your chance to be part of an exciting transformation in the way we store, analyse and use data in a fast-paced organisation. You will join as a Senior Platform Data Engineer providing technical leadership to the Data Engineering team. You will work closely with our Data Scientists and business stakeholders to ensure value is delivered through our solutions.

Job Accountabilities

  • Develop robust, scalable data pipelines to serve the easyJet analyst and data science community.
  • Apply highly competent, hands-on expertise in relevant Data Engineering technologies, such as Databricks, Spark, the Spark API, Python, SQL Server and Scala.
  • Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing specific business challenges and opportunities.
  • Coach and mentor the team (including contractors) to improve development standards.
  • Work with Business Analysts to deliver against requirements and realise business benefits.
  • Build a documentation library and data catalogue for developed code/products.
  • Oversight of project deliverables and code quality going into each release.

Requirements of the Role

Key Skills Required

  • Technical ability; has a high level of current technical competence in relevant technologies, and is able to independently learn new technologies and techniques as our stack changes.
  • Clear communication; can communicate effectively in both written and verbal forms with technical and non-technical audiences alike.
  • Complex problem-solving ability; structured, organised, process-driven and outcome-oriented. Able to use historical experiences to help with future innovations.
  • Passionate about data; enjoy being hands-on and learning about new technologies, particularly in the data field.
  • Self-directed and independent; able to take general guidance and the overarching data strategy and identify practical steps to take.

Technical Skills Required

  • Significant experience designing and building data solutions on a cloud based, big data distributed system.
  • Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD), and software deployment automation with GitHub actions or Azure DevOps.
  • Experience in testing automation of data transformation pipelines, using frameworks like Pytest or dbt Unit Test.
  • Comfortable writing efficient SQL and debugging.
  • Data warehouse operations and tuning experience, including schema evolution, indexing and partitioning.
  • Hands-on IaC development experience with Terraform or CloudFormation.
  • Understanding of ML development workflow and knowledge of when and how to use dedicated hardware.
  • Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam).
  • Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture.
  • Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez, and with data drift detection and alerting.
  • Understanding of Data Management principles (security and data privacy) and how they can be applied to Data Engineering processes/solutions (e.g. access management, data privacy, and handling of sensitive data under GDPR).
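The testing-automation requirement above can be illustrated with a small Pytest-style example. The `clean_fares` transformation and its columns are hypothetical (not easyJet code); the point is the shape of the tests, which Pytest discovers by the `test_` prefix and runs on every CI push.

```python
def clean_fares(rows: list[dict]) -> list[dict]:
    """Hypothetical transformation step: drop rows with a missing fare
    and normalise currency codes to upper case."""
    cleaned = []
    for row in rows:
        if row.get("fare") is None:
            continue
        cleaned.append({**row, "currency": row["currency"].upper()})
    return cleaned

# Pytest collects functions named test_*; each assertion pins down one
# behaviour of the transformation, so regressions fail the build.
def test_drops_rows_with_missing_fare():
    rows = [{"fare": 49.99, "currency": "gbp"},
            {"fare": None, "currency": "eur"}]
    assert clean_fares(rows) == [{"fare": 49.99, "currency": "GBP"}]

def test_empty_input_yields_empty_output():
    assert clean_fares([]) == []
```

Run locally with `pytest`; wiring the same command into GitHub Actions or Azure DevOps gives the CI/CD gate the listing asks for. The same pattern scales up to Spark DataFrames by asserting on small, fixed input frames.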

Desirable Skills

  • Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam.
  • Understanding of the challenges faced in the design and development of a streaming data pipeline, and of the different options for processing unbounded data (pub/sub, message queues, event streaming, etc.).
  • Understanding of the most commonly used Data Science and Machine Learning models, libraries and frameworks.
  • Knowledge of the development lifecycle of analytical solutions using visualisation tools (e.g. Tableau, Power BI, ThoughtSpot).
  • Hands-on development experience in an airline, e-commerce or retail industry.
  • Worked within the AWS cloud ecosystem.
  • Experience of building a data transformation framework with dbt.
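The core idea behind the streaming items above (processing unbounded data by grouping events into fixed windows) can be sketched in plain Python. This is a toy stand-in for what Spark Streaming or Beam provide at scale; the route codes and timestamps are invented for illustration.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group a stream of (timestamp, key) events into fixed-size
    event-time windows and count occurrences per key. Real engines
    (Spark Streaming, Flink, Beam) add watermarks for late data and
    fault tolerance on top of this same grouping."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # tumbling window bucket
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

# Simulated unbounded stream: (epoch seconds, booking route).
events = [
    (0, "LTN-BCN"), (30, "LTN-BCN"), (59, "LGW-AMS"),
    (60, "LTN-BCN"), (95, "LGW-AMS"),
]
print(tumbling_window_counts(events))
# The first three events fall in the window starting at 0,
# the last two in the window starting at 60.
```

The key design point the interviews tend to probe: because the stream never ends, you cannot "wait for all the data"; windowing turns an unbounded problem into a series of bounded ones, and the hard part becomes deciding when a window is complete.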