Roles
Dimensions Jobs
Overview
Looking for top Dimensions jobs? Discover the latest IT career opportunities featuring Dimensions roles on Haystack. Whether you're an experienced developer or a project manager, our curated job listings help you find the perfect Dimensions position to advance your tech career. Start your job search today and connect with leading companies hiring skilled professionals in Dimensions technology.
Lead Enterprise AI Engineer
SMS
Cardiff
Fully remote
Senior
Private salary
RECENTLY POSTED

Why choose us?

Choosing to work for SMS means choosing to make a difference. We are changing how businesses and consumers use energy for the better, helping to achieve a greener, more sustainable, and more affordable energy system for everyone. Through our range of innovative energy solutions, we are delivering the future of smart energy. Working closely with private and public sector partners, we are playing a critical role in transforming and decarbonising the UK economy by 2050.

What’s in it for you?

  • 25 personal holiday days per year (plus 8 public holidays), increasing to 30 personal days after 5 years of service (with options to buy and sell)
  • Hybrid working options (for some positions).
  • Enhanced maternity, paternity, and adoption leave.
  • 24/7 free and confidential employee assistance service.
  • Simply Health plan offering cashback on everyday healthcare such as optical, dental, and physio treatments, plus discounted gym memberships and a free 24/7 online GP.
  • Life Insurance (4 x annual salary)
  • Pension matching scheme (up to 5% of salary)


What’s the role?

Step into a role where strategy meets engineering excellence. As our Lead Enterprise AI Engineer, you'll be the driving force turning bold business opportunities into production-ready AI solutions. You'll shape intelligent agents, craft next-generation conversational experiences using Microsoft Copilot and Databricks Mosaic AI, and architect the Semantic Layer that ensures our models deliver accurate, trusted insights every time.

If you're ready to take ownership of an organisation's AI future and build the systems that make it real, this is your stage.

You will report to the Data, Analytics and AI Director and work remotely on a full-time, 40-hour contract.

Please note: travel is required for face-to-face meetings with your line manager.

Key responsibilities:

AI Solution Development & Agent Building

  • Design, build, and deploy low-code and pro-code AI agents using Microsoft Copilot tools to automate business workflows (e.g., HR queries, IT support, operational data retrieval).
  • Develop custom RAG (Retrieval-Augmented Generation) solutions within Databricks to allow LLMs to reason over proprietary SMS documents and data.
  • Integrate AI agents with enterprise systems (Dynamics 365, SharePoint, etc.) via APIs and Power Automate connectors.
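
The RAG bullet above follows a simple pattern: retrieve the internal documents most relevant to a question, then ground the LLM's prompt in them. Below is a minimal, library-free sketch of that flow; the corpus, keyword-overlap scoring, and prompt template are illustrative stand-ins, not SMS's actual Databricks stack:

```python
# Toy retrieval-augmented generation (RAG) flow: score documents against
# the user's question, then build a grounded prompt for the LLM.
# A production system would use a vector store and embeddings instead.

def score(query: str, doc: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: dict, k: int = 2) -> list:
    """Return the k document names most relevant to the query."""
    ranked = sorted(corpus, key=lambda name: score(query, corpus[name]), reverse=True)
    return ranked[:k]

def build_prompt(query: str, corpus: dict) -> str:
    """Assemble an LLM prompt grounded in the retrieved documents."""
    context = "\n".join(corpus[name] for name in retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Illustrative proprietary documents (hypothetical content).
corpus = {
    "hr_policy": "Annual leave is 25 days plus public holidays.",
    "it_support": "Password resets are handled via the service desk portal.",
}

prompt = build_prompt("How many days of annual leave do I get?", corpus)
```

The same shape holds at scale: swap the scoring function for embedding similarity and the dict for a governed document index, and the prompt-assembly step stays essentially unchanged.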

Semantic Modelling & Databricks Genie Curation

  • Own the creation and maintenance of Databricks Genie Spaces. This involves translating complex database schemas into business-friendly semantic models.
  • Define and govern standard metrics, dimensions, and synonyms within Unity Catalog to ensure the AI “speaks the language of the business.”
  • Continuously monitor Genie performance, reviewing “human feedback” on answers to refine the semantic model and improve accuracy over time.
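
To make the "speaks the language of the business" idea concrete, here is a small sketch of what a semantic model buys you: business terms and their synonyms resolve to one governed metric definition instead of ad-hoc SQL. The metric names, synonyms, and SQL fragments are invented for illustration and do not reflect real SMS schemas or the Unity Catalog API:

```python
# Sketch of a semantic layer's core mapping: business vocabulary
# (canonical metric names plus synonyms) -> a single governed definition,
# so every question phrased in business terms hits the same trusted metric.

SEMANTIC_MODEL = {
    "revenue": {
        "sql": "SUM(invoice_line.net_amount)",
        "synonyms": {"turnover", "sales", "income"},
    },
    "active_meters": {
        "sql": "COUNT(DISTINCT meter.id)",
        "synonyms": {"live meters", "installed base"},
    },
}

def resolve_metric(term: str):
    """Resolve a business term (canonical name or synonym) to its governed SQL."""
    term = term.lower().strip()
    for name, spec in SEMANTIC_MODEL.items():
        if term == name or term in spec["synonyms"]:
            return spec["sql"]
    return None  # unknown term: surface to the curator rather than let the AI guess
```

Returning `None` for unknown terms mirrors the feedback loop described above: misses are the signal that the semantic model needs a new synonym or metric.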

Business Engagement & Prototyping

  • Partner with business stakeholders (Finance, Operations, Commercial) to decompose high-level use cases into technical requirements.
  • Rapidly prototype AI solutions to demonstrate value and gain quick traction within business units.
  • Act as a technical evangelist, demonstrating to non-technical teams how to interact with Genie spaces and Copilots effectively.

AI Operations (LLMOps) & Governance

  • Implement monitoring frameworks to track the cost, latency, and quality of AI model outputs.
  • Ensure all AI solutions adhere to the organisation’s data governance and security standards (e.g., preventing data leakage via LLMs).
  • Manage the lifecycle of AI models and agents from development through to production and retirement.

To be considered for this role, we would love you to have:

  • Degree in Computer Science, Data Engineering, Artificial Intelligence, or related field, or equivalent industry experience.
  • Deep hands-on experience with Azure/Microsoft Fabric ecosystem (specifically Copilot Studio & Power Platform) and Databricks (SQL, Unity Catalog, Mosaic AI).
  • Strong ability to design data models for analytics. Experience with defining metrics layers (e.g., DBX semantic layer or Databricks Genie) is essential.
  • Practical experience with Large Language Models (LLMs), Prompt Engineering, and RAG architectures.
  • Proficient in Python (for data manipulation and API interaction) and SQL (for data modelling).
  • Experience working with REST APIs and connecting disparate business systems.
  • Ability to explain technical AI concepts to non-technical business users and translate their feedback into code.

#LI-Remote

Embedded Ada Software Engineer
Line Up Aviation
Bristol
In office
Mid - Senior
£65/hour
RECENTLY POSTED

An opportunity has arisen with my client for an Embedded Ada Software Engineer to join them on an initial 12-month contract. As the Embedded Ada Software Engineer, you will work across the entire software engineering lifecycle, from discussing requirement changes with the Systems team through to qualification and software releases.

Role: Embedded Ada Software Engineer
Pay: £68 per hour via Umbrella Company
Location: Bristol
Contract: 12 Months
Hours: Monday - Friday, 37 hours per week
Security Clearance: Security Clearance required to start; UK Eyes Only project

Responsibilities

  • Develop and maintain safety-critical command and control (C2) software for advanced maritime and land-based missile systems, ensuring compliance with rigorous industry and defence standards.
  • Collaborate with Systems Engineering teams to review, analyse, and implement evolving system requirements throughout the full software development lifecycle.
  • Design, code, test, and verify Ada-based software solutions, maintaining high standards of quality, documentation, and traceability in a safety-critical environment.
  • Support software qualification, validation, and release activities, including integration testing and compliance with safety and security regulations.
  • Apply formal design methodologies and tools (e.g., DOORS, UML/SysML, Rhapsody, Dimensions) to ensure structured development, configuration control, and full requirements traceability.

Essential Experience

  • Solid background in safety-critical software from the defence, aerospace, rail, nuclear, or medical sectors.
  • The development is safety-critical, so a high standard of coding, process, and documentation is required.
  • Formal design methods and tools: DOORS, Dimensions, Rhapsody/UML/SysML/MASCOT.
  • Experience in developing Linux and networking applications.

If you are interested in applying for this position and you meet the requirements, please send your updated CV to: Natalie Dalkin at Line Up Aviation
Line Up Aviation has carved its own place in the recruitment of Aviation and Aerospace personnel all over the world for more than 30 years. We work with some of the industry’s best-known companies who demand the highest standard of applicants.

PySpark Developer
Randstad Digital
London
Fully remote
Senior - Leader
£300/day - £350/day

Lead PySpark Engineer (Cloud Migration)

Role Type: 5-Month Contract

Location: Remote (UK-Based)

Experience Level: Lead / Senior (5+ years PySpark)

Role Overview

We are seeking a Lead PySpark Engineer to drive a large-scale data modernisation project, transitioning legacy data workflows into a high-performance AWS cloud environment. This is a hands-on technical role focused on converting legacy SAS code into production-ready PySpark pipelines within a complex financial services landscape.

Key Responsibilities

  • Code Conversion: Lead the end-to-end migration of SAS code (Base SAS, Macros, DI Studio) to PySpark using automated tools (SAS2PY) and manual refactoring.
  • Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS.
  • Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness.
  • Quality Assurance: Implement clean coding principles, modular design, and robust unit/comparative testing to ensure data accuracy throughout the migration.
  • Engineering Excellence: Maintain Git-based workflows, CI/CD integration, and comprehensive technical documentation.

Technical Requirements

  • PySpark (P3): 5+ years of hands-on experience writing scalable, production-grade PySpark/Spark SQL.
  • AWS Data Stack (P3): Strong proficiency in EMR, Glue, S3, Athena, and Glue Workflows.
  • SAS Knowledge (P1): Solid foundation in SAS to enable the understanding and debugging of legacy logic for conversion.
  • Data Modeling: Expertise in ETL/ELT, dimensions, facts, SCDs, and data mart architecture.
  • Engineering Quality: Experience with parameterisation, exception handling, and modular Python design.
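
The data-modelling bullet mentions SCDs (slowly changing dimensions). As a quick refresher on what a Type 2 SCD does, here is a minimal in-memory sketch: when a tracked attribute changes, the current row is expired and a new versioned row is opened, so history is preserved rather than overwritten. Field names (`customer_id`, `segment`, validity columns) are illustrative, not the client's schema:

```python
from datetime import date

def apply_scd2(dim_rows: list, incoming: dict, today: date) -> list:
    """Apply one source record to a dimension table, Type 2 style."""
    current = next(
        (r for r in dim_rows
         if r["customer_id"] == incoming["customer_id"] and r["is_current"]),
        None,
    )
    if current is None:
        # New entity: insert first version.
        dim_rows.append({**incoming, "valid_from": today, "valid_to": None, "is_current": True})
    elif current["segment"] != incoming["segment"]:
        # Tracked attribute changed: expire the old row, open a new version.
        current["valid_to"] = today
        current["is_current"] = False
        dim_rows.append({**incoming, "valid_from": today, "valid_to": None, "is_current": True})
    return dim_rows

rows = []
rows = apply_scd2(rows, {"customer_id": 1, "segment": "retail"}, date(2024, 1, 1))
rows = apply_scd2(rows, {"customer_id": 1, "segment": "commercial"}, date(2024, 6, 1))
```

In the migration itself the same logic would typically be expressed as a PySpark merge over the dimension table, but the row-versioning semantics are identical.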

Additional Details

  • Industry: Financial Services experience is highly desirable.
  • Working Pattern: Fully remote with internal team collaboration days.
  • Benefits: 33 days holiday entitlement (pro-rata).

Randstad Technologies is acting as an Employment Business in relation to this vacancy.

Frequently asked questions

What types of Dimensions roles does the job board feature?
Our job board features a wide range of Dimensions-related positions, including roles in software development, data analysis, database administration, infrastructure management, and IT consulting, all focused on IBM Cognos TM1/IBM Planning Analytics and IBM DB2 Dimensions software.

How do I apply for a Dimensions job?
To apply for a Dimensions job, simply create an account, upload your resume, and click the 'Apply' button on the job listing. You may be redirected to the employer's application site or be able to submit your application directly through our platform.

Are remote, freelance, or contract Dimensions jobs available?
Yes, our job board includes a variety of remote, freelance, and contract opportunities for Dimensions professionals. You can filter job listings by work type to find positions that match your preferred work arrangement.

What skills are typically required for Dimensions roles?
Typical requirements include proficiency in IBM Cognos TM1 or IBM Planning Analytics, experience with IBM DB2 Dimensions, strong analytical and problem-solving skills, familiarity with data modeling and database management, and knowledge of relevant programming languages such as SQL, Python, or Java.

Can I receive alerts for new Dimensions jobs?
Absolutely! You can subscribe to customized job alerts by setting your preferences for job type, location, and keywords like 'Dimensions'. This way, you'll receive email notifications whenever new relevant jobs are posted.