Job Title: Data Engineer (SQL, Python)

This role offers the opportunity to work with very large, complex data sets and influence product decisions for digital products used by hundreds of millions of people every day. You will contribute directly to user growth, retention and experience by building and maintaining robust data warehouse solutions, data models and pipelines. You will join a high-calibre data engineering team solving challenging web and mobile data problems at a scale that few organisations can match, with a strong focus on adoption and buyer personalisation.

Responsibilities
- Maintain and support existing ETL pipelines and data processes running in production, ensuring stability and reliability.
- Work with very large-scale data sets, applying appropriate SQL techniques and performance optimisation strategies.
- Manage data warehouse plans and roadmaps for a product or group of products, ensuring alignment with business priorities.
- Interface regularly with engineers, product managers and product analysts to understand data needs and translate them into technical solutions.
- Build deep data expertise in your areas of ownership and take responsibility for data quality across those domains.
- Design, build and launch new production-grade data models that support analytics, reporting and personalisation use cases.
- Design, build and launch new data extraction, transformation and loading (ETL) processes in production environments.
- Rewrite and refactor data pipelines when upstream tables, schemas or privacy rules change, updating SQL and Python code accordingly.
- Implement changes to data visualisation dashboards, including wholesale reimplementation to meet new visualisation standards and guidelines.
- Contribute to task-focused delivery, taking ownership of tightly scoped backlog items and driving them to completion within agreed timelines.
- Use data analysis to identify deliverables, gaps and inconsistencies, and propose solutions to improve data coverage and quality.
- Communicate data-driven insights and technical considerations clearly and effectively to both technical and non-technical audiences.

Essential Skills
- At least 5 years’ experience with programming languages, with strong hands-on experience in Python.
- At least 5 years’ experience writing efficient, optimised SQL statements for large and complex data sets.
- Hands-on experience with schema design and dimensional data modelling.
- Strong ability to analyse data to identify deliverables, gaps and inconsistencies.
- Ability to understand and work with Python language constructs, including constraints, conditional statements and code structure, in order to read, debug and rewrite existing code.
- Proficiency in building and maintaining ETL pipelines using tools or frameworks similar to Airflow (for example, internal orchestration tools with comparable concepts).
- Experience working with large-scale data sets and applying SQL techniques suitable for high-volume environments.
- Experience with data visualisation and dashboard development, ideally using Tableau.
- Excellent communication skills, including the ability to explain data-driven insights and technical topics clearly to stakeholders.

Additional Skills & Qualifications
- Familiarity with industry-standard data pipeline orchestration tools such as Airflow or similar frameworks.
- Experience working with cross-functional teams including product management, analytics and marketing.
- Strong problem-solving skills and the ability to work independently on task-focused backlog items.
- Attention to detail when implementing changes to dashboards, visualisation standards and reporting guidelines.

Why Work Here?
You will work on cutting-edge data technologies and large-scale systems that operate at a level of complexity and volume matched by few organisations.
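As a flavour of the day-to-day ETL and Python work described above, here is a minimal, purely illustrative sketch of a transform-and-load step. It is not the team's actual codebase: all table, column and function names are hypothetical, and production pipelines of this kind would typically run under an orchestrator such as Airflow.

```python
from collections import defaultdict

# Hypothetical raw event rows, as a pipeline might extract them upstream.
RAW_EVENTS = [
    {"event_id": 1, "user_id": "a"},
    {"event_id": 1, "user_id": "a"},    # duplicate delivery
    {"event_id": 2, "user_id": "b"},
    {"event_id": None, "user_id": "a"}, # malformed record
]

def transform(rows):
    """Drop malformed rows and deduplicate on event_id."""
    seen = set()
    clean = []
    for row in rows:
        if not row.get("event_id") or not row.get("user_id"):
            continue  # skip records missing required keys
        if row["event_id"] in seen:
            continue  # skip duplicate deliveries
        seen.add(row["event_id"])
        clean.append(row)
    return clean

def load(rows):
    """Aggregate event counts per user, as a warehouse summary table might."""
    counts = defaultdict(int)
    for row in rows:
        counts[row["user_id"]] += 1
    return dict(counts)

print(load(transform(RAW_EVENTS)))  # {'a': 1, 'b': 1}
```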
The environment encourages collaboration with some of the brightest minds in data engineering, product and analytics, offering continuous learning and exposure to modern tools and practices.

Work Environment
You will work in a modern, technology-driven environment focused on large-scale web and mobile data. The team uses SQL, Python and advanced data warehousing technologies to process and analyse massive data sets.

Location
London, UK

Trading as TEKsystems. Allegis Group Limited, Maxis 2, Western Road, Bracknell, RG12 1RT, United Kingdom. No. (phone number removed). Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as "Allegis Group"). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands.

If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice, available at (url removed). To access our Online Privacy Notice, which explains what information we may collect, use, share and store about you, and describes your rights and choices about this, please go to (url removed). We are part of a global network of companies and, as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland and the European Economic Area, subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland and the USA. If you would like to exercise your privacy rights, please visit the "Contacting Us" section of our Online Privacy Notice at (url removed)/en-gb/privacy-notices for details on how to contact us.
To protect your privacy and security, we may take steps to verify your identity before proceeding with your request, such as asking for a password and user ID if there is an account associated with your request, or for identifying information such as your address or date of birth. If you are resident in the UK, EEA or Switzerland, we will process any access request you make in accordance with our commitments under the UK Data Protection Act, the EU-U.S. Privacy Shield or the Swiss-U.S. Privacy Shield.