We are rapidly expanding our focus on data, both internally and within our products and services. We are now looking for more help in this area to keep up with the growing demands of a dynamic, data-driven organisation. We need someone who wants to get stuck in and get things done! The role requires knowledge and understanding of some data and technical concepts and tools, but we do not expect candidates to have extensive professional experience in all of these areas. This role will give you hands-on experience in multiple areas, making it an amazing opportunity to learn very quickly.
Responsibilities
- Develop, maintain, and optimise ETL processes for data extraction, transformation, and loading
- Create and manage data models and data warehousing solutions
- Write and maintain structured queries, as well as content-scraping projects
- Utilise programming languages such as Python and SQL for data processing tasks
- Integrate apps and services, connecting to internal and third-party APIs
- Collaborate with cross-functional teams to ensure seamless integration of data processes
- Optimise data pipelines for performance and efficiency
- Work closely with data scientists and analysts to support their data needs
- Build pipelines to transform raw, unstructured data into useful information by leveraging AI and LLMs
Requirements
- Proven experience in data engineering, with proficiency in designing and implementing scalable data architectures
- Strong experience with ETL processes, data modelling, and data warehousing (we use Airflow, dbt and Redshift)
- Expertise in database technologies, both relational (SQL) and NoSQL
- Knowledge of cloud platforms (AWS)
- Solid understanding of data security measures and compliance standards
- Excellent Python experience
- Collaborative skills to work closely with data scientists and analysts
- Ability to optimise data pipelines for performance and efficiency
- Ability to build, test and maintain tasks and projects
- Experience with version control systems such as Git
It would be nice if you had:
- Experience with Airflow and/or dbt
- Experience working in an Agile environment using Scrum/Kanban
- Hands-on experience working within a DevOps environment