This global logistics business, based in central London, is undergoing an exciting data transformation programme as it invests in a new team charged with building and implementing an Azure Databricks platform. Working five days a week in the central London office, you'll join this growing team and play a key role in building and deploying robust data infrastructure and analytics solutions using the modern data stack.
London (five days per week on site)
£65-75k (dependent on experience)
Position
Key Responsibilities and Primary Deliverables
- Collaborate with the Data Engineer to design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks, automating data ingestion, transformation, and processing workflows.
- Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases.
- Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented, and version-controlled analytics code.
- Implement automated data quality checks, monitoring systems, and alerting mechanisms to ensure data reliability and trustworthiness across the analytics platform, linking issues to the business impact.
- Develop reusable data assets, documentation, and tools that enable business users to independently access and analyse data through Power BI and other visualisation platforms.
- Work closely with data analysts and business stakeholders to understand requirements and translate them into technical solutions.
- Create and maintain technical documentation, establish coding standards, and curate the data catalogue to support governance and compliance requirements.
Skills & Experience
- Advanced technical proficiency in SQL, Python, and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar).
- Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse/warehouse concepts, and modern data stack technologies.
- Strong software engineering practices including version control (Git), CI/CD pipelines, code testing, and infrastructure as code principles.
- Deep understanding of data quality frameworks, data governance principles, and experience implementing automated monitoring and alerting systems.
- Analytics platform expertise with hands-on experience in business intelligence tools (Power BI, Tableau, Looker) and understanding of self-service analytics principles.
- Strong problem-solving abilities with experience troubleshooting complex data issues, optimising performance, and implementing scalable solutions.
McGregor Boyall is an equal opportunities employer and does not discriminate on any grounds.