Analytics Engineer

Jgasurveyors

City of London

On-site

GBP 65,000 - 75,000

Full time

30+ days ago

Job summary

A global logistics business in London seeks a professional to join a team focused on building and implementing a new Azure Databricks platform. Responsibilities include designing data pipelines, ensuring data quality, and providing analytics solutions. Candidates should have strong SQL and Python skills and experience with data transformation tools. This full-time role requires on-site presence five days a week and offers a salary of £65-75k, depending on experience.

Qualifications

  • Experience designing and implementing scalable data architectures.
  • Deep understanding of data governance principles.
  • Strong problem-solving abilities in data issues.

Responsibilities

  • Collaborate in the design and maintenance of scalable data pipelines.
  • Build and optimise data transformation workflows.
  • Implement automated data quality checks.

Skills

  • Advanced technical proficiency in SQL
  • Advanced technical proficiency in Python
  • Experience with dbt
  • Experience in cloud data platforms (Azure Databricks, Snowflake)
  • Strong software engineering practices (Git, CI/CD)
  • Understanding of data quality frameworks
  • Experience with business intelligence tools (Power BI, Tableau)

Job description

This global logistics business, based in central London, is undergoing an exciting data transformation programme as it invests in a new team charged with building and implementing a new Azure Databricks platform. Working five days a week in the central London office, you'll join this growing team and play a key role in building and deploying robust data infrastructure and analytics solutions using the modern data stack.

London (five days per week on site)

£65-75k (dependent on experience)

Position

Key Responsibilities and Primary Deliverables
  • Collaborate with the Data Engineer to design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks, automating data ingestion, transformation, and processing workflows.
  • Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases.
  • Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented, and version-controlled analytics code.
  • Implement automated data quality checks, monitoring systems, and alerting mechanisms to ensure data reliability and trustworthiness across the analytics platform, linking issues to their business impact.
  • Develop reusable data assets, documentation, and tools that enable business users to independently access and analyse data through Power BI and other visualisation platforms.
  • Work closely with data analysts and business stakeholders to understand requirements and translate them into technical solutions.
  • Create and maintain technical documentation, establish coding standards, and maintain the data catalogue to support governance and compliance requirements.
Skills & Experience
  • Advanced technical proficiency in SQL, Python, and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar).
  • Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse / warehouse concepts, and modern data stack technologies.
  • Strong software engineering practices including version control (Git), CI/CD pipelines, code testing, and infrastructure as code principles.
  • Deep understanding of data quality frameworks, data governance principles, and experience implementing automated monitoring and alerting systems.
  • Analytics platform expertise with hands‑on experience in business intelligence tools (Power BI, Tableau, Looker) and understanding of self‑service analytics principles.
  • Strong problem‑solving abilities with experience troubleshooting complex data issues, optimising performance, and implementing scalable solutions.

McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
