Analytics Engineer

Killik & Co

City of London

On-site

GBP 60,000 - 80,000

Full time

Job summary

A UK financial services firm is seeking an Analytics Engineer to strengthen its data team. You will focus on creating and maintaining reusable data models and developing efficient ELT pipelines with tools such as Databricks and Azure. The role emphasises collaboration, governance, and innovative practices to optimise how data is used for analytics and business intelligence.

Benefits

Generous benefits package
Competitive salary

Qualifications

  • Experience with data modelling in Databricks and Azure.
  • Proficient in building ELT pipelines.
  • Strong understanding of data governance practices.

Responsibilities

  • Build and maintain clean, reusable, and scalable data models.
  • Develop ELT pipelines ensuring quality and performance.
  • Contribute to data governance strategy and documentation.

Skills

Data Modelling
Pipeline Development
Governance
Collaboration
Innovation

Tools

Databricks
Azure
SQL
Python
PySpark
Azure DevOps
GitHub Actions

Job description

Overview

Location: London - various locations. Salary: Competitive salary plus a generous benefits package. Application Deadline: Tuesday, November 25, 2025.

We are looking for an Analytics Engineer to join our Data team and help build the firm’s modern data foundation. You’ll design governed, reusable models and scalable transformation layers using Databricks and Azure, turning raw data into trusted, insight-ready assets. With a focus on data modelling, CI/CD, and interoperability, you’ll enable analytics, AI, and business intelligence across the organisation—driving smarter decisions, automation, and exceptional outcomes.

For the full role specifics and requirements, please view the job description.

Key accountabilities

Data Modelling

  • Build and maintain clean, reusable, and scalable models that transform raw data into curated layers of logic, metrics, and dimensions
  • Define, document, and own business-critical metrics to drive consistency and trust across reporting and AI/ML applications
  • Design data structures (e.g. star/snowflake schemas) in Databricks and Azure environments that optimise query performance and user accessibility (see the sketch after this list)
  • Partner with dashboard developers and analysts to shape models that align to visual and operational use cases
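
As an illustration of the dimensional modelling described above, the sketch below builds a small star schema in PySpark on Databricks. All table and column names (raw.trades, curated.dim_client, curated.fct_trades) are hypothetical placeholders, not details from this role.

    # Minimal star-schema sketch in PySpark on Databricks.
    # All table/column names below are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    raw = spark.table("raw.trades")  # assumed raw ingested layer

    # Dimension: one row per client, deduplicated from the raw feed.
    dim_client = (
        raw.select("client_id", "client_name", "segment")
           .dropDuplicates(["client_id"])
    )
    dim_client.write.mode("overwrite").saveAsTable("curated.dim_client")

    # Fact: trade-grain measures keyed to the client dimension.
    fct_trades = raw.select(
        "trade_id",
        "client_id",
        "trade_date",
        (F.col("quantity") * F.col("price")).alias("gross_value"),
    )
    fct_trades.write.mode("overwrite").saveAsTable("curated.fct_trades")

Keeping metric logic (here, gross_value) inside the curated model is what gives downstream dashboards and AI/ML applications a single, consistent definition to trust.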

Pipeline Development & Deployment

  • Build and maintain efficient ELT pipelines using Databricks (SQL, Python, PySpark), ensuring they are monitored, observable, and recoverable
  • Implement CI/CD workflows for analytics assets using Azure DevOps or GitHub Actions, ensuring reliable, version-controlled deployments
  • Set up robust data validation, alerting, and testing practices to ensure high data quality and transparency (a minimal example follows this list)
  • Collaborate with Data Engineers to ensure upstream data ingestion and structures meet transformation needs
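
The validation practice above might look something like the following: a post-load quality gate whose failure aborts the Databricks job run and so drives alerting. The table name, columns, and thresholds are illustrative assumptions. Run as a step in an Azure DevOps or GitHub Actions pipeline, the same pattern supports the version-controlled, testable deployments the CI/CD point describes.

    # Sketch of a post-load data-quality gate; names and thresholds
    # are assumptions, not specifics from this posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.table("curated.fct_trades")

    checks = {
        "no_null_keys": df.filter(F.col("trade_id").isNull()).count() == 0,
        "positive_values": df.filter(F.col("gross_value") <= 0).count() == 0,
        "non_empty": df.count() > 0,
    }

    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        # Raising aborts the job run, which surfaces in Databricks job
        # monitoring and can trigger failure notifications.
        raise ValueError(f"Data quality checks failed: {failed}")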

Governance & Interoperability

  • Contribute to the firm’s data governance strategy through clear documentation, data contracts, lineage mapping, and metadata capture (see the contract sketch after this list)
  • Enable interoperability with internal systems (CRM, finance, digital platforms) and third-party tools (GA4, ESPs, IMiX) through standardised, API-ready data assets
  • Help define and maintain an internal data dictionary and analytics asset catalogue
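
One lightweight way to make a data contract concrete, offered purely as a sketch: declare the schema that consumers depend on alongside the model, and verify it against the live table. The contract format and all names here are assumptions; the posting does not prescribe one.

    # Hypothetical data contract check: expected column -> Spark SQL type.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    CONTRACT = {
        "trade_id": "bigint",
        "client_id": "bigint",
        "trade_date": "date",
        "gross_value": "double",
    }

    actual = {
        f.name: f.dataType.simpleString()
        for f in spark.table("curated.fct_trades").schema.fields
    }

    violations = {
        col: (want, actual.get(col))
        for col, want in CONTRACT.items()
        if actual.get(col) != want
    }
    if violations:
        raise ValueError(f"Contract violations (expected, actual): {violations}")

A check like this doubles as machine-readable documentation, feeding the internal data dictionary and analytics asset catalogue described above.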

Collaboration & Enablement

  • Act as a subject matter expert and partner to analysts, providing guidance on how to best use and extend curated models
  • Support requirement-gathering and technical discovery sessions with business stakeholders to inform solution design
  • Foster a culture of curiosity, continuous improvement, and modular design thinking within the wider Data & AI team

Innovation & Continuous Improvement

  • Explore opportunities to use AI-assisted tools and code generation for improved development velocity and maintainability
  • Stay abreast of best practices in metadata-driven design, open standards, and data model evolution
  • Help shape and refine our approach to analytics modularisation and downstream consumption by multiple tools and teams

Key Competencies

  • Getting Things Done: Delivers on agreed objectives promptly; prioritises workload; remains professional under pressure.
  • Communication & Sharing Knowledge: Confident, clear, and accurate in all communication; maintains accurate records and makes effective use of new technology.
  • Customer Service: Positive attitude to finding solutions in line with TCF (Treating Customers Fairly) principles; uses customer feedback to improve service.
  • Effectiveness & Adaptability: Able to maintain a high volume of work while striving for continual improvement; understands individual contribution in relation to corporate objectives; presents a positive image and approach to change.
  • Team Working: Shares knowledge, skills, and experience with colleagues; understands team goals; is cooperative and supportive of others.
