Tech Lead - Data Analytics & Data Engineering (AWS)

Data Freelance Hub

Greater London

Hybrid

GBP 60,000 - 80,000

Full time

Job summary

A leading data consultancy in the UK is seeking a Tech Lead for Data Analytics and Engineering to lead a team in delivering AWS-based data platforms for central government projects. This hands-on role requires SC and NPPV3 clearance, along with strong skills in AWS services such as S3, Glue, and Redshift. The candidate will oversee technical direction, ensure engineering standards, and work closely with stakeholders to deliver high-quality data solutions. Competitive daily rates apply, along with flexibility in work location.

Qualifications

  • Active SC and NPPV3 clearance required.
  • Proven experience as a Tech Lead or Lead Data Engineer.
  • Experience working with data lakes and partitioning strategies.

Responsibilities

  • Provide technical leadership to a multi-disciplinary team.
  • Design data transformation patterns using AWS services.
  • Own data architecture and engineering standards.

Skills

  • Strong hands-on experience with Amazon S3
  • Advanced Python (PySpark)
  • Amazon Glue
  • Amazon Redshift
  • CI/CD pipelines experience
  • Strong communication with stakeholders
  • Agile delivery experience

Tools

  • AWS Glue
  • Terraform
  • AWS Lambda

Job description

Tech Lead - Data Analytics & Data Engineering (AWS) - SC+NPPV3

⭐ - Featured Role | Apply direct with Data Freelance Hub

This is a contract role for a Tech Lead - Data Analytics & Data Engineering (AWS), running from 1 April 2026 to 30 November 2026 and paying £550 - £600 per day. It requires SC and NPPV3 clearance, strong AWS skills, and Agile experience.

United Kingdom

#Storage #dbt (data build tool) #Athena #Cloud #Data Ingestion #Data Quality #Data Engineering #S3 (Amazon Simple Storage Service) #Scala #Infrastructure as Code (IaC) #Python #Security #AWS (Amazon Web Services) #IAM (Identity and Access Management) #Terraform #DevOps #Observability #Data Lake #Spark (Apache Spark) #Leadership #Lambda (AWS Lambda) #Redshift #Amazon Redshift #PySpark #Data Architecture #AWS Glue #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Agile

Posted January 28, 2026 | £550 - £600 per day | London | Contract

Tech Lead – Data Analytics & Data Engineering (AWS) – Central Government (Contract)

  • Duration: 1 April 2026 – 30 November 2026
  • Rate: Up to £600 per day
  • IR35 Status: Outside IR35
  • Location: Hybrid / Client site as required
  • Clearance: SC and NPPV3 (both essential)
  • Sector: UK Central Government

We are seeking an experienced Tech Lead – Data Analytics & Data Engineering to support a central government programme delivering secure, scalable AWS‑based data platforms within a highly regulated environment. This engagement is Outside IR35, operating on a project‑based delivery model with a clear focus on outcomes rather than headcount substitution. The role is hands‑on and delivery‑focused, combining technical leadership with deep engineering expertise, working closely with delivery leadership, architects, and senior stakeholders to shape and deliver incremental value across a modern data platform.

The Role

As Tech Lead, you will take ownership of the technical direction and delivery of a small multi‑disciplinary team, spanning data engineering, analytics engineering, and DevOps. You will be accountable for solution design, engineering standards, and delivery assurance across an AWS data platform using S3, Glue, and Redshift.

Key Responsibilities
  • Provide hands‑on technical leadership and mentoring to a small delivery team
  • Design and deliver data ingestion, transformation, and storage patterns using S3, AWS Glue, and Redshift
  • Own data architecture decisions, including data modelling, partitioning strategies, and performance optimisation
  • Define and enforce engineering standards across code quality, testing, CI/CD, IaC, and observability
  • Build and optimise ELT/ETL pipelines using Glue (PySpark), Lambda, Step Functions, and event‑driven patterns (see the sketch after this list)
  • Champion security‑by‑design in collaboration with security and assurance teams (IAM, KMS, encryption, auditability)
  • Implement CI/CD pipelines and Infrastructure as Code using Terraform or CloudFormation
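
As an illustration of the ELT/ETL pipeline work described above, the following is a minimal sketch of a Glue (PySpark) job that ingests raw JSON from S3, applies a simple transformation, and writes partitioned Parquet to a curated zone. It is not taken from the programme itself; the bucket names, prefixes, and column names (`event_id`, `event_ts`, `ingest_date`) are illustrative assumptions.

```python
# Hypothetical AWS Glue (PySpark) job sketch: ingest raw JSON from S3,
# cleanse it, and store it as partitioned Parquet in a curated prefix.
# All bucket names, prefixes, and column names are assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest: raw landing zone (hypothetical bucket and prefix).
raw_df = spark.read.json("s3://example-landing-zone/raw/events/")

# Transform: de-duplicate on an assumed key and derive a partition column.
curated_df = (
    raw_df
    .dropDuplicates(["event_id"])
    .withColumn("ingest_date", F.to_date("event_ts"))
)

# Store: partitioned Parquet in the curated zone (hypothetical bucket and prefix).
(
    curated_df.write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet("s3://example-curated-zone/events/")
)

job.commit()
```

In practice a job like this would be scheduled or triggered via Step Functions or an event‑driven pattern, as the responsibilities above describe.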

About You
  • Active SC and NPPV3 clearance (both required)
  • Proven experience as a Tech Lead or Lead Data Engineer across data analytics and data engineering domains
  • Strong hands‑on experience with Amazon S3 (data lake patterns, partitioning, lifecycle management; see the lifecycle sketch after this list)
  • Amazon Glue (Jobs, Crawlers, PySpark, orchestration)
  • Amazon Redshift (data modelling, performance tuning, WLM, Spectrum)
  • Advanced Python (PySpark) and SQL skills
  • Experience implementing CI/CD pipelines and Infrastructure as Code
  • Proven ability to engage senior stakeholders and clearly communicate technical trade‑offs
  • Experience working in Agile delivery environments
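
As a small illustration of the S3 lifecycle management mentioned above, the sketch below applies a lifecycle policy to a data lake landing bucket with boto3. The bucket name, prefix, and retention periods are hypothetical; real retention rules on a government data platform would follow the programme's data retention policy.

```python
# Hypothetical S3 lifecycle configuration for a data lake landing zone:
# transition ageing raw objects to cheaper storage tiers, then expire them.
# Bucket name, prefix, and day counts are illustrative assumptions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-landing-zone",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "raw-events-retention",
                "Filter": {"Prefix": "raw/events/"},
                "Status": "Enabled",
                # Move raw data to infrequent access after 30 days, Glacier after 90.
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                # Expire raw objects after a year (retention period is an assumption).
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```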

Nice to Have
  • Experience with AWS Lake Formation, Lambda, Step Functions, Athena, EMR, DataBrew
  • Familiarity with data quality, lineage, or observability tooling
  • Knowledge of GDS ways of working and public sector delivery frameworks
  • Experience with privacy‑by‑design, data retention, and FOIA considerations
  • Exposure to dbt, Redshift RA3, or AQUA acceleration

Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients. If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.

