
Data Engineer

Data Freelance Hub

Remote

GBP 60,000 - 80,000

Full time

18 days ago


Job summary

A technology consultancy is seeking a Data Engineer for a fully remote, one-year contract. Responsibilities include building and maintaining data pipelines, developing ingestion workflows, and implementing Infrastructure as Code. Candidates must have expertise in Azure, Python, and SQL, along with experience on DDaT and GDS government projects. This role suits those looking to contribute to high-impact projects while ensuring data quality and compliance.

Qualifications

  • Experience in building and maintaining data pipelines in Azure.
  • Proven skills in Python and SQL for data transformations.
  • Experience with DDaT and GDS government projects.

Responsibilities

  • Build and maintain data pipelines in Azure (ADF, Databricks, Key Vault).
  • Develop ingestion workflows for APIs and structured data.
  • Implement Infrastructure as Code (Bicep) for Dev/Test environments.
  • Deliver data transformations using Python and SQL.
  • Create data validation and reporting processes.
  • Support integration across various systems.

Skills

  • Azure
  • Python
  • SQL
  • Data pipeline development
  • ETL
  • Infrastructure as Code (IaC) – Bicep
  • Data quality
  • Agile
  • Data access controls
  • GDS / DDaT experience

Tools

  • Azure Data Factory (ADF)
  • Databricks
  • CRM (Dataverse)

Job description

Overview

This is a Data Engineer role on a one-year contract, paying £490/day, fully remote. Key skills include Azure, Python, SQL, and data pipeline development. Experience on DDaT and GDS government projects is required.

Details
  • Contract: 1 year
  • Rate: £490/day
  • Location: United Kingdom; Remote
  • IR35: Inside IR35
  • Start date (approx): January 25, 2026
  • Duration: More than 6 months

Responsibilities
  • Build and maintain data pipelines in Azure (ADF, Databricks, Key Vault).
  • Develop ingestion workflows for APIs, structured data, and CSV sources.
  • Implement Infrastructure as Code (Bicep) for Dev/Test environments.
  • Deliver Bronze and Silver layer transformations using Python and SQL.
  • Create data validation and data quality reporting processes.
  • Support integration across systems (IDS/Postgres, Dataverse CRM, Web App platform).
  • Implement secure, role-based data access controls.
  • Ensure code quality, scalability, and compliance with DfE technical and security standards.

Delivery & Collaboration
  • Contribute to documentation, architecture materials, and production sign-off.
  • Participate in code reviews and adopt Agile delivery practices.
  • Support knowledge transfer and upskilling of internal teams.
  • Work closely with Data Quality and Data Innovation leads to resolve issues.
  • This long-term contract is ideal for someone who has worked on DDaT and GDS government projects.

Keywords
  • ETL
  • Azure Data Factory (ADF)
  • CRM (Dataverse)
  • Databricks
  • Python
  • SQL
  • GDS / DDaT experience
  • Security & Compliance
  • Infrastructure as Code (IaC) – Bicep
  • Data quality
  • Agile
  • Data access controls

Talent International UK and its subsidiaries act as an employment agency for the supply of temporary workers. By applying for this opportunity, you accept the terms and privacy policy available at talentinternational.co.uk.
