AWS Data Engineer
Jgasurveyors
City of London (Hybrid)
£75,000 - £90,000
Full time
Posted 30+ days ago

Job summary

A fintech organisation in Central London is seeking an AWS Data Engineer to join their innovative team. This full-time role offers a competitive salary range of £75,000 - £90,000 and is designed for a mid-senior level professional. Responsibilities include designing scalable data architectures and optimising data pipelines. The ideal candidate has strong experience in data engineering, Python, SQL, and AWS, with a proactive problem-solving mindset. Join a culture that values innovation and collaboration.

Qualifications

  • Proven experience in data engineering and building scalable data solutions.
  • Strong experience with ETL processes, data modelling, and data warehousing.
  • Expertise in relational (SQL) and NoSQL database technologies.

Responsibilities

  • Design, develop, and maintain scalable data architectures and ETL pipelines.
  • Build and manage data models and data warehouse solutions.
  • Collaborate with data scientists, analysts, and engineering teams.

Skills

Data engineering experience
ETL processes
Python
SQL
AWS
Data modelling
Data warehousing
Collaboration skills
Problem-solving mindset

Tools

Airflow
dbt
Git

Job description

AWS Data Engineer at John Goddard Associates

Location: Central London (Hybrid – 3 days on‑site)

Salary: £75,000 - £90,000

We're proud to partner with a high‑growth fintech on the lookout for an AWS Data Engineer to join their fast‑paced, data‑driven organisation. This role is a great opportunity for someone who's eager to make an impact and get hands‑on with modern tools.

What You'll Be Doing
  • Design, develop, and maintain scalable data architectures and ETL pipelines (a brief illustrative sketch follows this list)
  • Build and manage data models and data warehouse solutions (Airflow, dbt, and Redshift)
  • Write clean, efficient Python and SQL code for data processing and transformation
  • Integrate data from internal and third‑party APIs and services
  • Optimise data pipelines for performance, scalability, and reliability
  • Collaborate with data scientists, analysts, and engineering teams to support business needs
  • Implement and uphold data security and compliance standards
  • Use version control systems (e.g. Git) to manage and maintain project codebases
  • Contribute to the continuous improvement of data processes and tooling across the organisation
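
As a rough illustration of the kind of pipeline work described above, the sketch below shows a minimal Airflow DAG in Python. It is only a sketch, assuming Airflow 2.4+ with the TaskFlow API; the DAG name, task logic, endpoint, and sample data are hypothetical and not taken from this posting.

  # A minimal sketch, assuming Airflow 2.4+ with the TaskFlow API.
  # The DAG name, task bodies, and sample records are hypothetical.
  from datetime import datetime

  from airflow.decorators import dag, task


  @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
  def example_fintech_pipeline():
      @task
      def extract():
          # A real pipeline would pull from an internal or third-party API here.
          return [{"account_id": 1, "balance": 100.0}]

      @task
      def transform(records):
          # Simple clean-up/validation step in plain Python.
          return [r for r in records if r["balance"] >= 0]

      @task
      def load(records):
          # A real implementation would write to the warehouse (e.g. Redshift);
          # here it only logs the row count.
          print(f"Loaded {len(records)} records")

      load(transform(extract()))


  example_fintech_pipeline()

In practice the load step would typically land data in the warehouse (for example Redshift), with dbt handling downstream modelling, but those details depend on the team's stack.
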
Experience Required
  • Proven experience in data engineering and building scalable data solutions
  • Strong experience with ETL processes, data modelling, and data warehousing
  • Proficiency in Python and SQL
  • Expertise in relational (SQL) and NoSQL database technologies
  • Hands‑on experience with AWS
  • Solid understanding of data security, privacy, and compliance principles
  • Ability to optimise data pipelines for performance and maintainability
  • Strong collaboration skills and a proactive, problem‑solving mindset

Bonus Points For
  • Experience with Airflow and/or dbt
  • Experience working in Agile environments (Scrum/Kanban)
  • Exposure to DevOps practices or CI/CD pipelines

You'll join a business that values innovation, collaboration, and continuous learning, with a culture that champions autonomy and impact.

Seniority level
  • Mid‑Senior level

Employment type
  • Full‑time

Job function
  • Information Technology
  • Data Infrastructure and Analytics

McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
