
Data Engineer

Kolayo

City of London

Hybrid

GBP 30,000 - 50,000

Full time

30+ days ago


Job summary

A technology firm in Tower Hill is seeking a Data Engineer to design and maintain scalable databases and applications. This entry-level role focuses on SQL database management and Python scripting, offering the chance to work on complex projects collaboratively. The ideal candidate will have over 2 years of experience and thrive in a hybrid work environment.

Qualifications

  • 2+ years’ experience as a data engineer.
  • Extensive experience with SQL databases including optimisation and schema design.
  • Proficiency with Python for backend services and data processing.

Responsibilities

  • Design, develop and maintain efficient, scalable databases.
  • Build full‑stack web applications, ensuring performance optimisation.
  • Create RESTful APIs and ensure smooth database integration.

Skills

SQL databases
Python
Complex problem solving
Data processing

Tools

PostgreSQL
MySQL
Git
Docker

Job description

Location: Hybrid (London, Tower Hill 3 days a week)

Employment Type: Full-time

About Us:

At Kolayo, we’re dedicated to building innovative, data‑driven solutions that empower businesses and organisations to make smarter, faster decisions. We specialise in developing cutting‑edge technologies, and having recently secured funding to accelerate growth, we are looking for a highly skilled Data Engineer with expertise in SQL, Python and database management. Join us in making an impact while growing your career in a fast‑paced and collaborative environment.

Our offices are based in a WeWork space by Tower Bridge, overlooking St Katharine Docks. It offers barista coffee all day, after‑work drinks and plenty of coworking space in addition to our own office.

Role Overview

We are looking for a Data Engineer who has a strong foundation in developing scalable and efficient database systems and applications. In this role, you’ll mostly work on the backend of our models and applications, focusing on database architecture, client integration and data processing. As part of our growing team, you’ll have the opportunity to work on complex data‑driven projects, building solutions that involve SQL database management, Python scripting and modern web technologies, collaborating closely with both co‑founders.

Key Responsibilities

  • Design, develop and maintain efficient, scalable databases (SQL), ensuring data integrity and performance optimisation.
  • Build and maintain full‑stack web applications, including both front‑end (React) and back‑end (Python) components.
  • Write and optimise SQL queries for data retrieval, reporting and transformation.
  • Develop robust Python scripts and services to handle data processing, ETL tasks and database interactions.
  • Collaborate with co‑founders to define database schema and data pipelines, ensuring seamless integration with the application.
  • Implement data security and performance best practices to ensure the integrity and speed of both the application and databases.
  • Create RESTful APIs and ensure smooth communication between the database and front‑end.

Required Skills & Experience

  • 2+ years’ experience as a data engineer.
  • Extensive experience with SQL databases (PostgreSQL, MySQL, or similar), including query optimisation, schema design and data modelling.
  • Proficiency with Python for backend services, data processing and integration tasks.
  • Experience building and consuming RESTful APIs to connect front‑end and back‑end components.
  • Strong understanding of database architecture, indexing and query performance optimisation.
  • Ability to work with large datasets and complex queries while ensuring data consistency across the system.
  • Familiarity with version control systems, particularly Git.
  • Strong analytical and troubleshooting skills, with the ability to resolve complex data‑related issues.

Nice to Have

  • Familiarity with the Azure cloud platform, dbt, Snowflake and Dagster
  • Familiarity with Linux and Docker
  • Experience working with companies in the Tech & Telecom industry
  • Knowledge of data pipeline technologies (ETL, Apache Kafka, etc.)

Seniority level

Entry level

Employment type

Full‑time

Job function

Information Technology

Industries

Data Infrastructure and Analytics
