Data Engineer

Wheely

Greater London

On-site

GBP 60,000 - 80,000

Full time

Job summary

A technology platform is seeking a Data Engineer to enhance their Data Team in West London. The role requires over 3 years of experience in data infrastructure, proficiency in SQL and Python, and familiarity with tools such as Docker and Kafka. Responsibilities include optimizing data integration pipelines and enforcing code quality. Employees enjoy flexible working hours, stock options, and various perks like life insurance and cycle-to-work schemes.

Benefits

Flexible working hours
Stock options
Life and critical illness insurance
Monthly credit for Wheely journeys
Cycle to work scheme
Top-notch equipment
Relocation allowance

Qualifications

  • 3+ years of experience in Data Engineering or related roles.
  • Fluent in SQL and Python.
  • Intermediate level of English required.
  • Experience with open-source technologies in data ingestion and modeling.

Responsibilities

  • Enhance the Data team with architectural best practices.
  • Support evolving data integration pipelines.
  • Enforce code quality and automated testing.

Skills

Data Infrastructure Engineering
SQL
Python
Docker
Kafka
Airflow
Snowflake
Data Modeling

Education

Technical university degree

Tools

Debezium
MLflow
Metabase
Text-2-SQL
GitOps
k8s

Job description

Wheely is not a traditional ride-hailing company. We are building a platform with user privacy at its core while successfully scaling a five-star service to millions of rides across multiple cities.

We are looking for a Data Engineer to strengthen the Data Team at Wheely, proactively providing Business Users and Data Scientists with a best-in-class, seamless data experience.

Responsibilities
  • Enhance the Data team with architectural best practices and low-level optimizations
  • Support and evolve data integration pipelines (Debezium, Kafka, dlt), data modelling (dbt), database engines (Snowflake), MLOps (Airflow, MLflow), BI reporting (Metabase, Observable, Text-2-SQL), and reverse ETL syncs (Census)
  • Support business units with feature requests, bug fixes, and data quality issues
  • Enforce code quality, automated testing and code style
Requirements
  • 3+ years of experience in Data Infrastructure Engineer / Data Engineer / MLOps Engineer roles;
  • Have hands-on or troubleshooting experience in the following areas:
    • Analytical Databases: configuration, troubleshooting (Snowflake, Redshift, BigQuery)
    • Data Pipelines: deployment, configuration, monitoring (Kafka, Airflow or similar)
    • Data Modeling: DRY and structured approach, applying performance tuning techniques
    • Containerizing applications and code: Docker, k8s
  • Fluent in SQL and Python;
  • At least an intermediate level of English;
  • Have experience in researching and integrating open-source technologies (data ingestion, data modelling, BI reporting, LLM applications, etc.);
  • Ability to identify performance bottlenecks;
  • Teamwork: GitOps, continuous integration, code reviews;
  • Technical university graduate.
What We Offer

Wheely expects the very best from our people, both on the road and in the office. In return, employees enjoy flexible working hours, stock options and an exceptional range of perks and benefits.

  • Office-based role located in West London
  • Competitive salary & equity package
  • Life and critical illness insurance
  • Monthly credit for Wheely journeys
  • Cycle to work scheme
  • Top-notch equipment
  • Relocation allowance (dependent on role level)
  • Wheely has an in‑person culture but allows flexible working hours and work from home when needed.
