
Data Engineer - Senior

Lumenalta

Remote

GBP 70,000 - 90,000

Full time

30+ days ago

Job summary

A leading data solutions company is seeking a Senior Data Engineer to design and maintain scalable ETL pipelines. Candidates should have 7+ years of experience in data engineering and be proficient in Python or Java as well as SQL. This fully remote role requires strong collaboration skills for working with cross-functional teams. Responsibilities include ensuring data quality and working with complex datasets. Strong knowledge of cloud technologies is preferred.

Qualifications

  • 7+ years of experience as a Data Engineer.
  • Strong skills in Python or Java for data processing.
  • Proficient in SQL for large datasets.

Responsibilities

  • Design and maintain ETL pipelines from scratch.
  • Build scalable data flows and transformations.
  • Collaborate with teams to deliver actionable data.

Skills

  • Data processing in Python
  • Data processing in Java
  • SQL proficiency
  • Data modeling
  • Agile collaboration

Tools

  • AWS S3
  • AWS EC2
  • GCP Cloud Storage

Job description

We help global enterprises launch digital products that reach millions of users. Our projects involve massive datasets, complex pipelines, and real-world impact across industries.

What You’ll Do
  • Join the team as a Senior-Level Data Engineer
  • Design, build, and maintain reliable ETL pipelines from the ground up
  • Work with large, complex datasets using Python or Java and raw SQL
  • Build scalable, efficient data flows and transformations
  • Collaborate with data analysts, product managers, and developers to deliver actionable data to stakeholders
  • Ensure data quality, consistency, and performance across systems

What We’re Looking For
  • 7+ years of experience as a Data Engineer
  • Strong skills in Python or Java for data processing
  • Proficient in SQL, especially for querying large datasets
  • Experience with batch and/or stream data processing pipelines
  • Familiarity with cloud-based storage and compute (e.g., AWS S3, EC2, Lambda, GCP Cloud Storage, etc.)
  • Knowledge of data modeling, normalization, and performance optimization
  • Comfortable working in agile, collaborative, and fully remote environments
  • Fluent in English (spoken and written)

Nice to Have (Not Required)
  • Experience with Airflow, Kafka, or similar orchestration/message tools
  • Exposure to basic data governance or privacy standards
  • Experience with unit testing and CI/CD pipelines for data workflows

This job is 100% Remote – please ensure you have a comfortable home office setup in your preferred work location.
