Job Search and Career Advice Platform

Data Engineer

Viridiengroup

Crawley

Hybrid

GBP 45,000 - 65,000

Full time

30+ days ago

Job summary

A dynamic technology company in Crawley is seeking a Data Engineer to develop their software solution aimed at transforming complex data challenges. This role involves building data platforms, developing flexible data transformation frameworks, and collaborating with cross-functional teams. Candidates should have significant experience with Python, SQL, and ETL tools, particularly Airflow, and be familiar with database systems like PostgreSQL and NoSQL. The position supports hybrid working arrangements.

Benefits

Discounts on restaurants
Cinema ticket discounts
Tech and travel discounts

Qualifications

  • Experience designing and maintaining data transformations in a product setting.
  • Ability to write secure, performant code in Python and SQL.
  • Experience using orchestrators and data pipelines.

Responsibilities

  • Develop the data platform infrastructure including orchestration systems.
  • Build robust data pipelines for metadata-driven solutions.
  • Collaborate with analysts and scientists to ensure data accessibility.

Skills

Python
SQL
ETL tools
Airflow
PostgreSQL
DevOps
DataOps

Tools

Docker
Kubernetes
Git
NoSQL databases

Job description

Job Summary

The Data Engineer plays an important role in the development of our software solution, used by our clients to help them with their complex data transformation challenges. Our system combines the latest ML-based techniques with logic-based transformation, overseen by domain experts, to provide innovative solutions to our clients. This role supports the development of the data system, focusing on orchestration, resilience, and scaling. Additionally, we aim to provide a framework on which our data transformation modules can be developed by a growing team of junior engineers and technical SMEs. The role may also support the implementation of the systems, including deployment and integration with clients' own data stores, processes, and workflows.

Team Description

Data Hub is a dynamic team of scientists and developers who love solving complex problems. We provide leading-edge technology solutions and services to solve our clients' data transformation and analytics challenges across a range of industries, including geothermal, environmental, hydrocarbon, and mineral exploration. You will work in an open and collaborative environment with opportunities to learn, grow, and develop. We have an informal team culture and believe work should be fun and rewarding. You will be based in one of our hub locations (Crawley or Llandudno), working alongside our teams of data engineers, machine learning engineers, software engineers, and subject matter experts. We offer hybrid working, and remote working can be considered.

Key Responsibilities

  • Contribute to the development of our data platform infrastructure, including our orchestration systems, data processing logic, and the interactions between system components.
  • Help develop a flexible framework for data transformations by creating a modular system where new transformation logic can be easily developed and integrated into our product offering.
  • Build robust data pipelines with a focus on dynamic, end-to-end, metadata-driven solutions that consider a wide range of implications, such as downstream application/UI data access patterns, maintainability, monitoring, and access control.
  • Influence our choice of architecture and technology. You will be expected to communicate design ideas and solutions clearly through architectural diagrams and documentation to both technical and non-technical stakeholders.
  • Apply best practices in software and data engineering, writing secure, performant, and maintainable code (Python, SQL). You will have a keen eye for minimising technical debt and optimising performance where it matters.
  • Partner with data analysts, data scientists, and other end users to understand their requirements and ensure the platform and its data are accessible, reliable, and meet project delivery needs.
  • Share your work and best practices; collaborate with others; ensure that what we build, and how we build it, aligns with our ambition for growth.

Qualifications and Experience

  • Previous experience designing, building, and maintaining data transformations in a system or product setting.
  • Ability to write secure and performant code in Python and SQL.
  • Significant experience using orchestrators and ETL tools, especially Airflow.
  • Significant RDBMS experience (PostgreSQL, Oracle); experience with other database types, such as NoSQL (e.g. Neo4j, Elastic) or vector databases, is also beneficial.
  • Data architecture experience relating to data modelling, data warehousing, and schema design (3NF, dimensional modelling, medallion architecture).
  • Experience using Docker and version control (Git, GitLab), and knowledge of CI/CD.
  • Knowledge of DevOps and DataOps best practices.
  • Kubernetes deployment experience.
  • Previous experience building web applications, together with wide-ranging knowledge of web frameworks, HTTP, networking, security, etc.

Benefits Package

  • Discounts on nationwide restaurants, cinema tickets, and days out through our benefits platform.
  • Tech, travel, and fashion discounts, all available through our benefits platform.
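The "modular system where new transformation logic can be easily developed and integrated" described above can be sketched, purely as an illustration, with a small metadata-driven registry in Python. All names here (`register_transform`, `run_pipeline`, the example steps) are hypothetical and not part of the actual product:

```python
# Illustrative sketch only: a minimal metadata-driven transformation
# registry. New logic is added by registering a function; a pipeline
# is then just metadata listing which registered steps to run.
from typing import Callable, Dict, List

# Registry mapping step names to row-transforming functions.
TRANSFORMS: Dict[str, Callable[[List[dict]], List[dict]]] = {}

def register_transform(name: str):
    """Decorator so new transformation modules plug in without edits elsewhere."""
    def wrap(fn: Callable[[List[dict]], List[dict]]):
        TRANSFORMS[name] = fn
        return fn
    return wrap

@register_transform("drop_nulls")
def drop_nulls(rows: List[dict]) -> List[dict]:
    # Discard rows containing any null value.
    return [r for r in rows if all(v is not None for v in r.values())]

@register_transform("uppercase_name")
def uppercase_name(rows: List[dict]) -> List[dict]:
    # Normalise the "name" field to upper case.
    return [{**r, "name": r["name"].upper()} for r in rows]

def run_pipeline(rows: List[dict], metadata: dict) -> List[dict]:
    """Apply the transformations listed in the pipeline metadata, in order."""
    for step in metadata["steps"]:
        rows = TRANSFORMS[step](rows)
    return rows

rows = [{"name": "ada"}, {"name": None}]
meta = {"steps": ["drop_nulls", "uppercase_name"]}
print(run_pipeline(rows, meta))  # → [{'name': 'ADA'}]
```

In practice an orchestrator such as Airflow would schedule and monitor these steps; the registry pattern is simply one way the "easily developed and integrated" requirement could be met.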