Senior Data Engineer (contract)

Dubizzle Limited

Ledbury

On-site

GBP 60,000 - 80,000

Full time

30+ days ago

Job summary

A leading IT services consultancy in the UK is seeking a Senior Data Engineer to develop and manage data pipelines, focusing on hybrid cloud data solutions. Candidates should have strong Python skills, experience with ETL processes, and expertise in Azure technologies. The role is on-site and full-time, offering an opportunity to empower business leaders through actionable data insights.

Qualifications

  • Expertise in Python for scripting and automating data processes.
  • Experience in developing and optimizing ETL/ELT workflows.
  • Knowledge of Azure cloud services and integration with on-premises infrastructure.
  • Experience with containerization and orchestration using Docker and Kubernetes.
  • Practical experience with Azure Data Factory.
  • Ability to manage event streaming platforms like Kafka.
  • Experience with implementing security practices across platforms.

Responsibilities

  • Develop and manage data pipelines for streaming and batch processing.
  • Implement and manage hybrid cloud and data storage solutions.
  • Utilize Docker and Kubernetes for application deployment.
  • Automate data flows and manage workflows with Azure Data Factory.
  • Use event-driven technologies to handle real-time data streams.
  • Manage security setups and access controls for data integrity.
  • Design and develop PostgreSQL databases, ensuring high performance and availability.

Skills

  • Python
  • ETL/ELT
  • Hybrid Cloud Data Architecture
  • Docker
  • Kubernetes
  • Azure Data Factory
  • Event Streaming
  • Data Security
  • Elasticsearch
  • PostgreSQL

Job description
Senior Data Engineer

On-site • Full time

Methods Business and Digital Technology Limited

Methods is a £100M+ IT Services Consultancy that has partnered with a range of central government departments and agencies to transform the way the public sector operates in the UK. Established over 30 years ago and UK-based, we apply our skills in transformation, delivery, and collaboration from across the Methods Group to create end-to-end business and technical solutions that are people-centred, safe, and designed for the future.

Our human touch sets us apart from other consultancies, system integrators and software houses - with people, technology, and data at the heart of who we are, we believe in creating value and sustainability through everything we do for our clients, staff, communities, and the planet.

We support our clients in the success of their projects while working collaboratively to share skill sets and solve problems. At Methods we have fun while working hard; we are not afraid of making mistakes and learning from them.

Predominantly focused on the public sector, Methods is now building a significant private sector client portfolio.

Methods was acquired by the Alten Group in early 2022.

Requirements

On-site, Full time.

This role will require you to have ACTIVE Security Clearance, with a willingness to move to DV.

Your Responsibilities
  • Develop and Manage Data Pipelines: design, construct, and maintain efficient and reliable data pipelines using Python/Go/Azure Data Factory, capable of supporting both streaming and batch data processing across structured, semi-structured, and unstructured data in on-premises and Azure environments.
  • Hybrid Cloud and Data Storage Solutions: implement and manage data storage solutions leveraging both on-premises infrastructure and Azure, ensuring seamless data integration and accessibility across platforms.
  • Containerisation and Orchestration: utilise Docker for containerisation and Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments.
  • Workflow Automation: employ tools such as Azure Data Factory to automate data flows and manage complex workflows within hybrid environments.
  • Event Streaming Experience: utilise event-driven technologies such as Kafka and NATS to handle real-time data streams effectively.
  • Security and Compliance: manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms.
  • Database Development: design and develop PostgreSQL databases, ensuring high performance and availability across diverse deployment scenarios.
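As a concrete illustration of the first responsibility, a batch pipeline step in Python typically follows an extract-transform-load shape. The sketch below is illustrative only: the record fields, transform rules, and in-memory source are invented, and a production pipeline would instead read from on-premises or Azure sources and route bad rows to a dead-letter store.

```python
from dataclasses import dataclass

# Hypothetical record shape -- invented for illustration.
@dataclass
class Event:
    user_id: str
    amount: float

def extract(raw_rows):
    """Extract: pull raw dicts from a source (stubbed as a list here)."""
    return list(raw_rows)

def transform(rows):
    """Transform: drop malformed rows and normalise field types."""
    events = []
    for row in rows:
        try:
            events.append(Event(user_id=str(row["user_id"]),
                                amount=float(row["amount"])))
        except (KeyError, TypeError, ValueError):
            continue  # in production, send to a dead-letter store instead
    return events

def load(events, store):
    """Load: aggregate per user into the target store."""
    for e in events:
        store[e.user_id] = store.get(e.user_id, 0.0) + e.amount
    return store

raw = [{"user_id": 1, "amount": "9.50"},
       {"user_id": 2, "amount": "bad"},   # malformed, dropped
       {"user_id": 1, "amount": 0.5}]
totals = load(transform(extract(raw)), {})
print(totals)  # {'1': 10.0}
```

The same extract/transform/load split maps onto Azure Data Factory activities when the orchestration moves out of hand-written scripts.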
Essential Skills and Experience
  • Strong Python Skills: expertise in Python for scripting and automating data processes across varied environments.
  • Experience with ETL/ELT: demonstrable experience in developing and optimising ETL or ELT workflows, particularly in hybrid (on-premises and Azure) environments.
  • Expertise in Hybrid Cloud Data Architecture: knowledge of integrating on-premises infrastructure with Azure cloud services.
  • Containerisation and Orchestration Expertise: solid experience with Docker, GitHub and Kubernetes in managing applications across both on-premises and cloud platforms.
  • Proficiency in Workflow Automation Tools: practical experience with Azure Data Factory in hybrid environments.
  • Experience in Event Streaming: proven ability in managing and deploying event streaming platforms like Kafka and NATS.
  • Data Security Knowledge: experience with implementing security practices and tools, including Keycloak, across multiple platforms.
  • Search and Database Development Skills: strong background in managing Elasticsearch and PostgreSQL in environments that span on-premises and cloud infrastructures.
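The event-streaming skill above can be pictured with a toy topic dispatcher. A real deployment would consume from Kafka or NATS via their client libraries; here the "stream" is a plain list so the sketch stays self-contained, and the topic name and payload shape are invented.

```python
import json

# Handler registry: topic name -> list of callbacks.
handlers = {}

def subscribe(topic):
    """Register a function as a handler for one topic."""
    def register(fn):
        handlers.setdefault(topic, []).append(fn)
        return fn
    return register

def dispatch(messages):
    """Route each (topic, raw_payload) pair to its handlers,
    the way a Kafka/NATS consumer loop would."""
    for topic, raw in messages:
        for fn in handlers.get(topic, []):
            fn(json.loads(raw))

seen = []

@subscribe("orders.created")  # hypothetical topic name
def on_order(event):
    seen.append(event["order_id"])

dispatch([("orders.created", '{"order_id": 17}'),
          ("orders.other", '{}')])   # no handler, silently skipped
print(seen)  # [17]
```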
Your Impact

In this role, you will empower business leaders to make informed decisions by delivering timely, accurate, and actionable data insights from a robust, hybrid infrastructure. Your expertise will drive the seamless integration of on-premises and cloud-based data solutions, enhancing both the flexibility and scalability of our data operations. You will champion the adoption of modern data architectures and tooling, and play a pivotal role in cultivating a data-driven culture within the organisation, mentoring team members, and advancing our engineering practices.

Desirable Skills and Experience
  • Certifications in Azure and Other Relevant Technologies: certifications in cloud and on-premises technologies are highly beneficial and will strengthen your application.
  • Experience in Data Engineering: a minimum of 5 years of experience in data engineering, with significant exposure to managing infrastructure in both on-premises and cloud settings.
  • Some DevOps Engineering experience would be preferable.