Senior DevOps Engineer

Biprocsi

Greater London

Hybrid

GBP 60,000 - 80,000

Full time

Today

Job summary

A consultancy firm in Greater London is seeking an experienced DevOps Engineer to design, implement, and manage cloud infrastructures for various clients. This remote-first position requires strong expertise in AWS, Azure, or GCP, along with excellent programming skills. The successful candidate will automate deployment processes and ensure the effective processing of data. Strong communication skills and a commitment to data privacy are essential, alongside a passion for innovative solutions.

Benefits

Flexible working model
Focus on work-life balance
Continuous professional development

Qualifications

  • Proven experience delivering client projects in DevOps.
  • Hands-on expertise in AWS, Azure, or GCP environments.
  • Strong programming skills in Python, Bash, or Go.

Responsibilities

  • Design and implement cloud infrastructures that are secure and scalable.
  • Automate deployment processes using CI/CD pipelines.
  • Implement data processing frameworks per project needs.

Skills

DevOps Engineering
Cloud Platforms (AWS, Azure, GCP)
Python
Bash scripting
Automation
CI/CD (Jenkins, GitLab CI/CD)
Data Privacy Regulations
Apache Kafka
Spark
Git

Tools

Terraform
Apache Airflow

Job description

Full Time: Permanent

Remote-first, with 2 days at the office per month (Holborn, London)

Overview

We are seeking a highly experienced DevOps Engineer with a strong background in at least one major cloud platform (AWS, Azure, or GCP) and a proven track record delivering complex data and analytics solutions for clients. In this permanent role, you will design, implement, and manage the infrastructure and deployment processes that underpin successful client engagements.

You will work as part of a consultancy team, ensuring that each client engagement benefits from a robust, scalable, and secure cloud environment.

Responsibilities
  • Design and implement scalable, reliable cloud infrastructures (AWS, Azure, or GCP), tailored to the specific requirements of each client engagement, ensuring performance, availability, security, and cost efficiency.
  • Work closely with client stakeholders, full‑stack developers, data engineers, and data scientists to define and execute efficient data ingestion, processing, and storage solutions that meet project deliverables.
  • Implement and automate client‑specific deployment processes using CI/CD pipelines and configuration management tools, enabling rapid and reliable software releases in a consultancy environment.
  • Develop processes around release management, testing, and automation to ensure successful project delivery, adhering to client timelines and quality standards.
  • Implement and manage real‑time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs.
  • Build and maintain robust monitoring, logging, and alerting systems for client projects, ensuring system health and performance are continuously optimised and cost‑efficient.
  • Ensure each client's project complies with data privacy regulations by implementing appropriate access controls and data encryption measures.
  • Troubleshoot and resolve complex technical challenges related to infrastructure, data pipelines, and overall application performance during client engagements.
  • Stay current with industry trends and best practices across DevOps, cloud engineering, platform automation, and data infrastructure to provide modern, forward‑thinking solutions to clients.
Experience & Qualifications
  • Proven experience as a DevOps Engineer/Consultant with a history of successful client project delivery.
  • Hands‑on expertise with at least one major cloud provider (AWS, Azure, or GCP). Experience across multiple is highly beneficial.
  • Strong programming and scripting skills in languages like Python, Bash, or Go to automate tasks and build necessary tools.
  • Expertise in designing and optimising data pipelines using frameworks like Apache Airflow or equivalent.
  • Demonstrated experience with real‑time and batch data processing frameworks, including Apache Kafka, Apache Spark, or Google Cloud Dataflow.
  • Proficiency in CI/CD tools such as Jenkins, GitLab CI/CD, or Cloud Build, along with a strong command of version control systems like Git.
  • Solid understanding of data privacy regulations and experience implementing robust security measures.
  • Familiarity with infrastructure as code tools such as Terraform or Deployment Manager.
  • Excellent problem‑solving and analytical skills, with the ability to architect and troubleshoot complex systems across diverse client projects.
  • Strong communication skills, enabling effective collaboration with both technical and non‑technical client stakeholders.
Why BIPROCSI

We started this company with one goal: to be the very best. We don't just believe our team is our biggest asset; we know it. We're a group of passionate innovators (nerds, really), obsessed with personal growth, who believe in challenging the status quo to ensure we come up with the best solutions.

We have a phenomenal culture and unparalleled drive, and every single person on our team is carefully selected to make sure we maintain this. We are diverse, and we celebrate that. We are whole people, with families, hobbies, and lives outside of work, and we make sure we keep a healthy work-life balance.

We are rapidly expanding and on a growth trajectory. We are continuously hiring at all levels across Business Intelligence, Analytics, Data Warehousing, Data Science and Data Engineering.

Our Mission Statement
“To be the benchmark for Excellence and Quality of Service in everything we do.”

For more information, please visit our website - www.biprocsi.co.uk
