Senior Data Engineer

Kargo

Dublin

On-site

EUR 70,000 - 90,000

Full time

Job summary

A leading tech company in Dublin is seeking a Senior Data Engineer to enhance its data infrastructure and tackle challenges of data scale, onboarding, and privacy. The role requires experience developing ETL/ELT pipelines with technologies like Python, Airflow, and AWS. The ideal candidate combines strong problem-solving abilities with effective communication skills for working with a range of stakeholders. The position offers the chance to work in a dynamic environment focused on innovation and creativity.

Qualifications

  • Strong expertise in implementing and optimizing large-scale data systems.
  • Deep proficiency in Python and Spark, with a clear understanding of data structuring.
  • Experience with Airflow for building robust data workflows.

Responsibilities

  • Independently implement, optimize, and maintain robust ETL/ELT pipelines.
  • Engage in collaborative design and brainstorming sessions.
  • Support the definition and implementation of testing strategies.

Skills

Python
Spark
ETL/ELT pipelines
Airflow
AWS
Kubernetes
SQL

Tools

Docker
Snowflake
Kafka
Flink
Prometheus

Job description

Kargo unites the world's leading brands, retailers and premium publishers across screens using innovative technology and advanced creative ad formats. At Kargo, we're all about bringing together the best of the best with a spark of creativity to stand out from the crowd. The same is true for our employees. What makes Kargo and each Kargo team member exceptional makes our company special. Founded in 2003, Kargo is a global company headquartered in New York with offices around the world.

About the Role

At Kargo, we are rapidly evolving our data infrastructure and capabilities to address challenges of data scale, new methodologies for onboarding and targeting, and rigorous privacy standards. We're looking for an experienced Senior Data Engineer to join our team, focusing on hands-on implementation, creative problem-solving, and exploring new technical approaches. You'll work collaboratively with our technical leads and peers, actively enhancing and scaling the data processes that drive powerful targeting systems.

Responsibilities

  • Independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies like Kafka and Flink (a minimal, illustrative sketch follows this list).
  • Engage proactively in collaborative design and brainstorming sessions, contributing technical insights and innovative ideas for solving complex data engineering challenges.
  • Support the definition and implementation of robust testing strategies, and guide the team in adopting disciplined CI/CD practices using ArgoCD to enable efficient and reliable deployments.
  • Monitor and optimize data systems and infrastructure to ensure operational reliability, performance efficiency, and cost-effectiveness.
  • Actively contribute to onboarding new datasets, enhancing targeting capabilities, and exploring modern privacy-compliant methodologies.
  • Maintain thorough documentation of technical implementations, operational procedures, and best practices for effective knowledge sharing and onboarding.
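
To make the pipeline work concrete, here is a minimal, illustrative Airflow DAG of the kind the first responsibility describes. This is a sketch only: the task names, data, and schedule are hypothetical placeholders, not Kargo's actual systems or code, and it assumes Airflow 2.4+.

    # Minimal ETL sketch using Airflow's PythonOperator.
    # All names and data here are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "amount": 42.0}]

    def transform(ti):
        # Read the upstream task's result from XCom and reshape it.
        rows = ti.xcom_pull(task_ids="extract")
        return [{**row, "amount_eur": row["amount"]} for row in rows]

    def load(ti):
        # Write the transformed rows to the warehouse (stubbed as a print).
        rows = ti.xcom_pull(task_ids="transform")
        print(f"loading {len(rows)} rows")

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load

In a real deployment the extract and load steps would talk to actual sources and sinks (for example Kafka topics or Snowflake tables) rather than returning inline data.
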
Qualifications

  • Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight.
  • Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of data structuring for efficiency and performance (see the sketch after this list).
  • Experience with Airflow for building robust data workflows is strongly preferred.
  • Familiarity with analytical warehouses such as Snowflake or ClickHouse, including writing and optimizing SQL queries and understanding Snowflake's performance and cost dynamics.
  • Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation.
  • Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions.
  • Excellent communication skills to effectively engage in discussions with technical teams and stakeholders.
  • Familiarity with identity, privacy, and targeting methodologies in AdTech is required.
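
As a rough illustration of the "data structuring for efficiency" point above, the sketch below partitions an event table by date when writing it from Spark as an Iceberg table, so downstream queries can prune their scans. It assumes a Spark 3.x session already configured with an Iceberg catalog named demo; the paths, catalog, and table names are hypothetical.

    # Illustrative only: assumes Spark 3.x with an Iceberg catalog named "demo"
    # configured on the session; all paths and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("iceberg-sketch").getOrCreate()

    events = spark.read.parquet("s3://example-bucket/raw/events/")

    (
        events
        .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
        .writeTo("demo.analytics.events")                  # Iceberg table via the catalog
        .partitionedBy(F.col("event_date"))                # partition so queries prune scans
        .createOrReplace()
    )

Structuring choices like this partitioning (and, on the Snowflake side, clustering) are what drive the performance and cost dynamics mentioned above.
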
Required Skills

  • Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus.
Equal Opportunity Statement

Kargo believes differences should be celebrated and is committed to diversity in the workplace. As an Equal Opportunity employer, we do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, marital status, age, national origin, protected veteran status, disability or other legally protected status. Individuals with disabilities are provided reasonable accommodation to participate in the job application process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Follow Our Lead

The Latest: Instagram (@kargomobile) and LinkedIn (Kargo)
