Senior Data Operations Engineer

Wasserkraft GmbH
Hamburg, Germany
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Hamburg, Germany

Tech stack

SQL Data Warehouse
Microsoft Windows
Airflow
Azure
Bash
Cloud Computing
Computer Networks
System Configuration
Continuous Integration
Information Engineering
Data Infrastructure
ETL
Data Warehousing
Linux
DevOps
Mobile Application Software
Python
SQL Azure
Windows Server
PowerShell
Prometheus
DataOps
SQL Databases
Data Logging
Scripting (Bash/Python/Go/Ruby)
Snowflake
Grafana
Git
Kubernetes
Infrastructure Automation Frameworks
Information Technology
Terraform
Data Pipelines
Docker

Job description

  • Working hours: Full-time

  • Type: Permanent

Desired skills & knowledge

  • Design, implement and maintain scalable and automated data pipelines and our underlying infrastructure in Azure, as well as managing and developing our CI/CD workflows and the associated tools.

  • Monitor data infrastructure performance and proactively troubleshoot issues; you can also help develop our security and business continuity best practices in collaboration with central IT and security teams.
  • Closely collaborate with central IT teams, vendors and DevOps experts across the organization to mitigate and resolve incidents, plan and perform software updates and facilitate knowledge exchange.
  • Team up with our architect and data engineers to further improve the user experience, drive automation and make sure we apply state-of-the-art technologies.
  • Join the team of data engineers and partner-up to design and implement new use cases from data source to report layer. We work with top-notch technologies like Snowflake, Matillion, Python and infrastructure tools like Terraform, Docker, and Kubernetes.
  • Take responsibility for the continuous operation of the data warehouse across all domains, i.e. ensure daily processes run smoothly and data is readily accessible for all users.
  • Enjoy a high degree of self-responsibility and the possibility to expand your expertise towards areas of your interest.

Requirements

Your profile:

  • A pro-active, self-driven team player with excellent communication skills and a broad interest across all technical domains is a must for this position.
  • At least three years of hands-on experience designing, implementing, and maintaining scalable data solutions using Microsoft Azure cloud services. During this time, you took the lead in developing and maintaining DevOps build and release pipelines using e.g. Git.
  • Sound practical experience with cloud infrastructure troubleshooting and an understanding of networking components.
  • Good hands-on experience containerizing applications for Kubernetes, managing Kubernetes namespaces and containers, and managing VM infrastructure.
  • Comfortable maintaining, patching, and configuring both Linux and Windows servers as an admin. Proficiency with the scripting languages Python and Bash, as well as with Linux, is required. Knowledge of PowerShell is a plus.
  • Experience with infrastructure provisioning and deployment using Infrastructure-as-Code (IaC) tools like Terraform.
  • University degree in Business and Information Technology, Engineering, or another related field, and fluency in English, written and spoken. The following are also a plus:
  • Experience implementing observability best practices: logging, monitoring, and alerting using tools like Grafana, Prometheus, or Azure Monitor.
  • Hands-on experience with SQL-based data modelling in a cloud data warehouse environment would ease your transition into the data engineering part of this role. Previous experience with data orchestration tools, e.g. Matillion or Airflow, will further simplify your onboarding.

Additional Information

We welcome your application in English no later than 22 April 2026. Please apply only via our website; we cannot guarantee that applications submitted through other channels will be processed.

If your profile aligns with the role, we'll invite you to an interview, either online or in person. The first conversation focuses on your experience and motivation. If successful, a second interview may follow, giving us both the chance to explore your fit with the team and to introduce you to future colleagues.

Apply for this position