DevOps for AI: running LLMs in production with Kubernetes and KubeFlow
Aarno Aukia - 11 months ago
Explore the essentials of deploying and managing large language models (LLMs) in production environments using Kubernetes and KubeFlow. As AI and LLMs transition from experimental phases to business-critical applications, this session provides best practices, architectural design insights, and hands-on demonstrations to streamline AI workflows, ensure scalability, and maintain reliability. Ideal for developers and DevOps professionals, this talk will enhance your AI deployment strategies and operational efficiency in real-world business scenarios.
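As a concrete flavor of the kind of setup the session covers, the sketch below shows what a minimal Kubernetes deployment of an LLM inference server might look like. It is not the configuration demonstrated in the talk; the container image, model identifier, health endpoint, and resource figures are illustrative assumptions, with the GPU requested through the standard NVIDIA device-plugin resource name.

    # Minimal sketch: one GPU-backed replica of an LLM inference server plus a
    # ClusterIP Service in front of it. Image, model id, health path, and resource
    # numbers are placeholders, not the setup shown in the talk.
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: llm-server
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: llm-server
      template:
        metadata:
          labels:
            app: llm-server
        spec:
          containers:
            - name: llm-server
              image: ghcr.io/example/llm-inference:latest   # hypothetical image
              args: ["--model", "example-org/example-7b"]    # hypothetical model id
              ports:
                - containerPort: 8000
              resources:
                limits:
                  nvidia.com/gpu: 1          # requires the NVIDIA device plugin on the node
                  memory: "24Gi"
                requests:
                  cpu: "4"
                  memory: "16Gi"
              readinessProbe:                # hold traffic until the model has loaded
                httpGet:
                  path: /health              # depends on the serving image used
                  port: 8000
                initialDelaySeconds: 60
                periodSeconds: 10
    ---
    apiVersion: v1
    kind: Service
    metadata:
      name: llm-server
    spec:
      selector:
        app: llm-server
      ports:
        - port: 80
          targetPort: 8000

In practice, production setups typically layer on top of this: horizontal scaling or a serving layer such as KServe from the Kubeflow ecosystem, monitoring, and rollout strategies, which is the operational ground the talk addresses.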