DevOps for AI: running LLMs in production with Kubernetes and KubeFlow
Aarno Aukia - 9 months ago
Explore the essentials of deploying and managing large language models (LLMs) in production environments using Kubernetes and KubeFlow. As AI and LLMs transition from experimental phases to business-critical applications, this session provides best practices, architectural design insights, and hands-on demonstrations to streamline AI workflows, ensure scalability, and maintain reliability. Ideal for developers and DevOps professionals, this talk will enhance your AI deployment strategies and operational efficiency in real-world business scenarios.
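As a minimal sketch of the kind of deployment the talk covers, the following Kubernetes manifest serves an LLM behind an OpenAI-compatible HTTP endpoint. This is an illustrative assumption, not material from the talk: the image (`vllm/vllm-openai`), model name, port, and GPU request are all placeholder choices.

```yaml
# Illustrative sketch (not from the talk): one GPU-backed replica of an
# OpenAI-compatible LLM server, exposed inside the cluster via a Service.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: llm-server
  template:
    metadata:
      labels:
        app: llm-server
    spec:
      containers:
        - name: vllm
          image: vllm/vllm-openai:latest          # assumed serving image
          args: ["--model", "mistralai/Mistral-7B-Instruct-v0.2"]  # assumed model
          ports:
            - containerPort: 8000
          resources:
            limits:
              nvidia.com/gpu: 1                   # one GPU per replica
---
apiVersion: v1
kind: Service
metadata:
  name: llm-server
spec:
  selector:
    app: llm-server
  ports:
    - port: 80
      targetPort: 8000
```

In practice a setup like this would sit behind an Ingress or gateway, with autoscaling and model storage handled separately; the talk's KubeFlow tooling addresses those workflow concerns.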