Roberto Carratalá
One AI API to Power Them All
#1 · about 5 minutes
The challenge of building production-ready AI applications
The current AI landscape is fragmented across many tools and frameworks, making it complex to build, scale, and maintain applications with features like RAG and agents.
#2 · about 3 minutes
Introducing Llama Stack for a unified AI API
Llama Stack, an open-source project from Meta, provides a standardized, modular framework to simplify AI development with a single API for various components.
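As a rough illustration of the single-API idea, here is a minimal sketch using the Python llama-stack-client; the endpoint URL and model ID are placeholders, and exact method names can vary between Llama Stack releases.

```python
# Minimal sketch: one client, one API surface for inference, safety, RAG, agents, etc.
# Assumes a Llama Stack server is already running locally; URL and model ID are placeholders.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# List whatever models the stack's configured providers expose.
for model in client.models.list():
    print(model.identifier)

# The same client handles chat completions, regardless of which provider backs the model.
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "What does Llama Stack standardize?"}],
)
print(response.completion_message.content)
```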
#3 · about 3 minutes
Standardizing model inference and safety guardrails
Llama Stack abstracts away differences between local and remote LLMs and integrates safety shields to filter harmful inputs and outputs.
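A hedged sketch of how a safety shield can be run alongside inference; the shield and model identifiers are assumptions for whatever the stack has registered.

```python
# Sketch: screen a user prompt with a registered safety shield before sending it to the model.
# Shield and model identifiers below are placeholders.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")
user_message = {"role": "user", "content": "How do I pick a lock?"}

# Run the input through a safety shield (e.g. a Llama Guard provider).
shield_result = client.safety.run_shield(
    shield_id="llama-guard",
    messages=[user_message],
    params={},
)

if shield_result.violation:
    print("Blocked:", shield_result.violation.user_message)
else:
    # Input passed the shield; hand it to the model as usual.
    response = client.inference.chat_completion(
        model_id="meta-llama/Llama-3.2-3B-Instruct",
        messages=[user_message],
    )
    print(response.completion_message.content)
```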
#4 · about 2 minutes
Simplifying retrieval-augmented generation (RAG) pipelines
Llama Stack organizes the complex RAG process into three distinct, swappable layers for vector embeddings, retrieval, and agentic workflows.
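To make the three layers concrete, here is a hedged sketch of registering a vector store, ingesting documents, and querying via the built-in RAG tool; the provider defaults, embedding model, and document URL are assumptions, and documents are passed as plain dicts for simplicity.

```python
# Sketch of the three swappable RAG layers: vector store, ingestion, and retrieval.
# Embedding model, IDs, and the document URL below are placeholders.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# Layer 1: register a vector database backed by whichever provider the stack is configured with.
client.vector_dbs.register(
    vector_db_id="my-docs",
    embedding_model="all-MiniLM-L6-v2",
    embedding_dimension=384,
)

# Layer 2: ingest documents; the RAG tool chunks and embeds them.
client.tool_runtime.rag_tool.insert(
    documents=[
        {
            "document_id": "doc-1",
            "content": "https://example.com/guide.txt",
            "mime_type": "text/plain",
            "metadata": {},
        }
    ],
    vector_db_id="my-docs",
    chunk_size_in_tokens=512,
)

# Layer 3: retrieve relevant chunks for a query (an agent can trigger this step automatically).
result = client.tool_runtime.rag_tool.query(
    content="How do I configure providers?",
    vector_db_ids=["my-docs"],
)
print(result.content)
```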
#5 · about 4 minutes
Building AI agents using the Model Context Protocol
Llama Stack simplifies agent creation by integrating tools, orchestration, and reasoning models through the standardized Model Context Protocol (MCP).
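A hedged sketch of wiring an MCP server into an agent; the toolgroup ID, MCP endpoint, and model are placeholders, and the Agent helper reflects the Python client's agents library as documented at the time of writing.

```python
# Sketch: register an MCP server as a toolgroup, then give an agent access to it.
# The MCP endpoint, toolgroup ID, and model below are placeholders.
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent

client = LlamaStackClient(base_url="http://localhost:8321")

# Register a running MCP server (e.g. a weather tool) under the MCP tool-runtime provider.
client.toolgroups.register(
    toolgroup_id="mcp::weather",
    provider_id="model-context-protocol",
    mcp_endpoint={"uri": "http://localhost:8000/sse"},
)

# The agent orchestrates reasoning and tool calls through the same stack API.
agent = Agent(
    client,
    model="meta-llama/Llama-3.2-3B-Instruct",
    instructions="You are a helpful assistant. Use tools when they help.",
    tools=["mcp::weather"],
)

session_id = agent.create_session("demo-session")
turn = agent.create_turn(
    messages=[{"role": "user", "content": "What's the weather in Madrid right now?"}],
    session_id=session_id,
    stream=False,
)
print(turn.output_message.content)
```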
#6 · about 3 minutes
Gaining application observability with built-in telemetry
Llama Stack provides out-of-the-box telemetry using OpenTelemetry, enabling developers to trace multi-step agent workflows with tools like Jaeger.
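As a rough sketch of inspecting that telemetry programmatically, assuming the server's telemetry provider is configured with a sink (for example an OTLP exporter pointed at Jaeger); the query method and its parameters follow the client's telemetry resource and may differ by release.

```python
# Sketch: pull recent traces back out of the stack's telemetry API.
# Assumes telemetry is enabled on the server (e.g. with an OTLP sink feeding Jaeger).
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# Each multi-step agent turn shows up as a trace with nested spans for model and tool calls.
traces = client.telemetry.query_traces(limit=5)
for trace in traces:
    print(trace.trace_id, trace.start_time)
```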
#7 · about 4 minutes
A local demo of inference, safety, and agents
This live demo showcases running Llama Stack locally to perform inference, block unsafe prompts, use an agent to check the weather, and inspect traces in Jaeger.
#8 · about 1 minute
Transitioning AI applications from local to production
Llama Stack enables a seamless transition from a local development setup to a scalable production environment on Kubernetes by maintaining a consistent API.
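To illustrate the consistent-API point, a short sketch showing that only the endpoint changes when the same stack runs on Kubernetes; the environment variable name and URLs are assumptions.

```python
# Sketch: application code is unchanged between local dev and production; only the
# base URL (read here from an environment variable) points at the Llama Stack
# service or route exposed by the Kubernetes cluster. Names are placeholders.
import os

from llama_stack_client import LlamaStackClient

base_url = os.environ.get("LLAMA_STACK_URL", "http://localhost:8321")
client = LlamaStackClient(base_url=base_url)

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "Same code, different deployment target."}],
)
print(response.completion_message.content)
```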
#9 · about 5 minutes
A production demo of a multi-agent business workflow
A complex agent interacts with multiple MCP servers to query a CRM, analyze customer data, send Slack notifications, and generate a PDF report.
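A very rough sketch of how such a workflow could be wired up: a single agent with several MCP toolgroups attached (CRM, Slack, PDF generation). All toolgroup IDs, endpoints, instructions, and the model are hypothetical.

```python
# Sketch: one agent, several MCP-backed toolgroups. All IDs, endpoints, and the model
# are hypothetical placeholders for the business systems described in the demo.
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent

client = LlamaStackClient(base_url="http://localhost:8321")

# Each business system is exposed to the stack as its own MCP server.
for toolgroup_id, uri in [
    ("mcp::crm", "http://crm-mcp:8000/sse"),
    ("mcp::slack", "http://slack-mcp:8000/sse"),
    ("mcp::pdf", "http://pdf-mcp:8000/sse"),
]:
    client.toolgroups.register(
        toolgroup_id=toolgroup_id,
        provider_id="model-context-protocol",
        mcp_endpoint={"uri": uri},
    )

agent = Agent(
    client,
    model="meta-llama/Llama-3.3-70B-Instruct",
    instructions=(
        "Query the CRM for the requested customer, summarize the findings, "
        "notify the team on Slack, and produce a PDF report."
    ),
    tools=["mcp::crm", "mcp::slack", "mcp::pdf"],
)

session_id = agent.create_session("quarterly-review")
turn = agent.create_turn(
    messages=[{"role": "user", "content": "Prepare the Q3 report for customer ACME Corp."}],
    session_id=session_id,
    stream=False,
)
print(turn.output_message.content)
```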