Christian Weyer

Semantic AI: Why Embeddings Might Matter More Than LLMs

Are we too focused on LLMs? This talk argues that embeddings are the true foundation of modern AI, enabling powerful, deterministic systems for retrieval and routing.

#1 · about 1 minute

Moving beyond hype with real-world generative AI

An internal company tool serves as a practical case study for applying language and embedding models to solve real business problems.

#2 · about 3 minutes

Integrating AI with existing enterprise data sources

The system combines API-based data from a third-party planning tool with document-based data from a Git-based knowledge base.

#3 · about 4 minutes

Building language-enabled universal interfaces for software

Instead of extending traditional GUIs, a universal interface lets users interact with systems in natural language through channels such as Slack or voice.

#4 · about 3 minutes

Demonstrating a multi-system AI chat interface

A live demo shows how a single chat interface can query both a knowledge base and an employee availability system, providing source links to verify information.

#5 · about 3 minutes

Contrasting language models and embedding models

Language models are non-deterministic and generative, while embedding models are deterministic and create vector representations for comparison and retrieval.
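
A minimal sketch of that contrast on the embedding side, assuming sentence-transformers and the all-MiniLM-L6-v2 model as stand-ins for whatever the talk actually uses: the same text always maps to the same vector, so similarity scores are reproducible.

```python
# Deterministic embeddings: identical input always yields the identical vector,
# so the similarity score below is the same on every run.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

a = model.encode("Who is available next sprint?", normalize_embeddings=True)
b = model.encode("Which colleagues are free in the coming iteration?", normalize_embeddings=True)

# Cosine similarity of the two vectors, usable for comparison and retrieval.
score = util.cos_sim(a, b).item()
print(f"semantic similarity: {score:.3f}")
```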

#6 · about 4 minutes

Implementing retrieval-augmented generation for documents

The RAG pattern uses embeddings and a vector database to find relevant document chunks to provide as context for an LLM's answer.
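
An in-memory sketch of the retrieval step, assuming sentence-transformers for embeddings; a real setup would use a vector database and finish by sending the assembled prompt to an LLM, which is omitted here.

```python
# RAG retrieval sketch: embed document chunks, find the closest ones to the
# question, and build a context-grounded prompt for the language model.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

# Chunks as they might come out of a Git-based knowledge base (illustrative).
chunks = [
    "Vacation requests must be approved by the team lead.",
    "The planning tool exposes employee availability via a REST API.",
    "Architecture decision records live in the docs/adr folder.",
]
chunk_vectors = model.encode(chunks, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k chunks whose embeddings are closest to the question."""
    q = model.encode(question, normalize_embeddings=True)
    scores = chunk_vectors @ q          # cosine similarity (vectors are normalized)
    top = np.argsort(scores)[::-1][:k]
    return [chunks[i] for i in top]

question = "Where do we keep our ADRs?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be passed to the LLM
```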

#7 · about 4 minutes

Using LLMs for structured data and API calls

By providing a technical schema in the prompt, a language model can be constrained to produce structured, machine-readable output that downstream APIs can consume reliably.
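
A hedged sketch of that idea: the schema is embedded in the prompt, the reply is parsed as JSON, and only then handed to the backend API. The OpenAI client, model name, and schema fields are assumptions for illustration, not what the talk used.

```python
# Structured output sketch: put a JSON schema in the prompt and parse the
# model's reply before calling the planning tool's API with it.
import json
from openai import OpenAI

client = OpenAI()

SCHEMA = """{
  "type": "object",
  "properties": {
    "employee": {"type": "string"},
    "week": {"type": "string", "description": "ISO week, e.g. 2024-W21"}
  },
  "required": ["employee", "week"]
}"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model
    messages=[
        {"role": "system",
         "content": f"Extract the availability query as JSON matching this schema:\n{SCHEMA}\n"
                    "Respond with JSON only."},
        {"role": "user", "content": "Is Anna free in week 21 of 2024?"},
    ],
    response_format={"type": "json_object"},
)

args = json.loads(response.choices[0].message.content)
# args can now be passed to the availability API (hypothetical endpoint).
print(args["employee"], args["week"])
```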

#8 · about 4 minutes

How semantic routing directs user queries

Semantic routing uses embeddings to classify a user's intent by finding the closest cluster of example questions, directing the request to the correct backend system.
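
A minimal sketch of semantic routing along those lines: each backend gets a handful of example questions, and an incoming query is routed to whichever cluster it matches best. Route names, examples, and the embedding model are illustrative assumptions.

```python
# Semantic routing sketch: compare a query against example questions per route
# and pick the route with the highest similarity.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

ROUTES = {
    "knowledge_base": [
        "Where is the onboarding documentation?",
        "What does our architecture guideline say about logging?",
    ],
    "availability_api": [
        "Who is free next week?",
        "Is Anna booked on a project in June?",
    ],
}

# One embedding per example question, grouped by route.
route_vectors = {
    name: model.encode(examples, normalize_embeddings=True)
    for name, examples in ROUTES.items()
}

def route(query: str) -> str:
    """Return the route whose example questions are closest to the query."""
    q = model.encode(query, normalize_embeddings=True)
    best, best_score = None, -1.0
    for name, vectors in route_vectors.items():
        score = float(np.max(vectors @ q))  # best match within the cluster
        if score > best_score:
            best, best_score = name, score
    return best

print(route("Which colleagues are available for the next sprint?"))  # likely availability_api
```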

#9 · about 1 minute

Why embeddings are the foundation of AI systems

Embeddings are crucial not just within LLMs but also for encoding meaning and enabling core architectural patterns like semantic routing and guarding.
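
A brief sketch of semantic guarding, the routing pattern's sibling mentioned above: if a query is not similar enough to any allowed topic, it is rejected before an LLM is ever called. The topic list, threshold, and embedding model are illustrative guesses, not values from the talk.

```python
# Semantic guard sketch: accept a query only if it is close enough to one of
# the topics the assistant is supposed to handle.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

ALLOWED_TOPICS = [
    "employee availability and project planning",
    "internal documentation and knowledge base content",
]
topic_vectors = model.encode(ALLOWED_TOPICS, normalize_embeddings=True)

def is_in_scope(query: str, threshold: float = 0.3) -> bool:
    """Accept the query only if it is similar enough to an allowed topic."""
    q = model.encode(query, normalize_embeddings=True)
    return bool(util.cos_sim(topic_vectors, q).max() >= threshold)

print(is_in_scope("Who is free next sprint?"))       # likely True
print(is_in_scope("Write me a poem about pirates"))  # likely False
```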
