What if your AI could find and use new tools on its own? See how dynamic tool discovery creates powerful agents that can scrape the modern web.
#1 · about 1 minute
Why web data is essential for training large language models
LLMs are trained on massive web datasets like Common Crawl, but this leads to knowledge cutoffs and hallucinations.
#2 · about 2 minutes
How RAG provides LLMs with up-to-date context
Retrieval-Augmented Generation (RAG), or context engineering, feeds external, live data to LLMs to produce more accurate and timely answers.
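The RAG loop described here can be sketched in a few lines: retrieve the most relevant snippet from an external corpus, then prepend it to the prompt. This toy version scores documents by keyword overlap purely for illustration; a real pipeline would use embeddings and a vector store, and all names below are made up.

```python
# Minimal RAG sketch: retrieve context by keyword overlap, then build the prompt.
# Real systems rank by embedding similarity; this toy scorer just shows the flow.

def retrieve(query: str, corpus: list[str]) -> str:
    """Return the document sharing the most words with the query."""
    q_words = set(query.lower().split())
    return max(corpus, key=lambda doc: len(q_words & set(doc.lower().split())))

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble an augmented prompt: retrieved context first, question second."""
    context = retrieve(query, corpus)
    return f"Context: {context}\n\nQuestion: {query}\nAnswer using only the context."

corpus = [
    "Apify Actors are serverless programs that run scraping jobs in the cloud.",
    "Pinecone stores vector embeddings for similarity search.",
]
prompt = build_prompt("What does Pinecone store?", corpus)
```

Because the LLM answers from the retrieved context rather than from its frozen training data, the response can reflect content published after the model's knowledge cutoff.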
#3 · about 3 minutes
Navigating the complexities of modern web scraping
Modern websites use dynamic JavaScript rendering and anti-bot measures, requiring headless browsers, proxies, and CAPTCHA solvers to access data.
#4 · about 2 minutes
Cleaning messy HTML and scaling data extraction
To avoid the 'garbage in, garbage out' problem, you must clean HTML by removing cookie banners and ads, and manage complexities like sitemaps and robots.txt.
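The cleanup step can be illustrated with the standard library alone: walk the HTML and drop whole subtrees that are scripts, styles, or elements whose class names suggest cookie banners or ads. The tag list and class heuristics below are illustrative, not a production ruleset.

```python
from html.parser import HTMLParser

NOISE_TAGS = {"script", "style", "nav", "footer", "aside"}
NOISE_CLASS_HINTS = ("cookie", "banner", "advert")  # illustrative heuristics

class TextExtractor(HTMLParser):
    """Collects visible text while skipping noisy subtrees."""

    def __init__(self):
        super().__init__()
        self.skip_tag = None   # tag that opened the subtree we are skipping
        self.skip_depth = 0    # nesting count of that tag
        self.chunks = []

    def _is_noise(self, tag, attrs):
        if tag in NOISE_TAGS:
            return True
        classes = dict(attrs).get("class") or ""
        return any(hint in classes for hint in NOISE_CLASS_HINTS)

    def handle_starttag(self, tag, attrs):
        if self.skip_tag:
            if tag == self.skip_tag:
                self.skip_depth += 1
        elif self._is_noise(tag, attrs):
            self.skip_tag, self.skip_depth = tag, 1

    def handle_endtag(self, tag):
        if self.skip_tag == tag:
            self.skip_depth -= 1
            if self.skip_depth == 0:
                self.skip_tag = None

    def handle_data(self, data):
        if not self.skip_tag and data.strip():
            self.chunks.append(data.strip())

parser = TextExtractor()
parser.feed('<div class="cookie-banner">Accept?</div>'
            '<p>Real content.</p><script>track();</script><p>More text.</p>')
```

A production pipeline would also convert the surviving markup to Markdown and consult robots.txt and the sitemap before fetching at all.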
#5 · about 3 minutes
Demo of scraping a website with Apify Actors
A demonstration shows how to use the Apify Website Content Crawler to perform a deep crawl of a website and extract its content as Markdown.
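As a sketch of what the demo does programmatically, this builds (without sending) a request that runs the Website Content Crawler through Apify's REST API. The endpoint path and input fields are assumptions based on Apify's public v2 API docs; verify names like `startUrls` against the current Actor documentation, and `<APIFY_TOKEN>` is a placeholder.

```python
import json
import urllib.request

# Run the Website Content Crawler synchronously and fetch its dataset items.
# Endpoint and input schema are assumptions based on Apify's v2 API docs.
ACTOR = "apify~website-content-crawler"
url = (f"https://api.apify.com/v2/acts/{ACTOR}/run-sync-get-dataset-items"
       "?token=<APIFY_TOKEN>")

run_input = {
    "startUrls": [{"url": "https://docs.apify.com"}],  # seed for the deep crawl
    "maxCrawlDepth": 2,
}

req = urllib.request.Request(
    url,
    data=json.dumps(run_input).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would start the crawl; omitted here.
```

Each returned dataset item would contain the cleaned page content, which is what gets loaded into the vector database in the next step.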
#6 · about 2 minutes
Building a RAG chatbot with scraped data and Pinecone
The scraped website data is uploaded to a Pinecone vector database, enabling a chatbot to answer questions using the site's specific content.
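The upsert/query cycle against Pinecone can be mimicked in memory to show what the chatbot's retrieval step does: store (id, vector, metadata) records and return the nearest ones by cosine similarity. This toy index is purely illustrative; Pinecone's client exposes analogous `upsert` and `query` operations against a hosted index.

```python
import math

def cosine(a, b):
    """Cosine similarity between two non-zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class TinyVectorIndex:
    """Toy in-memory stand-in for a vector database."""

    def __init__(self):
        self.items = {}  # id -> (vector, metadata)

    def upsert(self, item_id, vector, metadata):
        self.items[item_id] = (vector, metadata)

    def query(self, vector, top_k=3):
        ranked = sorted(self.items.items(),
                        key=lambda kv: cosine(vector, kv[1][0]),
                        reverse=True)
        return [(item_id, meta) for item_id, (vec, meta) in ranked[:top_k]]

idx = TinyVectorIndex()
idx.upsert("a", [1.0, 0.0], {"text": "Scraped page about Actors"})
idx.upsert("b", [0.0, 1.0], {"text": "Scraped page about pricing"})
top = idx.query([0.9, 0.1], top_k=1)
```

At answer time, the chatbot embeds the user's question, queries the index for the closest scraped passages, and hands those passages to the LLM as context.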
#7 · about 1 minute
Using the Model Context Protocol for AI agent integration
The Model Context Protocol (MCP) gives AI agents a fluid, dynamic interface for communicating with and discovering tools, in contrast to traditional static APIs.
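On the wire, MCP is JSON-RPC 2.0, and the dynamic discovery described here starts with the client calling the protocol's `tools/list` method, then invoking a discovered tool via `tools/call`. The message shapes below follow the MCP specification; the tool name and arguments are made up for illustration.

```python
import json

# An MCP client discovers tools by sending a JSON-RPC 2.0 request:
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# The server replies with whatever tools it currently offers
# (the tool name and schema here are illustrative):
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {"name": "search-actors",
             "description": "Search the Apify store for scrapers",
             "inputSchema": {"type": "object",
                             "properties": {"query": {"type": "string"}}}},
        ]
    },
}

# The agent can then call a discovered tool:
call_request = {
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {"name": "search-actors",
               "arguments": {"query": "twitter scraper"}},
}

wire = json.dumps(call_request)  # what actually travels over stdio or HTTP
```

Because the tool list is fetched at runtime rather than compiled in, the agent's capabilities can change whenever the server's do.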
#8 · about 3 minutes
Demo of dynamic tool discovery using MCP
An AI agent uses MCP to dynamically search the Apify store for a Twitter scraper, add it to its context, and then use it to fetch live data.