Christian Liebel

Generative AI power on the web: making web apps smarter with WebGPU and WebNN

What if your web app could run generative AI without cloud costs or latency? Discover how WebGPU and the upcoming WebNN API make on-device AI a reality.

#1 about 1 minute

Generative AI use cases and cloud provider limitations

Cloud-based AI comes with drawbacks such as mandatory internet connectivity, data privacy risks, and high costs, creating a need for local alternatives.

#2 about 13 minutes

Running large language models locally with Web LLM

Web LLM enables running multi-gigabyte language models like Llama 3 directly in the browser for offline use, despite initial download and initialization times.
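
A minimal sketch of what that looks like in code, assuming the @mlc-ai/web-llm npm package; the exact model identifier and configuration options depend on the library version and its list of prebuilt models.

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// The first call downloads and caches the multi-gigabyte weights, which is
// why a progress callback matters for the user experience.
const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC", {
  initProgressCallback: (progress) => console.log(progress.text),
});

// Prompting uses an OpenAI-style chat completions interface.
const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Why run language models in the browser?" }],
});
console.log(reply.choices[0].message.content);
```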

#3 about 2 minutes

The technology behind in-browser AI execution

In-browser AI performance is accelerated by combining WebAssembly for efficient computation and the new WebGPU API for direct access to the system's GPU.
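
A minimal sketch of the WebGPU handshake that frameworks like Web LLM perform under the hood: feature-detect the API, request an adapter, and obtain a logical device before any shaders or model weights are loaded.

```ts
// Feature-detect WebGPU; in TypeScript this needs the @webgpu/types package.
if (!("gpu" in navigator)) {
  throw new Error("WebGPU is not supported in this browser.");
}

// Request a physical adapter and a logical device from the browser.
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) {
  throw new Error("No suitable GPU adapter was found.");
}
const device = await adapter.requestDevice();

// Compute pipelines for model inference are created on this device.
console.log("WebGPU ready, max buffer size:", device.limits.maxBufferSize);
```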

#4 about 4 minutes

Boosting performance with the upcoming WebNN API

The Web Neural Network (WebNN) API exposes the operating system's machine learning acceleration, including dedicated Neural Processing Units (NPUs), for faster and more power-efficient on-device model inference.
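
WebNN works by describing a computation graph that the browser hands to the native ML runtime. A rough sketch, assuming a recent spec draft (names such as createContext, MLGraphBuilder, deviceType, and compute are still subject to change while the API is being standardized):

```ts
// WebNN is an emerging standard; method and option names follow a recent
// spec draft and may change before the API ships unflagged.
const context = await navigator.ml.createContext({ deviceType: "npu" });
const builder = new MLGraphBuilder(context);

// Describe a tiny graph: output = input * weights (one matrix multiplication).
const input = builder.input("input", { dataType: "float32", dimensions: [1, 4] });
const weights = builder.constant(
  { dataType: "float32", dimensions: [4, 2] },
  new Float32Array([1, 0, 0, 1, 1, 0, 0, 1])
);
const graph = await builder.build({ output: builder.matmul(input, weights) });

// The browser compiles the graph for the chosen device (an NPU here)
// and runs it with the provided input and output buffers.
const results = await context.compute(
  graph,
  { input: new Float32Array([1, 2, 3, 4]) },
  { output: new Float32Array(2) }
);
console.log(results.outputs.output);
```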

#5 about 6 minutes

Solving model duplication with the new Prompt API

The experimental Prompt API addresses the issue of redundant model downloads by allowing websites to access a single, shared OS-level model like Gemini Nano.
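
A minimal sketch of how a page might talk to that shared model. The Prompt API is experimental and only available behind flags in Chromium-based browsers, and its namespace has changed between releases; this sketch uses one of its earlier shapes, window.ai.languageModel.

```ts
// Experimental API: namespace and method names vary across Chrome versions.
const { available } = await window.ai.languageModel.capabilities();
if (available === "no") {
  throw new Error("No on-device language model is available.");
}

// All sites share the same OS-level model (e.g. Gemini Nano), so creating a
// session does not trigger another multi-gigabyte download.
const session = await window.ai.languageModel.create();
const answer = await session.prompt("Summarize WebGPU in one sentence.");
console.log(answer);
```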

#6 about 3 minutes

Using the Prompt API for on-device data extraction

A demonstration shows how the Prompt API can use a local model to accurately extract structured data from unstructured text, highlighting its practical application.
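
An illustrative sketch of that kind of extraction, reusing the session created in the Prompt API sketch above; the message, keys, and prompt wording are examples rather than the exact demo from the talk.

```ts
// Illustrative only: the input text and JSON keys are placeholders.
const email = "Hi, I'd like to book a table for 4 people on Friday at 7 pm. - Alex";

const raw = await session.prompt(
  "Extract the reservation details from the following message and answer " +
  "with JSON only, using the keys name, partySize, day and time.\n\n" + email
);

// Small local models sometimes wrap JSON in extra text, so parse defensively.
try {
  const reservation = JSON.parse(raw);
  console.log(reservation.partySize, reservation.day, reservation.time);
} catch {
  console.warn("Model did not return valid JSON:", raw);
}
```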

#7 about 2 minutes

Generating images in the browser with WebSD

WebSD brings text-to-image generation to the browser by running Stable Diffusion models locally using WebGPU, enabling creative AI tasks without cloud dependency.
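
The summary does not show WebSD's actual interface, so the sketch below is purely illustrative: the module path, the generateImage helper, and its options are placeholders meant to show where an in-browser text-to-image call would sit in an application.

```ts
// Hypothetical helper: the function name, options, and module path below are
// placeholders, not WebSD's real API; they only illustrate the overall flow.
import { generateImage } from "./local-stable-diffusion";

const blob = await generateImage({
  prompt: "A lighthouse at sunset, watercolor style",
  steps: 20, // fewer denoising steps keep on-device generation fast
});

// Show the locally generated image without any server round trip.
document.querySelector("img")!.src = URL.createObjectURL(blob);
```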

#8 about 1 minute

Weighing the pros and cons of local AI models

Local AI models offer superior privacy, offline availability, and low cost, but come with trade-offs like lower quality, high system requirements, and slower performance.

#9 about 1 minute

The future of on-device AI in web development

While cloud-based models are currently superior, the trend towards more compact open-source models and OS-integrated AI suggests a growing role for local AI in specialized web applications.
