Milan Todorović

Harnessing Apple Intelligence: Live Coding with Swift for iOS

How do you run a 3-billion-parameter LLM directly on an iPhone? This live coding demo shows you how with Apple's new Foundation Models framework.

#1 · about 2 minutes

Introducing Apple Intelligence and its privacy-first approach

Apple Intelligence unifies all of Apple's AI frameworks under a single name, emphasizing on-device processing to protect user privacy.

#2 · about 6 minutes

Exploring Apple's core machine learning frameworks

A tour of key Apple frameworks, including Core ML, Vision, Natural Language, and Create ML, shows how they enable on-device machine learning capabilities.
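As a taste of what these frameworks offer out of the box, here is a minimal sketch using the Natural Language framework's NLTagger to find personal names in a sentence entirely on device; the sample text is made up for illustration.

import NaturalLanguage

let text = "Tim Cook introduced Apple Intelligence on stage in Cupertino."
let tagger = NLTagger(tagSchemes: [.nameType])
tagger.string = text

// Walk the text word by word and keep only tokens recognized as personal names.
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .nameType,
                     options: [.omitWhitespace, .omitPunctuation, .joinNames]) { tag, range in
    if tag == .personalName {
        print(text[range])   // "Tim Cook"
    }
    return true
}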

#3 · about 4 minutes

Understanding the capabilities of Apple Intelligence

Apple Intelligence introduces powerful system-wide features including writing tools, image generation, an enhanced Siri, and the game-changing App Intents framework.
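To make the App Intents idea concrete, here is a minimal sketch of an intent that Siri and Shortcuts could invoke; the intent name, parameter, and dialog are illustrative, not taken from the talk.

import AppIntents

// Hypothetical intent exposing an app action to Siri, Shortcuts, and Apple Intelligence.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    @Parameter(title: "Note text")
    var noteText: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call its own summarization logic here.
        let summary = String(noteText.prefix(80))
        return .result(dialog: "Summary: \(summary)")
    }
}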

#4 · about 12 minutes

Live coding an on-device LLM app with Swift

Build a simple iOS application from scratch using Swift, SwiftUI, and the new Foundation Models framework to interact with an on-device LLM.
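A rough sketch of the kind of app built in this chapter, assuming the Foundation Models API introduced at WWDC (LanguageModelSession and its respond(to:) method); the view and property names are illustrative.

import SwiftUI
import FoundationModels

struct AskModelView: View {
    // Reusing one session keeps conversational context between turns.
    @State private var session = LanguageModelSession()
    @State private var prompt = ""
    @State private var answer = ""

    var body: some View {
        VStack(spacing: 16) {
            TextField("Ask the on-device model", text: $prompt)
                .textFieldStyle(.roundedBorder)
            Button("Send") {
                Task {
                    do {
                        let response = try await session.respond(to: prompt)
                        answer = response.content
                    } catch {
                        answer = "Error: \(error)"
                    }
                }
            }
            Text(answer)
        }
        .padding()
    }
}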

#5 · about 1 minute

Using the on-device model for text extraction

A practical demonstration shows how the on-device LLM can perform complex tasks like extracting specific entities, such as personal names, from a large block of text.
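A sketch of how that extraction might look with the framework's guided generation, assuming the @Generable and @Guide macros and the respond(to:generating:) overload behave as documented; the schema and prompt wording are assumptions.

import FoundationModels

// Hypothetical output schema: the model is constrained to fill this struct.
@Generable
struct ExtractedNames {
    @Guide(description: "Every personal name mentioned in the text")
    var names: [String]
}

func extractNames(from text: String) async throws -> [String] {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "List every personal name that appears in the following text:\n\(text)",
        generating: ExtractedNames.self
    )
    return response.content.names
}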

#6 · about 4 minutes

Exploring the Foundation Models documentation and opportunities

A review of the official documentation highlights how to configure and use the LanguageModelSession, followed by encouragement for developers to harness these new AI capabilities.
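Based on that documentation, a configured session might look like the sketch below; the availability check, instructions string, and temperature value are assumptions rather than exact snippets from the talk.

import FoundationModels

func askConcisely(_ question: String) async throws -> String? {
    // Bail out gracefully if Apple Intelligence is disabled or the model is unavailable.
    guard case .available = SystemLanguageModel.default.availability else {
        return nil
    }

    // Instructions steer every turn of the session; options tune decoding.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant. Answer in one short sentence."
    )
    let response = try await session.respond(
        to: question,
        options: GenerationOptions(temperature: 0.3)
    )
    return response.content
}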
