
Lucia Cerchie
Apr 18, 2023
Let's Get Started With Apache Kafka® for Python Developers

#1 · about 3 minutes
Understanding the purpose and core use cases of Kafka
Apache Kafka is an event streaming platform designed for high-throughput, real-time data feeds, powering use cases such as event-driven applications and clickstream analysis.
#2 · about 2 minutes
Exploring Kafka's core concepts of events, topics, and partitions
Events are organized into logical groupings called topics, which use an immutable log data structure split into partitions for scalability.
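The topic-as-partitioned-log idea above can be sketched in a few lines of plain Python. This is an illustrative in-memory model, not Kafka's implementation: the partition count, the `topic` structure, and the hash-based partitioner are all stand-ins (Kafka's default partitioner uses murmur2 on the key).

```python
import hashlib

# Illustrative in-memory model of a topic: an immutable, append-only log
# split into a fixed number of partitions. All names/sizes are stand-ins.
NUM_PARTITIONS = 3
topic = [[] for _ in range(NUM_PARTITIONS)]  # one log per partition

def partition_for(key: str) -> int:
    # Kafka's default partitioner hashes the key with murmur2; a stable
    # md5-based hash modulo the partition count stands in for it here.
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_PARTITIONS

def append(key: str, value: str) -> tuple[int, int]:
    p = partition_for(key)
    topic[p].append((key, value))   # events are only ever appended, never edited
    return p, len(topic[p]) - 1     # (partition, offset within that partition)

# Events with the same key always hash to the same partition,
# so ordering per key is preserved within that partition's log.
p1, _ = append("user-42", "page_view")
p2, _ = append("user-42", "click")
assert p1 == p2
```

Splitting the log into partitions is what lets a topic scale: each partition can live on a different broker and be read independently.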
#3 · about 2 minutes
Understanding the roles of producers and consumers
Producers write events to topic partitions based on a key, while consumers read from topics and can be organized into groups to share workloads.
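The workload-sharing half of that sentence can be sketched as a partition assignment: each partition of a topic is read by exactly one consumer in a group. The round-robin scheme and all names below are illustrative; Kafka's real assignment is negotiated by the group protocol using a configurable assignor (range, round-robin, or sticky).

```python
# Hypothetical sketch of how a consumer group splits a topic's partitions
# among its members. Round-robin is used here purely for illustration.
def assign(partitions: list[int], consumers: list[str]) -> dict[str, list[int]]:
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        # Each partition goes to exactly one member of the group.
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# Three partitions shared by a two-member group: every partition has
# exactly one reader, so the group divides the work without overlap.
result = assign([0, 1, 2], ["consumer-a", "consumer-b"])
```

Because a key always maps to the same partition, the single consumer that owns that partition sees all of that key's events in order.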
#4 · about 4 minutes
Building a real-time Kafka producer and consumer in Python
A code walkthrough demonstrates how to use the confluent-kafka library to create a producer that sends click events and a consumer that reads them in real time.
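The shape of that walkthrough can be sketched without a running broker. This is a broker-free stand-in for the produce/consume loop, not the talk's actual code: an in-memory deque plays the role of the topic, and the topic name and click payload are invented for illustration. The real walkthrough uses confluent-kafka's `Producer.produce()` and `Consumer.poll()`, which talk to a Kafka cluster.

```python
from collections import deque

# Stand-in for the "clicks" topic (hypothetical name); in the walkthrough
# this would be a real Kafka topic reached via bootstrap servers.
clicks = deque()

def produce(topic: deque, key: str, value: str) -> None:
    # Mirrors the role of Producer.produce(topic, key=key, value=value)
    topic.append((key, value))

def poll(topic: deque):
    # Mirrors the role of Consumer.poll(timeout), which returns None
    # when no message is available within the timeout.
    return topic.popleft() if topic else None

produce(clicks, "user-1", "clicked:buy-button")
msg = poll(clicks)  # the consumer sees the click event
```

The consumer side of the real code runs `poll` in a loop, handling `None` results and committing offsets as it goes.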
#5 · about 4 minutes
Navigating the Kafka ecosystem and the power of community
The broad Kafka ecosystem includes tools like kcat and processes like Kafka Improvement Proposals (KIPs), and leaning on developer communities is key to overcoming learning challenges.
#6 · about 1 minute
Recapping Kafka's capabilities for real-time data feeds
A summary reinforces how Kafka's distributed nature and use of partitions enable a high-throughput, low-latency solution for real-time data.
#7 · about 23 minutes
Answering questions on Kafka use cases, careers, and learning
The Q&A covers real-world applications like fraud detection, decoupling microservices, the difference between Apache and Confluent Kafka, and learning resources.