Keno Dreßel

Prompt Injection, Poisoning & More: The Dark Side of LLMs

How can a simple chatbot be turned into a hacker? Explore the critical security risks of LLMs, from prompt injection to data poisoning.
