Echo Chamber, Storytelling Prompts Used to Jailbreak GPT-5 in 24 Hours

Monday, August 11, 2025 4:46 PM | darkreading
Researchers paired the Echo Chamber jailbreaking technique with storytelling in an attack flow that used no overtly inappropriate language, guiding the LLM into producing directions for making a Molotov cocktail.