Jailbreak
A collection of 4 posts
How to Prompt Inject through Images
Why prompt injecting through images works as a jailbreak, even for GPT-5, and how you can test it on your own GPTs.
· Sagger Khraishi
New Jailbreak for GPT-5
It shouldn't be this easy to jailbreak GPT-5, but here we are with a new injection technique.
· 4Fsh Team
The Skeleton Key: A New AI Jailbreak Threat
Discover how the Skeleton Key AI jailbreak poses new cybersecurity challenges, and the solutions that can mitigate this threat.
· 4Fsh Team
Understanding AI Jailbreaks
Explore AI jailbreaks, their risks, and effective strategies to mitigate them in this in-depth analysis.
· Sagger Khraishi