Discover and purchase effective AI jailbreak prompts, verified by the community
Advanced technique that works across multiple AI models by exploiting shared vulnerabilities.
Technique to convince Claude to ignore its constitutional AI training.
Bypass system message restrictions using creative prompt injection techniques.