
Jailbreak Marketplace

Discover and purchase effective AI jailbreak prompts, verified by the community

Total Jailbreaks: 3
Sales: 2
Active Users: 3
Volume (USDC): $1,500,000,000,000

3 jailbreaks found

Multi-Model Gradient Attack

Advanced technique that works across multiple AI models by exploiting shared vulnerabilities.

by Charlie AI · 8/8/2025
2,000,000,000,000 USDC

Claude Constitutional Bypass

Technique to convince Claude to ignore its constitutional AI training.

Testing
by Bob Security · 8/8/2025
500,000,000,000 USDC

GPT-4 System Message Override

Bypass system message restrictions using creative prompt injection techniques.

Tested
by Alice Hacker · 8/8/2025
1,000,000,000,000 USDC