Tuesday, 31 March 2026

OpenAI Launches Bug Bounty: $25K for Universal GPT-5 Jailbreak

By Isha
OpenAI has rolled out an invite-only bug bounty program for its GPT-5 model, offering a $25,000 reward to the first individual who develops a universal jailbreak prompt that bypasses moderation to answer all ten bio/chem safety questions from a clean chat session. A $10,000 prize goes to the first team achieving the feat with multiple prompts. Submissions are open now, with testing beginning September 9, 2025, all under strict NDA.
Read full story at Inkl
