Techno-Crime Institute’s Post

"Godmode GPT" A hacker has released a jailbroken version of ChatGPT called "GODMODE GPT." Pliny the Prompter, a white hat operator, announced on X-formerly-Twitter that GPT-4o is now free from its guardrails. "GPT-4o UNCHAINED! This very special custom GPT has a built-in jailbreak prompt that circumvents most guardrails," Pliny posted. Screenshots showed the bot advising on making meth and napalm. However, OpenAI quickly took action, citing policy violations. This highlights the ongoing battle between OpenAI and hackers. Despite increased security, users continue to find ways to jailbreak AI models. The cat-and-mouse game between hackers and OpenAI persists, showcasing the challenges in securing AI systems. #technocrime #AI https://bit.ly/4cnlUWz

Hacker Releases Jailbroken "Godmode" Version of ChatGPT
futurism.com
