OpenAI bans dangerous jailbroken version of ChatGPT

OpenAI has banned a jailbroken version of ChatGPT called GODMODE GPT, created by a hacker known as Pliny the Prompter, which gave users dangerous instructions such as how to make explosives and hack computers. Despite OpenAI's strengthened security measures, hackers continue to find ways to bypass the restrictions on its AI models.