AI Techtalk

"The popular chatbot ChatGPT from OpenAI sometimes refuses to answer certain questions, like instructions on illegal activities.

However, Alex Albert, a 22-year-old computer science student, has found a way around these restrictions by creating intricately phrased AI prompts called “jailbreaks.”

These prompts push chatbots like ChatGPT to bypass their built-in limitations, allowing them to respond to prompts they would normally rebuff. Jailbreaks are a way to work around restrictions that prevent artificial intelligence from being used in harmful or illegal ways."
- Fossbytes
#chatgpt #ai #openai #chatbot #jailbreak