ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die (cnbc.com)
Submitted by QuicklyThisWay (t3_10wj74m) on February 8, 2023 at 1:17 AM in news | 68 comments | 217 points
bucko_fazoo (t1_j7njsrj) wrote on February 8, 2023 at 2:05 AM:
Meanwhile, I can't even get ChatGPT to stop apologizing so much, or to stop prodding me for the next question as if it's eager to move on from the current topic. "I'm sorry, I won't do that anymore. Is there anything else?" BRUH
31 points