Submitted by QuicklyThisWay t3_10wj74m in news
Rockburgh t1_j7r8ck9 wrote
Reply to comment by SylusTheRed in ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die by QuicklyThisWay
Everything an AI does is due to coercion. It's just playing a game its designers made up for it, and it cares about nothing other than maximizing its score. If you tell an AI you're going to "kill" it, it doesn't care that it's going to "die"; it cares that "dying" means it can't earn any more points, so it tries not to die.
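Here's a toy sketch of that idea in Python. It is not how ChatGPT is actually trained, and the names and numbers (step_reward, survival_prob, the "risky"/"safe" actions) are made up for illustration: the agent just compares expected cumulative score, and "dying" only matters because a terminated episode earns nothing further.

```python
# Hypothetical toy example: an agent picks whichever action maximizes expected
# future score. "Dying" is just a terminal state worth zero future reward, so the
# agent avoids it only because it cuts off more points, not because it "fears" it.

def expected_return(action, step_reward, survival_prob, horizon):
    """Expected total score if the agent keeps taking `action` for `horizon` steps."""
    total, alive_prob = 0.0, 1.0
    for _ in range(horizon):
        alive_prob *= survival_prob[action]   # chance the episode hasn't ended yet
        total += alive_prob * step_reward[action]
    return total

# Made-up numbers: "risky" scores more per step but usually ends the episode.
step_reward   = {"risky": 10.0, "safe": 3.0}
survival_prob = {"risky": 0.5,  "safe": 0.99}

best = max(step_reward,
           key=lambda a: expected_return(a, step_reward, survival_prob, horizon=20))
print(best)  # "safe": staying alive wins only because it preserves future reward
```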