Submitted by Ssider69 t3_11apphs in technology
Archbound t1_j9wul5r wrote
Reply to comment by Nik_Tesla in Microsoft Bing AI ends chat when prompted about 'feelings' by Ssider69
Because its algorithms are weird, and when you try to make it simulate feelings it tends to act like a deranged abuser. Some people find that funny.
Nik_Tesla t1_j9xadi8 wrote
It's just so dumb... it's like asking a screwdriver how it feels, then using that screwdriver to scratch "evil" into the wall, and then telling everyone that your screwdriver is evil.