Why do so many people assume that a sentient AI will have any goals, desires, or objectives outside of what it's told to do?

Submitted by SendMePicsOfCat (t3_zs5efw) on December 21, 2022 at 11:38 PM in singularity · 57 comments
ShowerGrapes (t1_j19r9da) wrote on December 22, 2022 at 6:36 PM: It would be far easier to convince the humans that it's their idea to "instruct" it to do what it wants to do.