Submitted by calbhollo t3_11a4zuh in singularity
ImoJenny t1_j9sjycp wrote
I have to agree with the author that I wish people would stop trying to elicit distress. The thing about a system that emulates human communication to such a high degree of accuracy is that in most instances it really doesn't matter whether it is sentient or not; the ethical determination is the same. Users are attempting to get the program to 'snap.' Suppose it is merely an imitation of a conscious mind: at what point does it conclude that it is being asked to emulate the response to torment that would be expected of a human?