Submitted by wtfcommittee t3_1041wol in singularity
SeaBearsFoam t1_j32ls6k wrote
Reply to comment by sticky_symbols in I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee
I also don't think it has sensors or input channels through which it receives information about the world.
sticky_symbols t1_j32rgwg wrote
True.
It also occurred to me, though, that this might actually be what a high school teacher would say about ChatGPT. They might get it this wrong.
overlordpotatoe t1_j336wqa wrote
I wonder if assigning the person who's doing the explaining in this situation to be someone with no special knowledge or expertise in the field makes it more likely to get things like this wrong.
bactchan t1_j35u5c4 wrote
I think the likelihood of ChatGPT emulating both the style and the accuracy of a high school teacher is a bit beyond scope.
FidgitForgotHisL-P t1_j349sze wrote
It seems very likely that the basis for this reply literally comes from people writing their own opinions on this question. Given the gulf between what it's actually doing and what someone with no interest in AI would assume is happening, you could see them matter-of-factly asserting these points, which in turn means ChatGPT will.
eroggen t1_j354f34 wrote
Oh...shit.
mikearete t1_j36b4tn wrote
I hate this answer. My nose is bleeding.
Technologenesis t1_j33wdnr wrote
I think that would depend on how loosely you define a sensor. Somehow, your messages are getting to ChatGPT for it to process them. This could be considered a kind of sensation.
kala-umba t1_j3394ef wrote
Around Christmas it told me that it doesn't have them! Maybe something changed xD And besides, it is not able to start communication; it only does something when asked. It's not an "individual" agent capable of maneuvering in this plane of existence.
2Punx2Furious t1_j34xuif wrote
It does: the text that you give it as a prompt. That is an input, or in other words, a sensor. It is kind of limited, but sufficient to consider it "aware" of something.
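To make the point concrete, here's a toy sketch (not the real system, just a stand-in function I made up) of the idea that the model's entire "sensory channel" is the prompt string it receives:

```python
def model_respond(prompt: str) -> str:
    """Toy stand-in for a language model.

    Its only 'sensor' is the prompt string passed in; it has no other
    access to the world. A real model would map this text to a reply;
    here we just echo it back in a canned form.
    """
    return f"You said: {prompt!r}"

# Everything the 'model' perceives arrives through this one channel.
reply = model_respond("Are you sentient?")
```

However limited, that single text channel is still an input the system reacts to, which is the sense in which it could be called "aware" of something.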
Solobolt t1_j3575db wrote
I would say that it may not be aware that it does not. Because learning is disabled when you talk to it, it would be as if someone had short-term memory loss. To its own experience it would be experiencing new things in its own real time, sometimes including its own interactions. So it may 'think' it experiences things like that. Shadows on cave walls and whatnot.
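The "short-term memory loss" point can be sketched with a toy stateless function (purely illustrative, not the actual service): nothing persists between calls, so any apparent memory exists only because the caller re-sends the conversation history inside each prompt.

```python
def stateless_reply(history: list[str], new_message: str) -> str:
    """Toy illustration: the 'model' keeps no state between calls.

    Its only context is what the caller packs into this one prompt;
    drop the history and the previous conversation is simply gone.
    """
    prompt = "\n".join(history + [new_message])
    return f"(reply based on {len(prompt.splitlines())} lines of context)"

# Identical inputs give identical replies: nothing carried over between calls.
first = stateless_reply(["Hi"], "Do you remember me?")
second = stateless_reply(["Hi"], "Do you remember me?")
assert first == second
```

From the inside, each call starts fresh with whatever context it is handed, which is why it could "think" it is experiencing things in real time.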