Submitted by wtfcommittee t3_1041wol in singularity
gleamingthenewb t1_j3605ep wrote
Reply to comment by LarsPensjo in I asked ChatGPT if it is sentient, and I can't really argue with its point by wtfcommittee
Nope, that's just its prediction of what string of characters corresponds to your prompt.
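[Editor's note: for readers unfamiliar with the term, here is a minimal sketch of what "next-token prediction" means in practice. It uses the open GPT-2 model from the Hugging Face transformers library purely as a stand-in; ChatGPT's actual model and serving stack are not public, so this only illustrates the general mechanism being debated.]

    # Minimal sketch of next-token prediction with a small open causal LM (GPT-2
    # as a stand-in). The model assigns a score (logit) to every token in its
    # vocabulary as a candidate continuation of the prompt.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    prompt = "I asked ChatGPT if it is sentient, and it said"
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits      # shape: (1, seq_len, vocab_size)

    next_token_logits = logits[0, -1]        # scores for the *next* token only
    top = torch.topk(next_token_logits, k=5)
    for score, token_id in zip(top.values, top.indices):
        print(f"{tokenizer.decode(int(token_id))!r:>12}  logit={score.item():.2f}")

Generating a full reply is just this step repeated: append the chosen token to the prompt and predict again.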
LarsPensjo t1_j36arit wrote
Ok. Is there anything you can ask me where the answer can't be explained as me just predicting what string of characters corresponds to your prompt?
gleamingthenewb t1_j36ht6y wrote
That's a red herring, because your ability to generate text without being prompted proves that you don't just predict strings of characters in response to prompts. But I'll be a good sport. I could ask you any personal question that has a unique answer of which you have perfect knowledge: What's your mother's maiden name? What's your checking account number? Etc.
LarsPensjo t1_j36mdv5 wrote
But that doesn't help determine whether it reflects on its own thinking.
gleamingthenewb t1_j36srlq wrote
You asked for an example of a question that can't be answered by next-word prediction.
FusionRocketsPlease t1_j36r2pm wrote
Try asking an unusual question that requires reasoning to answer. It will get it wrong. This has already been shown in this same sub today.