gleamingthenewb t1_j3605ep wrote

Nope, that's just its prediction of what string of characters corresponds to your prompt.

LarsPensjo t1_j36arit wrote

Ok. Is there anything you could ask me where my answer can't be explained as me just predicting a string of characters corresponding to your prompt?

gleamingthenewb t1_j36ht6y wrote

That's a red herring, because your ability to generate text without being prompted proves that you don't just predict strings of characters in response to prompts. But I'll be a good sport. I could ask you any personal question that has a unique answer of which you have perfect knowledge: What's your mother's maiden name? What's your checking account number? Etc.

LarsPensjo t1_j36mdv5 wrote

But that doesn't help determine whether it reflects on its own thinking.

gleamingthenewb t1_j36srlq wrote

You asked for an example of a question that can't be answered by next-word prediction.

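For readers unfamiliar with the term, "next-word prediction" in this exchange can be illustrated with a deliberately tiny sketch. This is not how GPT-style models actually work (they use learned neural representations, not lookup tables), and the corpus and function names here are made up for illustration; it only shows the bare idea of "predict the most likely next word given what came before":

```python
from collections import defaultdict, Counter

# Toy corpus, invented for this example.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word (a bigram table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word after `word`, or None if unseen."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "cat" ("cat" follows "the" twice in the corpus)
```

A real language model replaces the frequency table with a learned function over the entire preceding context, but the interface is the same: context in, next-token distribution out. That is the mechanism the commenters are debating.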
FusionRocketsPlease t1_j36r2pm wrote

Try asking an unusual question that requires reasoning to answer. It will get it wrong. This has already been shown in this same sub today.
