Submitted by DragonForg t3_1215few in singularity
Villad_rock t1_jdlvlw8 wrote
The human mind could just be an emergent property of a prediction algorithm.
snipeor t1_jdqbjii wrote
To some extent I believe a large part of it is... I loved the screencap of Bing Chat where someone tells it "You're a very new version of a large language model, why should I trust you?" and it replies "You're a very old version of a small language model, why should I trust you?"
I'm not sure Bing "meant" it that way, but it gets you thinking. Obviously brains do a lot more than process language, but with LLMs being a black box, how do we know they don't process language in a similar way to us?