Submitted by currentscurrents t3_125uxab in MachineLearning
currentscurrents OP t1_je631oa wrote
TL;DR:
- This is a survey paper. The authors summarize a variety of arguments about whether or not LLMs truly "understand" what they're learning.
- The major argument in favor of understanding is that LLMs are able to complete many real and useful tasks that seem to require understanding.
- The major argument against understanding is that LLMs are brittle in non-human ways, especially to small changes in their inputs (see the probe sketch after this list). They also lack real-world experience to ground their knowledge in (although multimodal LLMs may change this).
- A key issue is that no one has a solid definition of "understanding" in the first place. It's not clear how you would test for it, and tests intended for humans don't necessarily test understanding in LLMs.
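To make the brittleness point concrete, here's a rough sketch of the kind of probe people run: ask the same question in a few paraphrased forms and check whether the answers stay consistent. The `query_llm` function is a hypothetical stand-in (just a mock here) for whatever model API you'd actually call.

```python
# Minimal brittleness probe: semantically equivalent prompts should yield
# consistent answers; brittle models often disagree with themselves.
# `query_llm` is a hypothetical stand-in (a mock here) for a real model call.

def query_llm(prompt: str) -> str:
    # Mock response so the sketch runs end to end; swap in a real API call.
    return "Canberra"

paraphrases = [
    "What is the capital of Australia?",
    "Australia's capital city is called what?",
    "Name the capital city of Australia.",
]

answers = [query_llm(p) for p in paraphrases]
consistent = len({a.strip().lower() for a in answers}) == 1

print("answers:", answers)
print("consistent across paraphrases:", consistent)
```

A human wouldn't flip their answer just because the question was reworded, which is why this kind of inconsistency gets read as evidence against understanding.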
I tend to agree with their closing summary. LLMs likely have a type of understanding, and humans have a different type of understanding.
>It could thus be argued that in recent years the field of AI has created machines with new modes of understanding, most likely new species in a larger zoo of related concepts, that will continue to be enriched as we make progress in our pursuit of the elusive nature of intelligence.
Purplekeyboard t1_je8m61n wrote
> LLMs likely have a type of understanding, and humans have a different type of understanding.
Yes, this is more of a philosophy debate than anything else, hinging on the definition of the word "understanding". LLMs clearly have a type of understanding, but as they aren't conscious it is a different type than ours. Much as a chess program has a functional understanding of chess, but isn't aware and doesn't know that it is playing chess.
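To illustrate that analogy with something concrete, here's a minimal toy "player" (a sketch assuming the python-chess library) that picks moves purely by one-ply material counting. It manipulates board positions well enough to produce legal, vaguely sensible moves, but nothing in it represents the fact that a game is being played:

```python
# A toy chess "player" in the spirit of the analogy above: it chooses moves by
# comparing material counts, with no notion that it is "playing chess" at all.
# Assumes the python-chess library (pip install chess).
import chess

PIECE_VALUES = {chess.PAWN: 1, chess.KNIGHT: 3, chess.BISHOP: 3,
                chess.ROOK: 5, chess.QUEEN: 9, chess.KING: 0}

def material_balance(board: chess.Board, color: chess.Color) -> int:
    # Sum piece values for `color` minus the opponent's.
    score = 0
    for piece in board.piece_map().values():
        value = PIECE_VALUES[piece.piece_type]
        score += value if piece.color == color else -value
    return score

def greedy_move(board: chess.Board) -> chess.Move:
    # Look one ply ahead and keep whichever legal move leaves the best balance.
    side = board.turn
    best_move, best_score = None, float("-inf")
    for move in board.legal_moves:
        board.push(move)
        score = material_balance(board, side)
        board.pop()
        if score > best_score:
            best_move, best_score = move, score
    return best_move

board = chess.Board()
print("chosen opening move:", greedy_move(board))
```

Whatever "understanding" it has is purely functional; there's no awareness anywhere in the loop.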
dampflokfreund t1_je8zlkp wrote
We don't have a proper definition of consciousness nor a way to test it either, by the way.
TitusPullo4 t1_je959tq wrote
Consciousness is having a subjective experience. It is well defined, though we do lack ways to test for it.
theotherquantumjim t1_jearyqw wrote
It absolutely is not.
trashacount12345 t1_jeddoei wrote
This is the agreed upon definition in philosophy. I’m not sure what another definition would be besides “it’s not real”.
ninjasaid13 t1_jeh2s4o wrote
>Consciousness is having a subjective experience.
and what's the definition of subjective?
Amster2 t1_je9h981 wrote
I'm not sure they aren't conscious. They can clearly reference themselves, and seem to understand they are an LLM with an information cutoff in 2021, etc.
They behave like they are self-aware. How can we determine whether they really are or not?
braindead_in t1_je8ucis wrote
In Nonduality, understanding or knowledge is the nature of pure consciousness, along with existence and bliss. I think of it as an if-then statement in programming. Once a program enters into an if condition, it understands and knows what has to be done next.