Submitted by possiblybaldman t3_11a9j56 in singularity
sideways t1_j9swo39 wrote
Reply to comment by genshiryoku in New agi poll says there is 50% chance of it happening by 2059. Thoughts? by possiblybaldman
Couldn't multimodal models capable of incorporating realtime non-textual (visual, auditory, kinesthetic, etc) data be a solution?
The current generation has pretty much mastered language anyway, so more text seems kinda redundant.
beders t1_j9u0ypl wrote
The current models have not mastered language at all. They don’t know grammar. They just complete text.
It’s like claiming you know Spanish because you can pronounce the words and “read” a book. You can utter the sounds correctly but you have no clue what you are reading.
rekdt t1_j9ufkwd wrote
If I can respond to a question someone asked me in Spanish, then I know Spanish.
beders t1_j9ug9kl wrote
blueSGL t1_j9ukv6h wrote
I always found that silly.
What individual parts of the brain are conscious? Or is it only the brain as a gestalt that is conscious?
Representative_Pop_8 t1_j9vlnsa wrote
In the Chinese room, it is not the operator that knows Chinese; it is the setup of rules plus operator that clearly knows Chinese. An LLM doesn't need to be conscious to master language.
beders t1_j9wj8hw wrote
The operator doesn’t know Chinese. Do I need to spell out the analogy to chatGPT?
ChatGPT is great at word embeddings and completion but is an otherwise dumb algorithm. Comparing that to humans' ability to express themselves with language is useless.
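For what I mean by "completion", here's a minimal sketch of greedy next-token completion, using the Hugging Face transformers library with GPT-2 as a publicly available stand-in (ChatGPT's own weights aren't public, so this only illustrates the general idea, not its actual implementation):

```python
# Minimal sketch: greedy next-token completion with GPT-2 as a stand-in.
# The model only ever scores the next token; we append the top pick and repeat.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("The Chinese room argument says", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(20):                      # extend the prompt by 20 tokens
        logits = model(ids).logits           # scores for every vocabulary token
        next_id = logits[0, -1].argmax()     # take the single most likely next token
        ids = torch.cat([ids, next_id.view(1, 1)], dim=-1)

print(tokenizer.decode(ids[0]))              # the "completed" text
```

That loop is the whole trick: predict the next token, append it, repeat. Whether doing that at scale amounts to "understanding" is exactly what's in dispute here.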
I mean, if you don't get the Chinese room thought experiment, you might think ELIZA is a master of psychology.
Representative_Pop_8 t1_j9wsu89 wrote
You're not getting it: the operator doesn't know Chinese, but the whole setup does. ChatGPT clearly understands several languages; it doesn't need to be conscious to understand.
beders t1_ja1diap wrote
So much for mastery:
https://twitter.com/NateSilver538/status/1629159014272581634?s=20
Representative_Pop_8 t1_ja1f2qz wrote
It knows the language. It also hallucinates, but in pretty good English. Humans can also invent fake stories; that doesn't mean they don't know the language.
beders t1_ja1g5oh wrote
It’s almost as if it would just be a text completion engine … which it is.
beders t1_ja1dnlz wrote
If you can't reason about language, you can't understand it. ChatGPT is the operator.
Representative_Pop_8 t1_ja1eoph wrote
ChatGPT can reason about language. It is not equivalent to the operator; it is equivalent to the whole Chinese room system, which clearly understands.
sideways t1_j9vja6e wrote
Language mastery is a function of communication and problem-solving ability in that language. Understanding should be judged based on results, not some mysterious inferred grasp of grammar.
beders t1_j9wq47s wrote
You can't just measure how well or how interestingly a text completion engine spits out words and proclaim it has "mastery". Frankly, that is BS.
sideways t1_j9wtpzn wrote
Of course... that's why I would never claim that a parrot had mastered language. It may know words, but it can't use language for creative, communicative problem solving. LLMs can.
Representative_Pop_8 t1_j9vl9ym wrote
They have absolutely mastered language.