Necessary-Meringue-1 wrote
Reply to comment by harharveryfunny in Modern language models refute Chomsky’s approach to language [R] by No_Draft4778
> the Transformer is proof by demonstration that you don't need a language-specific architecture to learn language, and also that you can learn language via prediction feedback, which is highly likely how our brain does it too.
Where to even start? How about this:
The fact that a transformer can appear to learn language on an architecture that is not language-specific does not at all mean that humans work the same way.
Did you ingest billions of tokens of English growing up? How did you manage decent proficiency by age 6? Did you read the entire Common Crawl corpus by age 10?
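To make the scale gap concrete, here's a rough back-of-envelope sketch in Python. The figures are ballpark assumptions on my part, not anything from the thread: child word-exposure estimates in the acquisition literature vary widely, and ~300B tokens is the commonly cited GPT-3 training-set size.

```python
# Back-of-envelope comparison of language input: child vs. LLM.
# All figures are rough assumptions for illustration only.

child_words_per_year = 10_000_000      # assumed order-of-magnitude estimate; studies vary widely
years = 6
child_total = child_words_per_year * years

llm_training_tokens = 300_000_000_000  # commonly cited approximate GPT-3 figure

ratio = llm_training_tokens / child_total
print(f"Child by age {years}: ~{child_total:,} words")
print(f"LLM training set:   ~{llm_training_tokens:,} tokens")
print(f"Ratio: ~{ratio:,.0f}x more data for the LLM")
```

Even if every one of those numbers is off by an order of magnitude, the gap is still thousands of times, which is the point: whatever children are doing, it is not the same data regime.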
This kind of argument stands on paper stilts. LLMs are extremely impressive, but that does not mean they tell you much about how humans do language.