Submitted by LegendOfHiddnTempl t3_1169uzy in MachineLearning
liquiddandruff t1_j989luo wrote
Reply to comment by thecodethinker in [R] neural cloth simulation by LegendOfHiddnTempl
the stochastic parrot argument is a weak one; we are stochastic parrots
the phenomenon of "reasoning ability" may be an emergent one that arises from the recursive identification of structural patterns in input data, which ChatGPT has been shown to do.
prove that "understanding" is not and cannot ever be reducible to "statistical modelling"; only then is your null position intellectually defensible
thecodethinker t1_j98puob wrote
Where has ChatGPT been rigorously shown to have reasoning ability? I’ve heard that it passed some exams, but that could just be the model regurgitating info from its training data.
Admittedly, I haven’t looked too deeply into the reasoning abilities of LLMs, so any references would be appreciated :)
liquiddandruff t1_j98v6ko wrote
it's an open question, and lots of interesting work is happening here at a frenetic pace
- Language Models Can (kind of) Reason: A Systematic Formal Analysis of Chain-of-Thought https://openreview.net/forum?id=qFVVBzXxR2V
- Emergent Abilities of Large Language Models https://arxiv.org/abs/2206.07682
A favourite discussed recently:
- Theory of Mind May Have Spontaneously Emerged in Large Language Models https://arxiv.org/abs/2302.02083
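For anyone unfamiliar with the chain-of-thought setup the first paper analyzes, here's a minimal sketch of how such a prompt is built: a worked example whose answer spells out intermediate steps, followed by the new question. The example text and function name are illustrative, not taken from any of the papers above.

```python
# Minimal sketch of chain-of-thought (CoT) prompting: show the model one
# worked example that includes intermediate reasoning, then ask a new
# question. The few-shot content below is illustrative only.

FEW_SHOT_EXAMPLE = (
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 balls each. "
    "How many balls does he have now?\n"
    "A: Roger starts with 5 balls. 2 cans of 3 balls is 6 balls. "
    "5 + 6 = 11. The answer is 11.\n"
)

def build_cot_prompt(question: str) -> str:
    """Prepend a worked example so the model imitates step-by-step reasoning."""
    return f"{FEW_SHOT_EXAMPLE}\nQ: {question}\nA: Let's think step by step."

prompt = build_cot_prompt("A farm has 4 pens with 7 sheep each. How many sheep?")
print(prompt)
```

The point of the formal-analysis paper is to test whether the generated intermediate steps actually constitute valid reasoning or merely correlate with correct answers.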