AbstractEngima t1_jac06su wrote
How is this even possible? Anyone with a brain knows that ChatGPT is nothing more than an unreliable narrator that pulls random bits of information and mashes them together into something inaccurate.
It's basically following the same process as any other AI: taking little bits of existing information and putting them together based on patterns, rather than any actual understanding of the source material.
gurenkagurenda t1_jacebvc wrote
I don’t know how you want to define “understanding” when talking about a non-sentient LLM, but in my experiments, ChatGPT consistently gets reading comprehension questions from SAT practice tests right, and it’s well known that it has passed multiple professional exams. It’s nowhere close to infallible, but you’re also underselling what it does.
TeaKingMac t1_jad6gd4 wrote
>it’s well known that it has passed multiple professional exams.
Well yeah. There are very clearly defined correct answers on professional exams.
When a student is writing an essay, the primary objective is creating and defending an argument. Abdicating that responsibility to ChatGPT circumvents the entire point of the assignment.
gurenkagurenda t1_jada8dv wrote
Sure, but that’s an entirely different argument.
TeaKingMac t1_jadgiv6 wrote
"Quoting" ChatGPT as a source is also stupid, because it's neither a primary (best) source nor even a secondary source, like a newspaper article.
It's just a random assortment of (mostly correct) information. That's the same reason why academia doesn't currently allow Wikipedia as a source for information.
Amir_Kerberos t1_jaeh7pq wrote
That’s a misunderstanding of why academia frowns upon Wikipedia. The major concern is not that it can have questionable accuracy, but rather that it is not a primary source.
TeaKingMac t1_jaepy4k wrote
> it is not a primary source
AND NEITHER IS ChatGPT
No original information comes from ChatGPT. It is just a repository.
That's my point.
>it's neither a primary (best) source nor even a secondary source, like a newspaper article.
> It's just a random assortment of (mostly correct) information. That's the same reason why academia doesn't currently allow Wikipedia as a source for information
MysteryInc152 t1_jacx3fi wrote
>I don’t know how you want to define “understanding”
People routinely invent their own vague and ill-defined definitions of understanding, reasoning, etc., just so LLMs won't qualify.
gurenkagurenda t1_jacxqtz wrote
Yes. A little while back, I had someone use a Computerphile video showing ChatGPT missing college-level physics questions as proof that ChatGPT is incapable of comprehension. The bar at this point has been set so high that apparently only a small minority of humans are capable of understanding.
Sumdood_89 t1_jad7ayx wrote
Just like a redditor
Mddcat04 t1_jadbcwg wrote
It sounds like a trap. You cite ChatGPT as a source and then your teacher fails you and gives you a lecture on quality of sources.