
w-g t1_j66fo2e wrote

It's not that simple -- it's of course natural to ask whether the teachers are assigning rote tasks or mere memorization. But it's likely that future AI systems will be able to produce meaningful texts with somewhat credible argumentation. I know several teachers who do want students to think (instead of doing rote tasks) and who are also worried about ChatGPT. For example, you may want to assess by asking students to write an essay with the specific goal of making a point, or rebutting something that was already read in class, or whatever. The problem is that ChatGPT can do that -- albeit a crappy job. But a crappy job may be just enough for the student to pass.

So the question is how to do assessment, knowing that students will have access to AI tools -- not just ChatGPT, but its evolved versions and other AI tools yet to come. Because we can't just expect people to refrain from thinking for themselves...


wockyman t1_j66oado wrote

Oh, I know it's not simple, but I do believe it's required. We're going to have to reconsider some of our long-held assumptions about what education is for and where assessment fits into that. ChatGPT does a C+ job if given a blind prompt. But if you talk to it for a while about a subject and get it to define and clarify first principles, it can do an A- job of producing meaningful analysis. It will blindly concoct falsehoods sometimes. It'll give you a list of general sources, but it won't cite them. Still, I agree: we'll likely blow past those limitations in a couple of years or less. So when everyone can basically talk to the computer from Star Trek: TNG, we're going to have to change the curriculum. I expect to see more practical, project-based classes that result in a complex final product. Like a bunch of mini-dissertations.
