Submitted by sailhard22 t3_1134rru in singularity

I believe AI will not pass the Turing test, and that the Turing test is flawed, because AI responses are so intelligent, knowledgeable, and accurate that it will be obvious the response is coming from an AI and not a human.

I can already spot ChatGPT-generated text in the wild because it’s so artificially accurate (at least grammatically) that it’s unnatural.

In other words, AI is too smart to pass a Turing test, and it would need to dumb itself down dramatically in order to convince someone that it was human.

What would be the point of dumbing down AI? It’s a fruitless exercise.

https://en.wikipedia.org/wiki/Turing_test

4

Comments


AsheyDS t1_j8o1aso wrote

You're thinking about it the wrong way. It's not too smart; it just seems that way because it's quite verbose, and you relate that to intelligence. If it were more intelligent, it would be both succinct and considerate of the person it's interacting with. If the goal were to sound like a human and pass the Turing test, it would take the things you mentioned into consideration when formulating a response, and it would seem to 'dumb down' and format its responses in a more natural-sounding way. But that isn't the goal, and it's not intelligent enough on its own to consider that.

Personally, I think the Turing test is pointless anyway, because even as verbose and unnatural as the responses can be, people are still willing to believe it's sentient and embodies all the qualities of a human. Or to put it another way, we failed it already and have to come up with alternate ways of testing it.

11

h20ohno t1_j8qj7fb wrote

I like to think of the Turing test as merely a small fraction of a greater benchmark; ideally you'd have a big table with, say, 20 different tests you could try your system on.

1

prolaspe_king t1_j8oxlc2 wrote

People are dumb; that's why it will pass the test. People don't assume they're talking to an AI; they always assume they're talking to a human.

4

Cryptizard t1_j8t1dq7 wrote

>People don't assume they're talking to an AI; they always assume they're talking to a human.

Well, that's not the Turing test then.

1

Kolinnor t1_j8o9nek wrote

I see lots of these posts about the Turing test being flawed. So I'm just going to comment that the strong versions of the Turing test (that is, trying to mimic experts in a field, or at least an intelligent human, certainly as Turing imagined it) are still far from being solved and would be a big indicator of AGI.

3

apart112358 t1_j8qzg27 wrote

I hope I am wrong about the following post:

I don't think anything will ever pass the Turing test. No AI, no alien, no animal - nothing except humans.

If anything were to be considered equal, humans would have to share. We are not good at sharing resources. Treating everything else as "Not Intelligent" or not "Conscious" is a good way to not have to consider the rights of any other species.

TL;DR: The Turing test is not there to test whether something is human. It is there to prove that something is not human.

−2