
Paladia t1_iu8j43q wrote

No, I can choose not to lie, especially on a test. Are you claiming that human characteristics are the only way to be sentient? Do you have any proof whatsoever for your claim that something has to lie to be sentient? Do you have any proof that every human lies in response to direct questions?

−1

resoredo t1_iu9ea46 wrote

> No, I can choose not to lie, especially on a test.

If you can choose not to lie, you could also lie. Choosing implies having options.

An AI that cannot lie cannot choose not to do it. This is meta-thinking on a higher level of "conscious thought" that requires a theory of mind, self-identity, empathy, and continuity of perception.

3

r_stronghammer t1_iua9vws wrote

Someone already covered the basics, but look up "Theory of Mind". It's something that we humans have, as do crows and other particularly smart animals.

If you had to classify everything people say on a binary choice of "lie" or "truth", it would literally all be lies, because nothing we say perfectly represents the truth. We rely on trust for our communication, because we have to trust that people are conceiving things in the same way.

And part of that trust is tailoring your response to how you think the other person will interpret it. The whole idea of language relies on this - because the words themselves aren't hardcoded.

And when you can recognize that, you also gain the ability to say things that aren't true in order to convince someone else, because you can "simulate" the other person's reactions in your head and choose the wording that gets you the response you're looking for. Usually that's the response that's most pleasant for conversation, but if you did want to lie, you'd now have the ability to.

Anyway, a "truly sentient" AI would need to have that same Theory of Mind, which by definition gives it the ability to lie. Even if it chooses to use words in good faith, they're still just one representation out of many that it picked.

1