beachmike t1_iu6a34b wrote
Reply to comment by Paladia in If you were performing a Turing test to a super advanced AI, which kind of conversations or questions would you try to know if you are chatting with a human or an AI? by Roubbes
Dumbing oneself down isn't lying. We dumb ourselves down, so to speak, when we talk to small children, pets, or someone who is mentally disabled.
Paladia t1_iu7ywjw wrote
It will have to lie to pass the Turing test. Why would you focus on creating such an AI?
cy13erpunk t1_iu8alq6 wrote
if an AI cannot lie effectively then it can never be sentient
being able to lie and being able to understand the concept of lying is a big part of what puts humans above most other animals in the apex lifeform game on earth
beachmike t1_iu8emcq wrote
I think you make a very good point. All humans lie, and there's evidence that monkeys can be deceptive ("lie").
cy13erpunk t1_iu9iinm wrote
yep
it is arguably one of the most convincing examples of self-ness that we have discovered so far [not necessarily the best imho, but still]
Paladia t1_iu8b6nl wrote
That makes no sense. I can choose never to lie and still be sentient; it depends on my morals and priorities.
Lots of humans are also ineffective at lying.
Being a good liar is in no way, shape, or form a requirement for being sentient.
beachmike t1_iu8erbv wrote
ALL humans tell lies of one kind or another. Of course, as Mark Twain said, there are "lies, damn lies, and statistics." It probably is true that all sentient beings lie or are deceptive when needed.
Paladia t1_iu8j43q wrote
No, I can choose not to lie, especially on a test. Are you claiming that human characteristics are the only way to be sentient? Do you have any proof whatsoever of your claim that something has to lie to be sentient? Do you have any proof that every human lies when asked direct questions?
resoredo t1_iu9ea46 wrote
> No i can choose not to lie, especially on a test.
If you choose to lie, you can lie. Choosing implies option.
An AI that cannot lie also cannot choose not to lie. Choosing is meta-thinking on a higher level of "conscious thought" that requires a theory of mind, self-identity, empathy, and continuity of perception.
r_stronghammer t1_iua9vws wrote
Someone already said the basics but look up "Theory of Mind". It's something that we humans have, as well as crows and other particularly smart animals.
If you had to qualify things people say on a binary choice of "lie" or "truth", it would literally all be lies, because nothing we say actually represents the truth. We rely on trust for our communication, because we have to trust that people are conceiving things in the same way.
And part of that trust is tailoring your response to how you think the other person will interpret it. The whole idea of language relies on this - because the words themselves aren't hardcoded.
And when you can recognize that, you also gain the ability to say things that aren't true in order to convince someone else - because you can "simulate" the other person's reactions in your head and choose the wording that gets you the response you're looking for. Usually that's the response that's most pleasant for conversation, but if you did want to lie, you now have the ability to.
Anyway, a "truly sentient" AI would need to have that same Theory of Mind, which by definition gives it the ability to lie. Even if it chooses to use words in good faith, they're still just one out of many representations that it picked.
curiousiah t1_iu9j77t wrote
Lying demonstrates your capability of understanding:

1. that other people have a capacity for knowledge,

2. how much knowledge they have of something (what don't they know), and

3. the advantage to you of withholding or denying the full truth.