Paladia
Paladia t1_iu8b6nl wrote
Reply to comment by cy13erpunk in If you were performing a Turing test to a super advanced AI, which kind of conversations or questions would you try to know if you are chatting with a human or an AI? by Roubbes
That makes no sense. I can choose never to lie and still be sentient; it depends on my morals and priorities.
Lots of humans are also ineffective at lying.
Being a good liar is in no way, shape, or form a requirement for being sentient.
Paladia t1_iu7ywjw wrote
Reply to comment by beachmike in If you were performing a Turing test to a super advanced AI, which kind of conversations or questions would you try to know if you are chatting with a human or an AI? by Roubbes
It will have to lie to pass the Turing test. Why would you focus on creating such an AI?
Paladia t1_iu61g8q wrote
Reply to comment by beachmike in If you were performing a Turing test to a super advanced AI, which kind of conversations or questions would you try to know if you are chatting with a human or an AI? by Roubbes
Which is one of the main reasons why it is a bad test. Why would you want an AI that lies about questions it knows the answer to? Like if you ask it for the square root of 17.
Paladia t1_it8tybe wrote
What do you need to do to set this up locally?
Paladia t1_iu8j43q wrote
Reply to comment by beachmike in If you were performing a Turing test to a super advanced AI, which kind of conversations or questions would you try to know if you are chatting with a human or an AI? by Roubbes
No, I can choose not to lie, especially on a test. Are you claiming that human characteristics are the only way to be sentient? Do you have any proof whatsoever of your claim that something has to lie to be sentient? Do you have any proof of every human lying on direct questions?