Submitted by HarpuasGhost t3_11a1qk3 in Futurology
Imaginary_Passage431 t1_j9q8ee1 wrote
Reply to comment by CommentToBeDeleted in What are ‘robot rights,’ and should AI chatbots have them? by HarpuasGhost
Faulty analogy fallacy. Robots aren't a race, nor a discriminated sex. They aren't a subgroup of humans, or even a subgroup of animals. They don't have consciousness or the ability to feel (don't answer this with the typical argumentum ad ignorantiam fallacy). You are trying to give rights and moral consideration to a calculator. And if I saw a calculator and a kitten about to be crushed by a car, I'd save the kitten.
CommentToBeDeleted t1_j9qd4xj wrote
I think you are misunderstanding the arguments I'm making, or I've failed to adequately articulate them, if this is your response.
>or have the ability to feel (don’t answer to this with the typical argumentum ad ignorantiam fallacy).
We are literally discussing how we lack the understanding to determine whether or not something has consciousness, can feel, or has free thought, and your rebuttal is "they can't feel." This feels exactly like the sort of thing that has happened every time we have marginalized any entity. Imagine trying to have a discussion about whether a slave is human or sub-human with someone who thinks it's a valid response to simply say "well, they're not human, so...". That's literally what the debate is about!
What is this called? "Begging the question," I believe. We're arguing about whether they have free will or can feel, and the evidence you offer is essentially "they just don't, okay!"
>Faulty analogy fallacy. Robots aren’t a race, nor a discriminated sex. They aren’t a subgroup of humans either. Not even a subgroup of animals.
This is where I think you are missing the point of the argument entirely. I'm fully aware of the facts you just stated, but they do nothing to rebut my claim and, if anything, bolster my argument even more.
To state more clearly what I was arguing:
There was a point in our history when we viewed actual, literal humans as a "sub-race" and treated them as though they were property. You hear that now and think "that's insane, of course they should be treated the same as other people!"
Then we did the same to women (and still do in many places). They are viewed as less than their male counterparts, when in fact they should be given just as many rights.
Doctors used to operate on babies without providing any means of managing pain, because they assumed infants were incapable of processing pain the way adults do. Despite babies literally being humans with brains, doctors assumed you could inflict physical harm and suffering and it was no big deal.
So my point: humans have notoriously and consistently attempted to classify beings that are conscious and do feel in ways that let other humans disregard that fact and treat them more poorly than those whose consciousness we do acknowledge. The mere fact that we have done this within our own species should make us more acutely aware of our bias toward denying rights to entities that deserve them.
>You are trying to give rights and moral consideration to a calculator.
This is absolutely fallacious and you are misconstruing my argument. I specifically mentioned traditional programs that execute functions as being separate from this view, and yet you made this claim anyway. Here is my bit (the "calculator" you claim I'm trying to give rights to):
>Most people hear "programming" and think of it in a very traditional sense: a programmer goes in and writes every line of code, which the program then executes.
While this is still the case for many (probably most) forms of programming, it is not the case for machine learning.
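To make that distinction concrete, here's a minimal, purely illustrative sketch (the spam-filter scenario and every name in it are made up, not anyone's real system). In the first function the decision rule is typed in by a human; in the second, the programmer only writes a training procedure and the rule itself is derived from example data:

```python
# Hypothetical example: hand-written rule vs. rule derived from data.

# "Traditional" programming: every condition below was written by a person.
def is_spam_explicit(message: str) -> bool:
    return "free money" in message.lower()

# Machine learning (in miniature): the programmer writes the *procedure*,
# and the actual decision threshold comes out of the example data.
def train_spam_threshold(examples: list[tuple[str, bool]]) -> float:
    """Learn how many '!' characters tend to indicate spam."""
    spam = [msg.count("!") for msg, is_spam in examples if is_spam]
    ham = [msg.count("!") for msg, is_spam in examples if not is_spam]
    spam_avg = sum(spam) / max(len(spam), 1)
    ham_avg = sum(ham) / max(len(ham), 1)
    # The learned parameter: midpoint between the two class averages.
    return (spam_avg + ham_avg) / 2

def is_spam_learned(message: str, threshold: float) -> bool:
    return message.count("!") > threshold

if __name__ == "__main__":
    data = [
        ("WIN NOW!!!", True),
        ("Meeting at 3pm", False),
        ("FREE PRIZE!!!!", True),
        ("Lunch tomorrow?", False),
    ]
    threshold = train_spam_threshold(data)
    print(is_spam_explicit("free money inside"))     # rule a human wrote
    print(is_spam_learned("ACT NOW!!!", threshold))  # rule derived from data
```

Nobody typed that learned threshold in; it fell out of the data. Scale the same idea up to billions of learned parameters and you get the systems this thread is actually about, which is why "it's just a program someone wrote" doesn't capture what's going on.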
>And if I saw a calculator and a kitten about to be crushed by a car, I'd save the kitten.
And as you should. Granting rights doesn't mean those rights need to be equal. If I saw a child or a dog about to get run over, I would 100% save the child. Does that mean the dog is not entitled to rights, simply because its rights are not equal to those of a human child? Absolutely not.
What if I saw a human adult or a child tied up on the train tracks and could only save one? Of course I'm saving the child, but obviously the adult should still have the necessary rights afforded to them.
No offense, but given your use of named fallacies I assume you know something about debate; however, the content of your response felt more like an attempted Gish gallop than a serious reply.