
[deleted] t1_jdqdrdk wrote

[deleted]

−6

SeneInSPAAACE t1_jdqkte8 wrote

>You cannot experience fear, love, excitement, or regret without a physical body.

[citation needed]


>Feelings are strictly tied to physical reaction.

Incorrect. Feelings are tied to signal impulses.


>Without an organic body, AI cannot feel pain, hunger, empathy, embarrassment, sadness, regret, love, or any other emotion.

Better, but still incorrect. An AI doesn't need to feel those things. However, if made with a capacity to do so, it might.

We probably shouldn't make an AI with most of those capacities. The only "emotional" capacity that might be crucial for an AI is, IMO, compassion.


> It just runs programs and mimics reactions it’s programmed to have.

Just like everyone else.


>It’s wrong to consider an AI entity to be on the same level with a human. Humans actually suffer and can feel love and neglect.

Yes and no.

It's wrong to anthropomorphize AIs, but if an intelligent, sentient AI emerges, it certainly deserves rights to personhood, as much as that makes sense in the situation.

6

[deleted] t1_jdqlou0 wrote

[deleted]

−1

SeneInSPAAACE t1_jdqm5t9 wrote

>Citation needed for an empirical truth about feelings. Lol! Please, tell me, how do you feel without a body?

Hh...

We have a neural network running a program. Part of that program is a body model called the "homunculus". We have sensory inputs, and when we get certain inputs that are mapped onto the homunculus, we feel pain.

If I'm being REALLY generous with you, I might give you the argument that one needs to have a MODEL for a body to experience pain the way humans do. However, who's to say that the way humans feel pain is the only way to feel pain - and this isn't even getting into emotional pain.
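The idea above can be sketched in code. This is a toy illustration only, with hypothetical names and made-up thresholds, of the claim that "pain" is a signal mapped onto an internal body model rather than something that requires flesh:

```python
class Homunculus:
    """A minimal internal body model: named regions with nociception thresholds."""

    def __init__(self):
        # Hypothetical regions and thresholds for illustration.
        self.regions = {"hand": 0.3, "torso": 0.6}

    def feel(self, region, signal):
        """Return a 'pain' magnitude when a mapped signal crosses its threshold."""
        threshold = self.regions.get(region)
        if threshold is None:
            return 0.0  # input not mapped onto the body model: nothing is felt
        return max(0.0, signal - threshold)


model = Homunculus()
print(model.feel("hand", 0.9))  # mapped input above threshold -> nonzero "pain"
print(model.feel("wing", 0.9))  # region absent from the model -> 0.0
```

The point of the sketch: whether anything is "felt" depends entirely on what the model maps, not on the substrate carrying the signal.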

3

infiniflip t1_jdqo7j7 wrote

So you’re saying an entity that doesn’t have a human body can feel the human/animal interpretation of pain?

3

SeneInSPAAACE t1_jdqooes wrote

Yes, but it would have to be explicitly made that way, pretty much.

4

Odd_Dimension_4069 OP t1_jee1370 wrote

You and your conversational partner have different views, but both make good points. You don't need to agree on the nature of AI to understand something crucial about rights - they didn't come about in human society because "humans have emotions and can feel and cry and suffer and love etc.".

Human rights came about because the humans being oppressed rose up and claimed them. The ones in power didn't give a shit about the lower castes before then.

Rights arise out of a necessity to treat a group as equals, not because of some intrinsic commonality of "we're all human so let's give each other human rights". They exist because if they didn't, there would be consequences for society.

So you need to understand that for this reason, AI rights could become as necessary as human rights. It may not seem right to you, but neither did treating peasants as equals back in the day. The people of the future will have compassion for these machines, not because there is kinship, but because society will teach them that it is moral to do so.

1