
AutoMeta OP t1_irvoil9 wrote

I think the concept of empathy is actually not that hard to implement: an advanced AI should be able to understand and predict how a human might feel in a given situation. What to do with that knowledge could depend on whether or not you care about or love that given person.


AdditionalPizza t1_irwxipm wrote

>What to do with that knowledge could depend on whether or not you care about or love that given person.

Do you have more empathy for the people you love, or do you love the people you have more empathy for?

If I had to debate this, I would choose the latter, since empathy is the easier of the two to define. Perhaps love is just the amount of empathy you have toward another person. You cannot love someone you don't have empathy for, but you can have empathy for someone you don't love.

Would we program an AI to have more empathy toward certain people, or equal empathy for all people? I guess it depends on how the AI is implemented: whether it's individual bots roaming around, or one singular AI living in the cloud.
