BetoBarnassian t1_j9fac28 wrote
We would need a good physical definition of what emotions are in a general sense. I think emotions are simply an impetus to behave in a certain way. How we act is some kind of weird aggregated calculus of all the different things we want and don't want, with varying degrees of intensity. In this sense emotions are more fundamental than the idea of being "happy", "sad", or "angry"; they are simply behavioural expressions used to get what we want. Why do people get frustrated? Usually because they have to deal with stuff they don't want to. What does frustration do? It motivates people to leave a situation or change it. When we enjoy things, we usually seek more of that thing. Yet life is complicated, and we have to balance many desires and wants against each other, leading to situations where we do things we don't want in order to get things we do. So long as you can program in a way for an AI to have goals/wants/desires/priorities, then emotions (imo) are simply the attempt to achieve those goals, fulfil those desires, etc. Will they feel happiness or sadness in the same way we do? Probably not; they aren't, and are unlikely to be, made to mimic human biology, so there will be differences in emotional expression, but I do think they will have analogous expressions that serve similar purposes.
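To make the "aggregated calculus of wants" idea a bit more concrete, here's a toy sketch (entirely my own illustration, not anything from real AI systems): an agent holds desires with intensities, picks goals by weighting them, and "frustration" is just a running signal that rises when a wanted goal keeps getting blocked, eventually pushing the agent to change the situation. All names and numbers are invented for illustration.

    import random

    class ToyAgent:
        def __init__(self, desires):
            # desires: goal name -> intensity (how strongly it's wanted)
            self.desires = desires
            self.frustration = 0.0  # impetus to leave or change the situation

        def choose_goal(self):
            # "aggregated calculus": pick a goal weighted by how much it's wanted
            goals, weights = zip(*self.desires.items())
            return random.choices(goals, weights=weights, k=1)[0]

        def step(self, goal, succeeded):
            if succeeded:
                # satisfied desires fade a little; "enjoyment" ~ keep seeking what worked
                self.desires[goal] = max(0.1, self.desires[goal] * 0.9)
                self.frustration = max(0.0, self.frustration - 0.5)
            else:
                # blocked desires breed "frustration" in proportion to how wanted they are
                self.frustration += self.desires[goal]
            # True means: abandon or change the current situation
            return self.frustration > 3.0

    agent = ToyAgent({"food": 2.0, "rest": 1.0, "novelty": 0.5})
    for _ in range(10):
        goal = agent.choose_goal()
        change = agent.step(goal, succeeded=random.random() < 0.3)
        print(goal, round(agent.frustration, 2), "change situation!" if change else "")

Nothing in there "feels" anything, of course; the point is just that goal pursuit plus a signal like frustration already gives you behaviour that serves the same purpose emotions serve in us.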
This is just my quick two cents. I'm sure there are decent arguments to be made against this point, but I think it's a reasonably valid premise.