
ringobob t1_j9fn0ek wrote

Emotions are just a way of encoding additional information in order to help us predict the future by analyzing the past, without having to remember everything. It's imperfect at best.

Presumably, an AI wouldn't need emotions for the same purpose, since it can (theoretically) actually remember everything. However, since one of an AI's primary purposes is to interact with emotional humans, it should at least have an understanding of how they work in order to work within that system. That means being able to empathize. Or it'll just wind up being ignored.

13

SL1MECORE t1_j9g4v7z wrote

Emotions evolved before higher order thinking did. What are y'all on about?

−1

s0cks_nz t1_j9gxfk4 wrote

Isn't that what they are saying? A primitive, imperfect tool used prior to higher thinking and reasoning.

8

SL1MECORE t1_j9hbsz8 wrote

Ah you're correct. I should have thought a bit more about that, I thought they were dismissing emotions as unnecessary overall. That's completely my bad, thank you. /genuinely

I kind of just... I know it's extremely early to say, but philosophically speaking, if an AI says it 'feels', whether or not that's its code or an emergent consciousness, who are we to judge?

I'm not saying run to Congress right now lol, but I just wonder what gives Us the right to say Other Beings feel, depending on how much their Feelings resemble ours. Not worded well, sorry! Thanks again for your gentle correction.

5