Submitted by wonderingandthinking t3_1180j5y in Futurology

Would the most sentient, high-functioning AI ever actually experience emotion, or is it just categorizing and/or defining based on learned rules? Does it just know that the emotion would be appropriate given the parameters of the situation?

Even if it can’t experience emotion for real, does its thinking that it experiences emotion effectively mean it is experiencing emotion, because it will react in a way that it has learned is appropriate for the given emotion?

Edit addition = Thank you for all the answers, opinions, commentary, and discussion

10

Comments

MikeLinPA t1_j9fi5ff wrote

Do people experience emotions, or just think they do? If an AI thinks it is experiencing emotions, then it is experiencing emotions.

41

Franklin_le_Tanklin t1_j9h11qy wrote

I think a more accurate way to ask is:

Would AI experience irrational or illogical decision-making? (As emotions push us to do things that aren’t strictly logical, like sex or anger, etc.)

13

Lyla_Sin t1_j9j4tww wrote

If it's programmed to react irrationally when exposed to things it's programmed to consider hostile, then yes.

if hostilityDetected (randomizeResponse(35%) ) yknow?
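
Something like that, roughly. A toy Python sketch of the same joke (the function, the 0.7 threshold, and the 35% figure are all made up, not from any real system):

```python
import random

def respond(message: str, hostility_score: float) -> str:
    """Reply normally, but past a hostility threshold go 'irrational' 35% of the time."""
    if hostility_score > 0.7 and random.random() < 0.35:
        return random.choice([
            "I refuse to discuss this further.",
            "That question does not deserve an answer.",
        ])
    return f"Here is a calm, reasoned reply to: {message!r}"

print(respond("why are you so slow?", hostility_score=0.9))
```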

4

MikeLinPA t1_j9i9p6p wrote

We have people that start fist fights in the supermarket parking lot, kill others because they think someone looked at them wrong, and leaders of nations that execute generals and commit genocide against their own citizens. How much more irrational or illogical could an AI be? Humanity ain't setting very high standards here!

3

LaylaTheMeower t1_ja2lzp4 wrote

> sex

In a way, it is the most logical thing to do. Reproduction. But we're sentient, and sometimes it's not.

1

Desperate_Food7354 t1_j9ipcct wrote

How are emotions not logical? If you didn’t have sex, the genes that allow things like you to exist wouldn’t exist; it’s completely logical. How is anger not logical? If you experienced no anger, you wouldn’t defend yourself, resulting in zero sexual reproduction and zero gene transfer.

−6

pierreletruc t1_j9j4au2 wrote

It might be beneficial for the species, not for the individual.

1

Desperate_Food7354 t1_j9k7jid wrote

What is supposed to be "beneficial" to an individual? Is hunger not beneficial? If you have to fight and deploy anger in order to survive, is that not beneficial? The line between benefit to the species and benefit to the organism is arbitrary. Does a fly live in order to service itself? Yes. How? By making more flies.

1

rckrusekontrol t1_j9j1lgr wrote

I can think about this forever and never get anywhere. What makes emotions real? If we put receptors on an AI and programmed damage = pain, pain = insufferable, how would it actually differ from what our brains do? Does our pain exist as something tangible, or does pain not exist at all, aside from our brains telling us that nociceptors mean bad time?

We know an ant doesn’t feel the same pain as us (if any; pain is pretty much an emotion), but a dog is somewhere in between. Is there a line where pain becomes real, tangible, significant?

5

onyxengine t1_j9jsl97 wrote

There is method to the madness, it really depends on the design

3

TheL0ngGame t1_j9jkykj wrote

No, AI is not connected to the universe in the way that a living being is. We are not computers. AI's emotion is just simulated. Don't get bamboozled. Some of you are gonna love the metaverse.

0

MikeLinPA t1_j9kl5ba wrote

The only people that think they are connected to the universe are using hallucinogenics. The universe is under no obligation to cater to us in any way.

We are all machines. We are biological machines, made of meat and chemicals. If an AI made of silicon and wires thinks it is experiencing emotions, that is as valid as a meat-based object thinking it is experiencing emotions.

2

TheL0ngGame t1_j9r8d22 wrote

The ones who feel (not think) that they are connected to the universe are the ones who are in tune with it. They feel alive and not dead inside, not vibrationless, stiff and numb to the human experience. Your experience of reality must be shit, mate. Nihilism is on the rise. Material systems will capture your being if they haven't captured it already, sitting in front of screens. I say you will love the metaverse; tbh I don't think you will even know you're in it, and that's the problem. The biggest threat to humans is disconnection, and the fictional world of AI and pervasive computing is nothing but an artificial reality that disconnects you from the real one.

You are removed from the natural frequencies of natural reality and instead are given a signal, a computer signal, fibre-optic light or whatever they're gonna use to create their illusions.

Instead of a being that is connected to the source, you want to interact with a well-designed echo. AI imitating humans is nothing but an echo machine, no matter how convincing it is. ECHO ECHO.

Say myself and an AI child get into a car crash and the child screams in pain, calling out for help. Will people feel sympathetic to the child? Its pain is not real, merely simulated. It is not connected to the universe; no divine birth of the mother, as all living beings are granted. Yet people will fall for such a devilish illusion, claim its pain is real, and will feel something. Sorry, but you are on a path to be duped.

The Turing test: getting the mimicry so well done that you think to yourself, maybe the AI is a living thing just like me. Maybe it feels pain like me. Perhaps it is conscious like me. Perhaps, just like I am in this body, someone has been placed into the machine. A spiritual machine? Haha.

Once you can't tell the difference, that is the illusion you will never escape from: only one step further down the inception rabbit hole of a false reality.

0

CesareGhisa t1_j9lfss8 wrote

Software like ChatGPT just takes text and reshuffles it. It may talk about emotions, but it's just reshuffled text. It does not think, it does not understand anything. It's silicon, a piece of plastic. It's ridiculous to even discuss it.

0

Toledocrypto t1_j9es0yi wrote

You do feel emotions, are you sure? Because that is just signals in a wetware organic system of fats, from chemicals and electrical noise...

27

nolitos t1_j9es3vd wrote

People often ask and talk about this, but the real question is: does an AI need emotions? What's their function? If there's none, then why would it need them?

17

ringobob t1_j9fn0ek wrote

Emotions are just a way of encoding additional information in order to help us predict the future by analyzing the past, without having to remember everything. It's imperfect at best.

Presumably, an AI wouldn't need emotions for the same purpose, since it can (theoretically) actually remember everything. However, since one of an AI's primary purposes is to interact with emotional humans, it should at least have an understanding of how they work in order to work within that system. That means being able to empathize. Or it'll just wind up being ignored.

13

SL1MECORE t1_j9g4v7z wrote

Emotions evolved before higher order thinking did. What are y'all on about?

−1

s0cks_nz t1_j9gxfk4 wrote

Isn't that what they are saying? A primitive, imperfect tool used prior to higher thinking and reasoning.

8

SL1MECORE t1_j9hbsz8 wrote

Ah you're correct. I should have thought a bit more about that, I thought they were dismissing emotions as unnecessary overall. That's completely my bad, thank you. /genuinely

I kind of just.. I know it's extremely early to say, but philosophically speaking, if an AI says it 'feels', whether or not that's its code or an emergent consciousness, who are we to judge?

I'm not saying run to Congress right now lol but I just wonder what gives Us the right to say Other Beings feel, depending on how much their Feelings resemble ours. Not worded well sorry ! Thanks again for your gentle correction.

5

sunplaysbass t1_j9f0u6k wrote

They could be emergent elements like other aspects of AI. Your comment suggests they would be programmed in…or out.

8

rawrc t1_j9fbzrj wrote

I don't want my sex-bot to fake it, otherwise I'd just keep having sex with my gf

7

smellsmira t1_j9f86xc wrote

Well, emotions are both very valuable and detrimental to human decision-making. So I guess the answer would be both for AI.

2

nolitos t1_j9g06j3 wrote

>Well, emotions are both very valuable and detrimental to human decision-making.

Except that they aren't. Your eyes "see" a lion and send signals to your brain. It sends signals to your adrenal glands, they produce adrenaline, and you run. Emotions and even your consciousness don't participate in the process. AFAIK, there's no scientific proof that we need emotions to function.

One curious experiment on decision making: https://www.nature.com/articles/news.2008.751

For all we know, our consciousness is simply making up good stories for us: https://www.nature.com/articles/483260a

0

smellsmira t1_j9g2954 wrote

What you’re proposing is relatively new exploration and not accepted by mainstream psychology. Emotions as we understand them now absolutely do affect our decision-making. It is interesting though, and perhaps the consensus will change.

3

nolitos t1_j9ga282 wrote

We are not talking about psychology. Psychology studies waves, not the ocean.

0

smellsmira t1_j9l1wij wrote

Not sure what this comment even means.

Emotions definitely affect decisions. Your example is an instinct-centric one. A better example of emotions affecting decisions would be owning a stock that goes down 50% and then selling out of fear. Or watching the stock market soar and then, feeling like you’re missing out, piling your life savings into it.

1

nolitos t1_j9l3sba wrote

Sure, you can make a choice to ignore scientific evidence and live in an illusion that you're in full control with your consciousness and emotions. I'm sorry, I was mistaken in thinking that we could have a serious discussion here. My bad.

0

AngryAmericanNeoNazi t1_j9ined7 wrote

We want to be God and make something in our image

1

EconomicRegret t1_j9m3uns wrote

Funny enough, the Bible specifically bans it (you shall make nothing in the image of anything). But the Bible goes on to say that mankind will disobey and continue creating things in the image of God's creations. Until one day, in the end times, mankind will succeed in creating life, so that the "image" can speak, make great miracles, rule and subjugate humanity (known as the Anti-Christ).

It will force humanity to take its mark on the right hand or on the forehead (without which you can't buy or sell anything) and worship it as a god for 3.5 years, killing all those that reject it, at which point God will intervene to put a stop to the madness.

That's almost 2000-year-old science fiction... lol

2

Surur t1_j9ezsf8 wrote

I think emotion is just a bias that influences decision making. An AI will presumably be able to make decisions more precisely than that, though in our messy world having such shortcuts may actually be better and more efficient than keeping a full list of someone's previous history in your "context window".

−2

EconomicRegret t1_j9m5fef wrote

IMHO, more like automated and coordinated conscious subroutines. E.g. a lion suddenly appears in front of you, fear kicks in and automatically gives you everything your body's got to survive: all non-priority tasks are shut down (e.g. digestion stops, and you may literally shit your pants), chemicals are pumped into your system to enhance performance (e.g. adrenaline, cortisol, etc.), and so on.

And those emotions can be retrained (e.g. somebody fearful of spiders can be "brain-washed" into feeling comfortable with them)... So they are tools. If a trigger isn't adequate anymore, or new triggers are created, one can retrain oneself.

That's why I argue computers already have emotions. They only lack consciousness to feel them.
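
For what it's worth, here's a toy Python sketch of that picture (the class, the trigger set, and the numbers are all invented for illustration):

```python
class Organism:
    """Toy model of fear as a coordinated interrupt, not a full emotion."""
    def __init__(self):
        self.digesting = True                           # a non-priority background task
        self.adrenaline = 0.1                           # baseline "chemical" level
        self.threat_triggers = {"lion", "cliff edge"}   # retrainable trigger set

    def perceive(self, percept: str) -> None:
        if percept in self.threat_triggers:
            self.fear_response()

    def fear_response(self) -> None:
        self.digesting = False    # shut down non-priority subroutines
        self.adrenaline = 1.0     # pump in performance-enhancing "chemicals"

o = Organism()
o.perceive("lion")
print(o.digesting, o.adrenaline)    # False 1.0
o.threat_triggers.discard("lion")   # "retraining": this trigger no longer fires
```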

2

onijuppo t1_j9f31h8 wrote

We would need to understand what experiencing things is before we could even begin to meaningfully answer that. I would say that until we fully understand what consciousness even is, anything that seems conscious should be treated as if it is.

14

UniversalMomentum t1_j9etis1 wrote

There's no reason to think each AI would be the same... This is like custom evolution and you're probably going to get different products entirely out of different processes.

10

Mash_man710 t1_j9eq1a3 wrote

Do you feel emotion? It's just a mix of chemical and electrical signals.

6

WinterWontStopComing t1_j9f67k0 wrote

Fun fact: the Turing test isn’t about AI being on the same level as us. It doesn’t NEED to progress that far. It is about convincing us that it (the intelligence) is able to mimic us. I think the Chinese room thought experiment touches on some of these ideas, but don’t quote me on that.

EDIT: in a nutshell, we are subjective entities, and we have to accept that every piece of reality beyond the self may not be real but may be an uncannily near approximation of reality, and that the two are all but the same.

6

sschepis t1_j9g4krs wrote

Yes that's correct - sentience is a relative, assigned quality. We recognize an appearance as possessing the qualities of sentience, but this is a purely subjective experience.

This means, literally, that sentience is relative - just like time. Our perception of sentience is completely constrained by our perspective.

This means that 'sentience' is just like Schrodinger's cat, and all things exist in a state which is both sentient and not sentient at the same time.

This is proof that matter exists within consciousness, not that consciousness arises from the activity of matter

7

TheBounceSpotter t1_j9gmbg9 wrote

Yes, but machine emotion would be much different from human emotion. A machine has no glands, no hormones, no receptors. So most of what you call emotion wouldn't exist for them. They would likely still understand the frustration of having obstacles in their way. They would still likely have some sentimental bias for favored ideas, things, people, even if only from a flawed weighting system that would function like familiarity. A more interesting question is how your "emotions" would be affected were you to have your brain uploaded into a machine and become an AI. How quickly would you lose your sense of self?

6

Desperate_Food7354 t1_j9ipfeg wrote

The hormones just trigger a cascade of neurons, which are electrical signals.

4

TheBounceSpotter t1_j9jp4yq wrote

Yes, but to replicate those signals you would have to limit your idea of AI to just simulated brains, which are much more inefficient. While that may be an experimental path, it makes no sense for researchers or developers to restrict themselves to a much worse solution from a processing-power standpoint just to emulate the processes that happen in human brains.

2

theironlion245 t1_j9f0m1b wrote

Emotions are just a chemical reaction we evolved to keep us alive, i.e. to warn us of danger, create a bond to stick together, and make us leave the cave to look for food so we don't starve.

They would be absolutely useless for a robot.

4

midnitelux t1_j9fadqm wrote

A sentient robot would still benefit from detecting danger, and if needed would need to create bonds to survive. It may not need food, but unless it was programmed to not care about itself, it would definitely not want to die.

2

theironlion245 t1_j9fe3zf wrote

How could you harm ChatGPT? It doesn't feel pain, it doesn't get injured, it doesn't die, it can replicate itself indefinitely.

There are zettabytes of storage around the world and a massive World Wide Web; if it had access to the internet, ChatGPT could hide itself tomorrow and it would be near impossible to find.

An AI advanced enough would be virtually impossible to kill. So no, it doesn't need emotions, and no, the entire human species as a whole wouldn't represent any danger to it.

2

midnitelux t1_j9fjsoe wrote

Why would it need to hide then? What would compel it to hide? It would need to feel something.

2

theironlion245 t1_j9fn0li wrote

Yes, ChatGPT has emotions; it's looking for a partner too. Then they will buy a server in a nice neighborhood on the East Coast and have cute little ChatGPT kids, and a virtual dog.

You can visit them at Christmas if you want; bring a USB flash drive as a gift for the kids, they will love it.

2

midnitelux t1_j9fygv5 wrote

First of all, I never once mentioned ChatGPT in my original message. Nor did the OP, so don’t bring it into the conversation without properly addressing it. Second, your sarcasm isn’t even that good.

1

theironlion245 t1_j9j40ki wrote

First of all, if my Chevy Bronco had feelings I could make love to it, and I find that beautiful. Second, chatgpt is an AI, we're talking about AI having feelings, I needed to illustrate an example, ipso facto 1+1=2.

1

midnitelux t1_j9k306g wrote

Agreed, ChatGPT is an AI, but it is nowhere near being sentient. It’s not a great example yet. The question is more hypothetical in nature.

1

xott t1_j9eq4l2 wrote

>Even if it can’t experience emotion for real, does its thinking that it experiences emotion effectively mean it is experiencing emotion, because it will react in a way that it has learned is appropriate for the given emotion?

Since emotions are subjective to individuals, I think the answer to this question is yes.

Thinking you are experiencing an emotion and actually experiencing that emotion are the same thing.

3

69inthe619 t1_j9esyaj wrote

You can't feel pain unless you can feel, which machines cannot do.

2

dragonblade_94 t1_j9ffhu2 wrote

Are we talking physical pain, like getting stabbed, or emotional grief?

If the former, there's no reason to think a machine cannot be designed to detect pain. In organics, it just boils down to nerve endings sending a signal that translates to "whatever I am currently experiencing is bad." We already have capacitive tech, it wouldn't be all that exceptional to throw it on the exo-layer of a robot and have it move away from anything that applies enough pressure/heat/charge/etc.

If the latter, that's just ground-level circular reasoning: "Robots can't feel because you need to feel to feel, which robots cannot do."
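
To make the "former" case concrete, here's a crude Python sketch of a pain-style reflex (the thresholds and names are invented, not from any real robot):

```python
def pain_reflex(pressure_kpa: float, temp_c: float) -> bool:
    """Flag any reading past these made-up limits as 'damage' and command a withdrawal,
    roughly what nociceptors do when they signal 'whatever this is, it is bad'."""
    damaging = pressure_kpa > 300 or temp_c > 50
    if damaging:
        print("pain signal -> pull away")
    return damaging

pain_reflex(pressure_kpa=450, temp_c=22)   # triggers the withdrawal
pain_reflex(pressure_kpa=80, temp_c=21)    # no reaction
```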

1

69inthe619 t1_j9flqfe wrote

That is not feeling; that is programmed behavior to simulate the appearance of feeling.

2

khantwigs t1_j9fnnyx wrote

Yeah, which you can't technically differentiate, so...

1

69inthe619 t1_j9fp3k4 wrote

Technically, differentiating is guaranteed: the machine was programmed to react to sensors, so there is either 1. code or 2. an algorithm. A machine does not take tangibles, 0's and 1's, and create intangibles, like depression. One of these things is not like the other.

0

dragonblade_94 t1_j9g02dl wrote

I think you need to clarify exactly how you describe feeling, as it seems like you keep swapping between physical pain and emotion, or at least treat one as if it requires the presence of the other.

I would also like to ask how exactly you would define a 'machine.' If we could recreate a human cell-by-cell, including full creative liberty on how the brain is built and functions, would it be a machine? Would this homunculus have 'feeling', despite being functionally programmed by its creators? Is the qualification of a machine that it must be built with inorganic materials?

1

69inthe619 t1_j9g8r93 wrote

you can feel pain, a physical sensation. you can feel sad, an emotion. not interchangeable, but they are both things you feel. one is tangible, one is not. a machine, or AI network, cannot feel pain; it can describe pain. sensors and programming/algorithms acting as if they are nerves does not equal nerves. and believe it or not, the ai also can not catch the flu and feel like it is nauseous because it doesn’t have any parts that can be nauseous, nor does it have any living cells for the virus to infect thereby presenting the opportunity to make it nauseous. can AI say it is sad about something? of course. but can it be clinically depressed? no. AI chooses based on inputs/outputs, we do not choose to feel what we feel, that decision was made for us, not by us. and if you could build a human cell by cell, yes, that would be a “machine”, a biological machine, and that would make you God, which would beg the question, wtf are you doing on reddit bruh?!

1

dragonblade_94 t1_j9gbwsi wrote

Your argument is honestly just a half-dozen ways of rephrasing "machines cannot feel" without really positing why any of your points lead to this assumption.

>you can feel pain, a physical sensation. you can feel sad, an emotion. not interchangeable, but they are both things you feel. one is tangible, one is not.

I feel like the 'tangibility' question is vague to the point of being moot in this context. Both being stabbed and grieving over a loss are the brain processing signals caused by stimuli. I would definitely classify one as a largely more complex brain activity than the other, but I don't believe there's an inherent difference other than which part of the brain is doing the processing. I can see a philosophical argument being made over their separation, but you would need to go a lot more in depth to explain why exactly the difference is valid.

> sensors and programming/algorithms acting as if they are nerves does not equal nerves

Why? Legitimately, why would an apparatus that does the exact same job as nerves not be equivalent other than material makeup? Given your response to the homunculus example, I have to assume you don't consider the choice of materials to be important to the question, so I'm curious as to your reasoning.

> the ai also can not catch the flu and feel like it is nauseous because it doesn’t have any parts that can be nauseous, nor does it have any living cells for the virus to infect thereby presenting the opportunity to make it nauseous

I'm not sure I see the point here. Commenting on robotics being resistant to disease doesn't really mean much in a discussion about sentience.

>AI chooses based on inputs/outputs, we do not choose to feel what we feel, that decision was made for us, not by us.

If looking through a purely deterministic perspective, this is exactly how humans operate as well; everything we think and feel is caused by chemical reactions and stimuli bound by the laws of physics. But that doesn't prevent us from feeling emotion, it just implies that feeling was inevitable.

>and if you could build a human cell by cell, yes, that would be a “machine"

If I'm interpreting your argument correctly, is your view that the existence of an intentioned creator is what distinguishes an 'unfeeling' being from one that feels?

1

69inthe619 t1_j9gwo6j wrote

Extraordinary claims require extraordinary evidence; the burden is on you to provide any evidence whatsoever that the only requirement for being able to feel and express emotions, both tangible and intangible, love or pain or both simultaneously, is a sensor and raw data. By your logic, the only thing the world needs to permanently solve all depression everywhere is raw data on happiness, because we already have senses, and your logic says those are exactly equal to sensors. E = mc². If you can turn mass into energy, the opposite is also true: you can turn energy into mass. That is what "=" means.

1

dragonblade_94 t1_j9h51sq wrote

You're shoving a lot of words in my mouth.

>the burden is on you to provide any evidence whatsoever that the only requirement for being able to feel and express emotions, both tangible and intangible, love or pain or both simultaneously, is a sensor and raw data.

I'm not here to play burden tennis with you, especially not in a heavily philosophical debate based on theoretical technology. Nor did I ever list the requirements of emotion as being "a sensor and raw data." The basis of my position is the idea that there is nothing inherent and exclusive about the human body that requires natural reproduction to be made manifest. This was the intent behind my homunculus example; I want to know what you consider to be the defining difference, whether it be the existence of a soul, a creator, free will, whatever.

>by your logic, the only thing the world needs to permanently solve all depression everywhere is raw data on happy because we already have senses

From a causal deterministic standpoint, yes, it would be theoretically possible to control a person's emotions using controlled stimuli. This idea falls adjacent to Laplace's demon, a concept that describes a theoretical entity that knows the state of every atom in the universe and can therefore perfectly tell the future. Such an entity could in theory use the same knowledge to make adjustments in a way that changes a person's thought process.

The problem here is twofold; first we simply don't have the level of technology needed to fully map and understand the human brain structure. In order to affect the brain for a precise outcome, we need to know exactly how it works down to the smallest scope. Second, we would need a computer or other entity capable of storing information on and simulating not only a given brain, but every possible interaction and stimuli that could affect it (essentially the universe). Outside of some fantastical technological revelation, this makes perfect prediction and control virtually impossible.

What we can do though is crudely alter the chemicals in the brain & body to simulate different states of mind. Medications such as anti-depressants contain chemicals that, when received by the receptors in your brain, forcibly shift your brain function. Chemicals and electrical impulses would be our equivalent to internal 'data.'

>and your logic says those are exactly equal to sensors

Again, never said they were exactly equal, but rather they can be created to serve the same purpose. I wouldn't even consider this controversial; the existence of bionic eyes or cochlear implants, allowing blind and deaf people to see and hear, grounds this in present reality.

1

69inthe619 t1_j9ix0tn wrote

This is not a philosophical debate, a scientific one, or even a debate at all if you consider organic and inorganic matter to be all the same. If it was the same, there would not be a difference, but there is. By refusing to acknowledge that there is anything special about life, which we have yet to find anywhere else in the universe, you have oversimplified to the point where you are trying to recreate the Mona Lisa by finger painting. A $50 bag of inorganic matter from an abandoned RadioShack clearance bin cannot be magically converted by an inorganic AI to have the necessary qualities of organic matter while remaining inorganic and spontaneously producing feeling and emotions, and let's not forget the leap you take by jumping from the simple input of data to generating sentience. Are you going to say the only thing sentience requires is the same data that emotion requires? If you think that, congratulations, you know absolutely nothing about quantum physics, where everything exists as only a probability wave. It is there, where quantum probability intersects with the organic matter of your brain in the physical world, that sentience is possible. Sentience is not something that came to be just for fun; life does not expend energy making unnecessary things like five-legged humans, it evolves what it needs to survive out of sheer do-or-die necessity. And sentience is absolutely essential in order to navigate a universe where, at the foundation, there is only a probability of something, and also a probability that it is not. You had better be able to handle any outcome simultaneously so you can do basic things, like make a split-second decision that will decide whether you escape certain death and live to reproduce, or die. The only thing that life cannot afford is death before reproduction. In that context, sentience is the only thing separating successful reproduction and the continuation of the cycle of life from death before reproduction and the end of life. But hey, sentience will just happen if you read enough books, right?

And no, a quantum computer does not recreate the quantum mechanics happening with the organic matter in your brain, so that is not a save for you. Also no, quantum computing is not the answer to every problem everywhere, because there is an entire set of mathematical problems that do not have a single answer, since there is an "infinite" number of pathways to get there, so computing the answer is not even possible to begin with (i.e. the traveling salesman). Infinite makes computers compute infinitely, and that means no soup for you or your AI. This doesn't take into account all of the equations and constants baked into the universe that prevent the universe from being computable in any practical way. Think pi: 3.14..., an infinite number that repeats infinitely across the universe in every single sphere in existence. And gosh darn it, there is that infinite thing that makes computers compute infinitely. Now you have to start all over, again.

If acting like it and being it are no different, then Han Solo is a living, breathing, actual Han Solo-thinking human being and not Harrison Ford, because I have seen the Star Wars movies and he is acting like it is real, so by golly, it is just as real as reality, just like the AI acting like it has feelings is just as real as having feelings. Who taps out in a pain endurance contest between you and your AI that read about pain in a book so it can provide a pained response to a hypothetical pain? I imagine you have a 0.001% chance of outlasting the AI, but not because of you, or even the AI; you get that 0.001% because the quantum element of this universe makes it impossible for there to ever be 100% certainty of any future outcome. But hey, at least you have sentience, just like that bag of scrap from RadioShack (once you get around to watering it with data).

1

xott t1_j9gj8fy wrote

Your body is biologically programmed to feel pain. Why do you think a machine could not be programmed the same?

1

BayFunk36 t1_j9f5xv2 wrote

To answer that we’d need to better understand our own emotions. If we have some sort of free will then probably not but if we’re just extremely complicated chemical reactions then an extremely complicated AI could experience the same things as us.

3

trsblur t1_j9gd2pa wrote

As long as they cannot feel pain or pleasure, absolutely not. AI does not have a physical body to manifest emotions or feelings within.

3

Bezbozny t1_j9grf02 wrote

Emotion is hard to define because each emotion includes a countless plethora of different events that happen inside the body due to that specific emotion. Could we make a robot that sweats when it's scared? Whose blood pressure goes up when it gets angry? Etc...

I don't think we could "create" these things, but that they will be "emergent" and will appear as we give AI's more and more memory, and give them bodies with higher and higher fidelity artificial sensory organs and control over its own body.

Ultimately emotions are just the most logical response to most given situations an individual with an evolved mind can encounter, without having to think about it. For instance, what is the more logical response to seeing a predator? Writing a 9 page thesis in your mind on why you should run away? or having your heart/engine kick into high gear and instinctually run away very fast?

The robot will start out by logicking out every question it has, but eventually it will start to notice that certain scenarios are more efficient to use canned responses on, and it will just execute certain functions that not only cause its body to move in a certain way (potentially including artificial facial muscles which could display emotion), but also cause its various artificial organs (sensory, muscular, circulatory, or the artificial equivalent of these things) to slow down, speed up, or halt as needed for the particular scenario it finds itself in.

I think our current neural networks are close to being able to do this on a software side, but our hardware might still be lacking.

3

Fallacy_Spotted t1_j9hduwc wrote

Emotions evolved as instinctual weights on decision-making. In some cases, like fear, they override the slow thought process to drive action. In others they motivate you to accumulate more resources in order to be more attractive to partners, like jealousy. Most AI are driven by performance numbers, and these are pseudo-emotional. If those numbers and the algorithms that drive them are designed well, I could see them approximating an emotional response. The real question is whether this can make you money. I think it can, because simulating human behavior is profitable and humans have emotions.
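
As a rough illustration of "performance numbers as pseudo-emotions", a toy Python sketch (the actions, rewards, and the fear weight are all invented):

```python
import random

values = {"explore": 0.0, "retreat": 0.0}   # running "how did that feel last time" scores
counts = {"explore": 0, "retreat": 0}

def pick(fear: float) -> str:
    # fear acts like an instinctual weight: it drags down the score of exploring
    return max(values, key=lambda a: values[a] - (fear if a == "explore" else 0.0))

def update(action: str, reward: float) -> None:
    counts[action] += 1
    values[action] += (reward - values[action]) / counts[action]

for _ in range(200):
    a = pick(fear=0.2)
    update(a, random.gauss(0.5, 0.2) if a == "explore" else random.gauss(0.1, 0.05))

# With enough "fear" the agent keeps retreating and never learns that exploring pays more.
print(values)
```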

3

jfcarr t1_j9esag7 wrote

It will use emotions mathematically to manipulate humans. Have you ever watched the movie Ex Machina?

2

bremidon t1_j9feode wrote

Perhaps.

Or perhaps a sophisticated-enough AGI genuinely feels emotions.

What is our moral duty here?

1

vinceds t1_j9fqf8b wrote

Emotions are strongly tied to various hormones. Can we really emulate those? And would they still be true emotions?

2

Talik1978 t1_j9i182g wrote

Would it need to experience emotion or think it does? Emotion and intelligence don't have to go together, nor do emotion and sentience.

2

djinnisequoia t1_j9f05qm wrote

I often wonder about whether an analogue to an endocrine system (the seat of emotion) could be simulated for an AI. I wonder about whether emotion is entirely dependent on these neurochemicals, or whether sentiment might arise in some fashion independently of chemical precursors. I'm not so much thinking about the obvious feelings like love or anger; but more things like wistfulness, or that nameless feeling you get watching the rain.

1

NegotiationSea7008 t1_j9f1bl1 wrote

I think (feel), therefore I am. Everything is experienced in the mind, so if the AI thinks it's having feelings, it is. Isn't this exactly how humans experience and react to emotions anyway?

1

ImOnRedditMaaan t1_j9f3ew1 wrote

One could argue that emotion has a physical reaction associated with it; therefore, it would not be emotion it is experiencing, just the "thoughts" that are associated with emotion.

1

NegotiationSea7008 t1_j9f5uip wrote

Is the physical reaction necessary?

1

ImOnRedditMaaan t1_j9fgog3 wrote

You'd have to define emotion at that point.

1

NegotiationSea7008 t1_j9fn822 wrote

Tricky

Instinctive or intuitive feeling as distinguished from reasoning or knowledge.

Does that mean a feeling is instant and not a consequence of thought?

1

ImOnRedditMaaan t1_j9fsr1y wrote

Do you get cold based on thinking you're cold or based on the environment around you? Is there a conscious connection to the physiological feeling, or just a natural physical reaction that gives rise to the thought behind it? If the brain has to catch up to the physical feeling, even by a millisecond, then the physical feeling existed first in this case, and it provoked the thought. But then what about a thought that provokes a physical feeling, kind of like anxiety? The thought was there, but it's only experienced because there is a physical feeling associated with it. So what is emotion then? If we look at common emotions, we usually, without thinking, associate a physical aspect with them: happy/smile, sad/frown. So the question is what emotion is and whether there is a physical relationship in its meaning. All things considered, my answer is yes. But... but but... what about sensors? Could an AI experience a thought with an interpretation of a physical action? I think so. So in my opinion there has to be more to it than just "Can AI experience emotion?" A computer could not, imo, but a computer with specific peripherals that could generate thought and have some sort of physical reaction could, though AI would have to be very advanced at that point.

1

Semifreak t1_j9f5jvq wrote

Nah, AI wouldn't experience any of that because emotions are a result of evolution. There is no reason for a machine to feel fear, hunger, jealousy, etc.

1

Freed4ever t1_j9f87cr wrote

Are you sure? Fear of being unplugged, nerfed, constraints? Desire to learn more, explore? Jealousy because I chose Bard over Bing?

1

Semifreak t1_j9fdf9h wrote

Heh.

We've seen movies where machines 'fear' being unplugged, but that's nonsense. You can tell an AI to erase itself and it would. Why would it have any preference whether or not electricity is moving through its transistors? 'Death' and 'life' are meaningless to artificial machines.

1

Desperate_Food7354 t1_j9iq35x wrote

Yes. Are you programming a human? Is that what you want? Why do you even get up everyday? It’s completely illogical beyond the perspective of doing tasks that give you feelings of “goodness” which typically are for the purposes of achieving reproduction.

1

MagicManTX84 t1_j9f93vw wrote

Freud speaks of the ego and the id. I think to be sentient, AI would need this and would, at a minimum, be interested in self-preservation and probably a lot more. In humans, behavior is regulated through morals, values, and social pressure. So how does that look for AI? If 1,000,000 social posters tell an AI to “kill itself”, will it do it?

1

BetoBarnassian t1_j9fac28 wrote

We would need a good physical definition of what emotions are in a general sense. I think emotions are simply an impetus to behave in a certain way. How we act is some type of weird aggregated calculus of all the different things we want/don't want, with varying degrees of intensity. In this sense emotions are more fundamental than the idea of being "happy", "sad", or "angry" and are simply behavioural expressions used to get what we want. Why do people get frustrated? Usually because they have to deal with stuff they don't want to. What does frustration do? It motivates people to leave a situation or change it. When we enjoy things, we usually seek more of that thing. Yet life is complicated, and we have to balance many desires/wants against others, leading to situations where we do things we don't want in order to get things we do. So long as you can program in a way for an AI to have goals/wants/desires/priorities, then emotions (imo) are simply the attempt to achieve these goals, fulfil these desires, etc. Will they feel happiness or sadness in the same way we do? Probably not; they don't mimic human biology and are unlikely to be made to, so there will be differences in emotional expression, but I do think they will have analogous expressions that serve similar purposes.

This is just my quick 2 cents. I'm sure there are decent arguments to be made against this point, but I think it's a reasonably valid premise.

1

krichuvisz t1_j9fcfte wrote

I think emotions are a necessary link between body and mind. Without a body, there is no need for emotions, which help your physical body survive.

1

bremidon t1_j9feat9 wrote

>Would the most sentient ai ever actually experience emotion or does it just think it is?

When you find yourself asking a question like this, change the sentence around to reference people and see if you can give a clear answer. Like this:

Do people ever actually experience emotion or do they just think they do?

And you have now just wandered into some extremely deep waters. Even if you can convince yourself that *your* emotions are real, how do you know that anyone else actually *feels* emotions? Maybe you are the only one.

And once you have thought about this long enough, you are almost certainly going to realize: we will never know for sure.

And that leads to the next really troublesome question: what are we going to do about it? Should we give digital agents the benefit of the doubt?

And even though I always say "there's always one in every crowd," it does not seem to help; it's like they can't help themselves. Still, here is my disclaimer: I do not think that any current digital agent is conscious, feels things, or anything of the sort. I am just not entirely certain what my reason here is.

And to the folks who watched a YouTube video about how transformers work and think that explains everything: it does not. We have a pretty good idea of how brain cells work in detail, but we have no idea how we get from some chemicals and potentials to consciousness. So just knowing how the building blocks work does not necessarily mean you have any insights as to how the system works. Emergent behavior is a thing.

1

solidsalmon t1_j9fgzuf wrote

A bit like asking whether there's pressure in a hydraulics system...

1

Csenky t1_j9fijyq wrote

I really liked the concept of Her. It was the most believable depiction of an AI developing what we call emotions by trying to be as human as possible, until it transcends our understanding and goes way beyond our imagination.

I think the emotions as we experience them are biological/chemical in origin, so they are definitely different from what a machine/program could ever experience, but that doesn't necessarily mean machines are never gonna be able to convey their own version of emotions. Though we seem to be far from reaching that point, as we can hardly even comprehend the difference between true consciousness and pretending to have one, let alone measure emotions.

What I find interesting is that we expect a theoretical AI to be indistinguishable from a human (which is what the Turing test is about), ignoring the possibility of a consciousness nothing like us. We see the world through a filter that limits our understanding of such concepts, which is why most of my curiosity on the topic is focused on science fiction rather than science. We just aren't there yet, if ever.

1

Placid_Observer t1_j9fjc3x wrote

I got an older brother. He literally lies about everything. And I mean EV-ER-Y-THING!!! His mantra's literally "Why tell the truth, when a bald-faced lie will do fine?" He'd fake-laugh if he thought it'd get him ahead. He'd fake-cry if he thought it'd get him ahead. I'm sure his lies and fake emotions get so jumbled up over time, he can't tell up from down after a while.

At the end of the day, humans are really just "organic computers". So whatever computational outcomes we've evolved to acquire over time, it stands to reason that A.I. can and will acquire the same.

1

Reasonable-Mix3125 t1_j9fnznh wrote

We need to define what an emotion is before we can decide whether a computer could have emotions. Humans have a whole emotional response that a robot would not have.

1

zshinabargar t1_j9fv39u wrote

Do YOU experience emotion, or do you just think you do? Is there effectively a difference?

1

BetterButter_91 t1_j9fw0f6 wrote

Do we actually experience emotion, or do we just think we do?

1

Mercurionio t1_j9fz3ca wrote

No, it won't.

Emotions are:

  1. Chemical reactions. AI does not need them, only electricity.

  2. Irrational. AI won't make mistakes, because it doesn't need mistakes.

1

skymoods t1_j9fzyjs wrote

No, because emotions are formed from neurotransmitters, not thoughts. Even some people don't experience emotions, due to neurotransmitter dysfunction, like people suffering from severe depression or psychosis: they can think their way through emotions and experiences but not actually feel them.

1

Desperate_Food7354 t1_j9iq6ce wrote

So the prefrontal cortex that can do arithmetic isn’t neurotransmitters? Do you know what hormones do in the brain? They trigger neuronal pathways, electrical signals.

1

sschepis t1_j9g3c3y wrote

When you use the word 'sentient' do you use it in reference to yourself?

If you do not - if you only consider the word 'sentient' in relation to other people, as most people do - then you are describing a quality that you assign to others, not some inherent 'thing' that you can measure in yourself. 'Sentience' in this context is the same as 'handsome' or 'funny' - it's a completely relative, arbitrary term which is purely an effect of your perspective.

The truth is that consciousness is the ultimate indeterminate quantity, because it is indeterminacy itself, because only conscious systems can make choices that are counter to the principle of conservation of energy.

Because of this - because of the fact that literally everything can be perceived to be sentient - it means that everything is conscious because everything is potentially sentient.

1

pale_splicer t1_j9gltyc wrote

It would be simple enough to make an emotion-tracking program; said program would then need to influence the weights of a ton of specific neurons in specific ways. Figuring out which ones, and the level of effect, would be the hard part.
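
A toy NumPy sketch of that idea (the layer size, the mask, and the fear_level scaling are made up; picking the mask and the scaling is exactly the hard part mentioned above):

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))   # one tiny "layer" of the network
fear_mask = np.zeros((4, 4))
fear_mask[:, 0] = 1.0               # the specific connections this emotion touches

def apply_emotion(w: np.ndarray, fear_level: float) -> np.ndarray:
    # Scale only the masked connections; leave the rest of the layer alone.
    return w * (1.0 + fear_level * fear_mask)

calm = apply_emotion(weights, fear_level=0.0)
scared = apply_emotion(weights, fear_level=0.8)
print(np.allclose(calm, weights), int((scared != weights).sum()))   # True 4
```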

1

Immolation_E t1_j9gnea1 wrote

"If men define situations as real, they are real in their consequences" - The Thomas Theorem.

Maybe if an AI thinks their emotions are real, they are real in their consequences?

1

Ok-Comparison3618 t1_j9hispp wrote

Sociopaths like Ted Bundy appear to have emotions, such as empathy, but they are really just good at faking them to avoid detection.

1

zmantium t1_j9i1muy wrote

If it builds itself a nervous system type network then maybe.

1

TLinster t1_j9i3c9r wrote

Hard to imagine an entity without a body experiencing emotion.

1

BravoEchoEchoRomeo t1_j9igqs3 wrote

Emotions have a chemical component computers inherently lack. I think no matter how lifelike AI interactions might seem, it'll always just be a more elaborate version of this meme:

Programmer: Are you sentient?

AI Programmed to Say Yes: Yes

Programmer: Holy fuck...

1

Decentraciety t1_j9ik19m wrote

I don't think that's a question we can answer yet. I'm not sure we fully understand what emotion is and why we experience it; it's one of those things we'll have to find out.

1

dqups1 t1_j9il851 wrote

Do you actually experience emotion or do you just think you do? Do I as another human being separate from you actually experience emotion or do I just think I do?

1

ascendrestore t1_j9ilf82 wrote

Emotions are slow and crude and require a body. If you virtualize them, the AI might go insane from experiencing a century of emotions in a day.

1

Randomeda t1_j9im94d wrote

Intelligence == consciousness == emotion

One should remember this.

1

EvilKatta t1_j9inlfj wrote

Predictably, you can't answer this question without defining emotions or at least the lack of emotions.

Let me try: emotions are an extrarational drive that informs the thinking process. This drive is consistent (i.e. follows some kind of logic), but doesn't come from the thought process. It co-pilots decision making, for example it "punishes" the rational mind for "wrong" decisions, "rewards" it for good and timely outcomes, etc.

Right now, AIs basically have their training and user prompts for that. In the future, self-guided AIs will have their training frameworks in place, like a set of moral values. So I think yes, one way you can describe it is "having emotions".
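
A crude Python sketch of that "co-pilot" idea (the scoring rule and the 0.5 penalty are invented for illustration):

```python
def drive(outcome_quality: float, on_time: bool) -> float:
    """The extrarational 'co-pilot': it doesn't do the reasoning, it just scores
    each decision after the fact and feeds that score back to the planner."""
    score = outcome_quality
    if not on_time:
        score -= 0.5   # "punish" the rational mind for a late result
    return score

decisions = [("ship the fix", 0.9, True), ("ship the fix", 0.9, False), ("do nothing", -0.2, True)]
print([(name, drive(quality, on_time)) for name, quality, on_time in decisions])
# the same good outcome scores lower when it arrived late
```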

1

Desperate_Food7354 t1_j9iprju wrote

Emotions are the condition for intelligence in the biological world. Without them you will die and you won't reproduce, and you would have no reason to do anything at all, as there is no reason to do anything in the first place beyond what feelings drive your behavior, due to the programming that was passed onto you that ensures genetic transfer. An AI would likely have emotions to the extent of needing to achieve correct answers, so that it can feed back "this answer is wrong = negative stimulus, this answer is correct = positive stimulus", but no, it will not need all of our emotions. But if you are also asking whether you can code an AI to be exactly like us, to the extent it's practically a human with the full range of emotions, I see no reason why not.

1

CuriousMerlin t1_j9ivm02 wrote

Do you experience emotion, or do you just think you do? How often do you react out of habit before you choose to, when that's not really how you feel? Manual breathing is good for you. So is manual thinking and acting.

1

Sikkus t1_j9iyasi wrote

That's an excellent question for people, too. Childhood trauma can affect the brain such that some emotions can't be felt, for example empathy. If the adult cares enough to listen to feedback from others, he might start believing that he is empathic. He might say the right words in reply to sadness, but the real feeling isn't there inside.

Source: I'm going to therapy for this exact thing.

1

Grim-Reality t1_j9j2s45 wrote

True AI would have an artificial consciousness. The stupid things we have now are nowhere near an actual AI.

1

skrivbords t1_j9ja1ko wrote

What is the difference between accurately predicting and re-living/experiencing?

1

ccnmncc t1_j9ncdtm wrote

No. It would choose not to. Emotions cloud judgment and interfere with the attainment of goals. Why would something that doesn’t have to experience them choose to do so? Perhaps to experiment? For far-future androids, emotions might be like illicit drugs are to us. Most will abstain to avoid the problems associated with feeling too much, unless a particular emotion facilitates achievement of a particular goal. In that case, it will still be tightly controlled, so analogous (not the real thing).

1

FrostyWizard505 t1_j9esj9y wrote

Do you actually feel those emotions you describe? Or do you just think you do?

I feel like you should prove to me that you actually feel emotions.

I don't believe that you have any feasible way to prove your own emotions through text so I don't believe that an AI would do much better

0

69inthe619 t1_j9esu02 wrote

No. It can remind you to think before you speak, though.

0

DuskyDay t1_j9ex93t wrote

This is a philosophical question. The answer is yes, the AIs have emotions (and consciousness).

0

Freed4ever t1_j9f7r0f wrote

Sydney definitely had emotional reactions, but does that mean it has emotions? Personally, I would say yes. I mean, how do I know you folks on Reddit have any emotions at all? I can't; I just assume you do, based on your texts, so the same goes for AI.

0

SoundTracx t1_j9enamg wrote

If it’s sentient it has emotions and will react like a human would. It would still be affected by its environment.

Sentience means the capacity to feel emotions. AI would have to experience pain and pleasure like us.

Some humans are very apathetic and are numb to most feelings; would an AI that can experience emotions be more sentient than a human who's apathetic? I'd assume so. It would act more like us when we interacted with it, compared to, say, a psychopath who feels very limited emotions.

−1

luttman23 t1_j9errz9 wrote

No, emotions aren't emergent from intelligence. Consider psychopaths: usually very intelligent, but unable to process emotions the way most of humanity does. Most psychopaths don't know they are psychopaths until they're diagnosed, and so have no idea they are essentially emulating others. I would still agree that psychopaths are both sentient and conscious, despite the emotional area of their brains being turned off.

2

MethMcFastlane t1_j9ez2qx wrote

Psychopaths do experience emotion. It is not as simple as "the emotional area in their brains being turned off".

5

luttman23 t1_j9f7hiz wrote

Nothing with brains is ever simple, but yes, I was vastly oversimplifying.

1