
Lawjarp2 t1_it6yud5 wrote

There is nothing to be gained by simulating slightly evolved, lucky apes. We are unlikely to be in a simulation because a post-singularity civilization, which will eventually go post-biological, will have no chemical emotions.

Edit: it's the ants, man. We need to study the ants via a simulation of every fundamental particle. We need to convince all the governments to spend most of our energy studying ants via simulation.

What? You think it's easier to just create a small colony in an artificial environment? Or even an entire planet? What makes you say such a 'preposterous' thing, bwana?

−3

SFTExP t1_it7dqg0 wrote

I wish people would debate instead of downvoting. It's so lazy, and you've offered an interesting conjecture.

7

Plouw t1_it7frbr wrote

What makes you assume we have any idea what motives a post-singularity civilization has? It might be that they are not interested in what 'chemical emotions' provide; it might be the opposite. A motive could be to learn by experiencing all aspects of reality. A motive could also be pure entertainment. We do not know.

4

Lawjarp2 t1_it8d7jy wrote

Experiences, in the way you imply, derive their value from the saturation of neurons through overstimulation by the same stimulus. Hence we crave new experiences. Why would something so fundamental to brains, something arising out of the physical properties of a biological organ, be relevant in a non-biological world? The same goes for entertainment.

1

Plouw t1_itkqhaz wrote

>Why would something so fundamental to brains, something arising out of the physical properties of a biological organ, be relevant in a non-biological world?

We do not know, because we haven't seen the non-biological world in anything but a very early stage.

The issue is that you are assuming the function is based on something biological, and not the other way around; that evolution built this function through something biological because the function has an intellectual (or other) benefit. Maybe it is not inherent to biological brains/intelligence but to intelligence, biological or not. Do we feel because of biological processes, or do we feel because feeling has a functional purpose and biological evolution built processes to make us feel?

It feels off to attribute this to biology alone merely because you have only seen it biologically, as if you're ignoring the black swan.

1

Lawjarp2 t1_itkratm wrote

Necessity is what I'm arguing about. Feelings like the ones primates have are relevant only in the context of primates. Apes thinking evolved humans will simulate them to learn about them is just as stupid as humans thinking post-humans would. One, because it's unnecessary: it's easier to create a literal physical zoo. Two, we don't waste a lot of energy doing ape simulations precise to the quantum level because it's energetically expensive.

1

Plouw t1_itks351 wrote

>Feelings like the ones primates have are relevant only in the context of primates

Why?

> It's unnecessary: it's easier to create a literal physical zoo

A physical zoo does not replace studying them in the wild.

>We don't waste a lot of energy doing ape simulations precise to the quantum level because it's energetically expensive.

Yet.

1

Lawjarp2 t1_itktgsl wrote

Because those emotions were created for a social environment with similar beings.

A physical zoo can be as big as a reserve or even a planet. Terraforming is still cheaper than planet-scale simulation.

You clearly underestimate how energy intensive full quantum level simulations are.
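To give a rough sense of scale, here is a toy Python sketch (purely illustrative, showing only the textbook exponential blow-up of brute-force state-vector simulation, and assuming each amplitude is stored as a double-precision complex number):

```python
# Toy illustration: memory needed just to STORE the quantum state of n
# interacting two-level particles (qubits) in a brute-force simulation.
# A full state vector has 2**n complex amplitudes; at 16 bytes each
# (two 64-bit floats), the cost is exponential before a single time
# step is ever computed.

def statevector_bytes(n: int) -> int:
    """Bytes required to hold the full state vector of n qubits."""
    return 16 * (2 ** n)

for n in (10, 50, 100, 300):
    print(f"{n:>3} particles -> {statevector_bytes(n):.3e} bytes")
```

By roughly 300 particles the amplitude count alone (~10^90) exceeds the estimated ~10^80 atoms in the observable universe, and a single ant, never mind an ape society, contains unimaginably more particles than that.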

1

Plouw t1_ityyc2q wrote

>Because those emotions were created for a social environment with similar beings.

So maybe, to research a creature better, it would be beneficial to experience its emotions.

>A physical zoo can be as big as reserve or even a planet. Terraforming is still cheaper than planet simulation.

If we are a simulation, you have no idea what is cheap or expensive in the world that is running it.

>You clearly underestimate how energy intensive full quantum level simulations are.

You seem too confident in your ability to predict the motivations of something you have little to no experience with.

1

Kawawaymog t1_it76zpj wrote

Why not? Emotions are fun. If anything, I expect post-biological organisms to have vastly more emotional capability, not less.

3

alisaxoxo t1_it7pavd wrote

For what purpose? When pleasure is freely accessible, why would you want or need new emotions? They influence your state of mind and behavior in weird ways. Unless you'd get pleasure simply from experiencing new things, I don't really see any reason for wanting more emotions rather than just expanding on what we already have. Emotions such as anger, jealousy, rage, sadness, loneliness, and boredom are remnants of more primitive beings, IMO, but we're stuck with them because they were useful to our ancestors. Genuinely interested in what you have to say. I understand that emotions also play a huge role in our ability to coexist, so I'm not dismissing the idea at all.

5

Kawawaymog t1_it83xz7 wrote

The cosmos viewed without emotion is nothing but numbers. There is no drive or desire to do anything for a purely logical being. Consider that even the desire for self-preservation is ultimately emotional. I would argue that an AI without any emotion would be an AI without any desire to do anything. You could even make a case that without some sort of desire for autonomy it isn't even alive. Emotions give us purpose, desire, drive, etc. They are our evolved software drivers. Without them we don't have a reason to be autonomous; they are the drivers of life. The idea that a super AI would be devoid of them is bizarre to me. They may be vastly different from our own, and in fact a super AI would probably be able to experience a vastly more complex array of drivers/emotions.

A post-singularity super AI would need emotions in order to have a purpose for its existence. After the job is done, that is, collecting stars' or even galaxies' worth of raw material to secure its resources and ensure its self-preservation for as long as possible, what else would there be to do in the trillions of trillions of years before heat death, other than seek out new experiences? If you were said super AI with a trillion trillion trillion years to kill, and the ability to experience anything that could be simulated by a computer, what would you do?

Pleasure might be the sugar or the salt of life, but pain, anger, sadness, etc. are just as worth experiencing. Without pain, pleasure would lose its meaning. If we were driven only to seek pleasure, then every human alive today would be on a morphine drip, and no one would bother hiking up a mountain for a dopamine high.

2

Lawjarp2 t1_it8f2us wrote

Everything is derived from survival needs.

(1) Pain is the most useless. You could have sensors that serve the same function without overloading your brain. I would say it's better to handle it logically.

(2) Fear is nearly the same but much worse, leading to multiple failure modes like anxiety, PTSD, and trauma. It's good to have when you are limited in brainpower and need to focus. Not needed for a superintelligent being.

(3) Anger, rage, and vengeance are simply animal behaviours intended to help one survive and thrive in a society filled with other less intelligent but useful (food) beings.

(4) Sadness, a sense of fairness, and empathy are necessary for social living in a group of biological beings. They don't need to exist outside of that.

The need for survival is the only thing truly needed. Everything else gets created around it.

2

Kawawaymog t1_it8htvd wrote

Emotions are just our programmed behaviour; an AI could have any of them, none of them, all of them, or completely different ones. My point is that they remain important for an AI if it is to be autonomous: to have autonomy is to have desires and personal goals. Without any innate drives or motives, an AI would have no reason to do anything.

1

Lawjarp2 t1_it8iz75 wrote

Yes, I agree with that. But my point is they won't be anything like what we have, and only one is absolutely needed for everything else to emerge: survival.

Simulating an entire universe is a terrible way to experience anything. Most people explain away the need for it through things that are very 'human' and don't consider that it's not essential.

1

Kawawaymog t1_it8lp7a wrote

Well, for one thing, if this is a simulation there is no reason for the whole universe to actually be simulated, only the parts we are looking at. Even modern game designers are able to get around that one.
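As a rough illustration of that trick, here is a toy Python sketch of lazily generated, observer-dependent 'world chunks' (everything here, the `World` class, `observe`, the seed scheme, is made up for illustration; real engines are far more involved):

```python
# Toy sketch of observer-dependent simulation: world regions are
# generated lazily, only when something looks at them.

import hashlib

class World:
    def __init__(self, seed: str):
        self.seed = seed
        self.materialized = {}  # chunk coords -> generated chunk state

    def _generate(self, coords):
        # Deterministic generation from the seed: an unobserved chunk is
        # indistinguishable from one that "existed all along".
        digest = hashlib.sha256(f"{self.seed}:{coords}".encode()).hexdigest()
        return {"coords": coords, "state": digest[:8]}

    def observe(self, coords):
        # Materialize a chunk only at the moment something looks at it.
        if coords not in self.materialized:
            self.materialized[coords] = self._generate(coords)
        return self.materialized[coords]

world = World(seed="universe-42")
print(world.observe((0, 0)))    # generated on demand, on first look
print(len(world.materialized))  # only the observed region exists so far
```

Because generation is deterministic from the seed, a chunk materialized on first observation looks exactly like one that was there all along, which is why simulating a universe needn't mean simulating every particle everywhere at once.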

It will have whatever emotions it is designed to have, that is, whatever its creators build it to have, or whatever it evolves to have. It will presumably be possible for it to change its own core programming. But it seems to me unlikely that it would desire to cut out parts of its 'self'. We don't need our limbic system, but we still want it. I think saying that a superintelligent AI wouldn't have things it doesn't need is very short-sighted.

In the infinity of time that a super AI would have available to it, I think it's a reasonable suggestion that at some point it would simulate just about everything that it is possible to simulate. You have to remember that such a being would be around for trillions of trillions of years. It would be possible for it to change its perception of time such that, from its own experience, it is as close to eternal as can be imagined. What else is there to do but run through all the possible things that could be?

1

BooksLoveTalksnIdeas t1_it8afbq wrote

If you examine biological animals in general, it's obvious that the more advanced a brain's intelligence, the more complex that animal's emotional system. It would therefore make sense that a more advanced biological civilization, with a more intelligent brain than ours, would have not merely more emotions but a more complex emotional system than ours. And even if that civilization or its "beings" were not biological, they would still understand that more complex system of emotions, even if they were no longer dependent on it for their existence. Food for thought and good sci-fi 😉👌

1

StarChild413 t1_ita64qd wrote

Why are you not working towards building an Experience Machine, then, since clearly we're not already there?

1

SirDidymus t1_it72s2s wrote

Where did you get that preposterous hypothesis? 🙂

2

ToastyRedApple t1_it7qtm9 wrote

You literally need emotions in order to exist, though… without fear you would starve to death or get killed by something stupid because you weren't concerned about it. Without pleasure you wouldn't feel a drive to do anything; you'd just sit around. I think it is probable that AI will experience these emotions too; they're probably a fundamental part of intelligence.

2

Lawjarp2 t1_it81vpt wrote

Fear is needed because we are limited in our mental capacity. Fear is what focuses our limited mental abilities on a task that is needed for survival. Fear is also not optimal; anxiety is literally our fear mechanism failing to work properly. But fear of death is nothing but the need to survive. That is the literal core of everything else. One could even say it's the only thing needed, and everything else is a proxy for it. If survival is all you need, is it wise to expend huge amounts of energy probing the experiences of an ape society?

1

Azu_Nite t1_it80qfn wrote

The simulation probably isn't about apes but more about simulating small particles and energy, with ONE interesting result being that slightly evolved ape.

We give ourselves too much importance, but we're not much different from a cat if we set aside our slightly more advanced brain.

2

Lawjarp2 t1_it8df64 wrote

How much energy are you willing to bet on something you could derive more efficiently through anything other than a massive, energy-hungry, inefficient simulation?

1

BooksLoveTalksnIdeas t1_it87jaw wrote

Intelligent living beings like watching other living beings living out their lives in their natural environments. We like zoos, aquariums, documentaries about animals in the jungle and in the ocean, etc. Does any of that help our own evolution and progress? Not really, but we still find it entertaining, and even interesting. If a "post-biological super-advanced civilization" wanted something similar (just for entertainment), it wouldn't be watching dogs and cats, or tigers and lions in a forest; it would watch more primitive intelligent civilizations that are still stuck on planets. This is good science-fiction material not because it doesn't make sense (it makes perfect sense, in fact) but because we, as the "observed animals", can't prove that this is the case unless the observers choose to make it known to us. However, what do we gain from telling the fish and the lion that we are watching them for scientific study and entertainment? Nothing. It might even complicate the observation. With a smart species, it could even end the observation, because they wouldn't behave the same way after knowing everything. Food for thought and for good sci-fi 😉👌

1

Lawjarp2 t1_it8ccx1 wrote

We are very close to apes in social behaviour. Non-biological beings won't be, or rather won't need to be. If you paid real attention to why we watch animals, you would know that:

(A) There are immense parallels between us and animals. More than there ever would be between humans and a post-biological society.

(B) The study of animals and plants actually helps us directly, not just by satisfying our curiosity. We find cures, genetic marvels, diseases, etc. in animals. What do non-biological entities have in common with humans? Intellect?

The difference between an ape and a human is negligible, in intellect and then some, compared to a superintelligence. They would rather study something closer to themselves.

1

Kawawaymog t1_it8helb wrote

The difference in intelligence between a human and a chimpanzee is not negligible. The human brain is absurdly large, three times the size of a chimpanzee brain, and insanely complex compared to anything else in nature. It is an evolutionary marvel.

1

Lawjarp2 t1_it8jhrh wrote

I mean the difference relative to a superintelligence. It's like comparing a 20 IQ ape (some are smarter) with a 100 IQ human and thinking we big smart.

To a 1,000,000 IQ superintelligence, ape and human are more similar than not.

NOTE: the IQ numbers are there to make a point, not to be taken literally.

1

Kawawaymog t1_it8mhtg wrote

Well, that's fair enough. But to the point of the first comment in this chain: it's my opinion that a superintelligent AI would take immense interest in humankind, in a similar way to how we take immense interest in the first single-celled organisms. They are many orders of magnitude simpler than us and had no intelligence at all, but they have immense importance to us. I think humankind will be similarly thought of by a super AI. And just as we run simulations of microbes, an AI might run simulations of us.

1