ItIsIThePope

ItIsIThePope t1_jedsrf6 wrote

Yes, but you might get an AI overlord in the form of a KFC bucket instead of a much cooler humanoid Vishnu titan running around solving problems, but you do you

2

ItIsIThePope t1_jedrsmk wrote

Well, that's why AGI is a cornerstone for ASI: if we can get to AGI, an AI capable of human-level intelligence but with far superior processing power and thinking resources in general, it would essentially advance itself to become super-intelligent.

Just as expert humans continuously learn and get smarter through knowledge gathering (the scientific method, etc.), an AI would learn, experiment, and learn some more, only this time at a far, far greater rate and efficiency

Humans now are smarter than humans then because of our quest for knowledge and the methods we developed for acquiring it; AGI will adhere to the same principles but boost progress exponentially
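To illustrate that last point, here is a toy sketch with made-up numbers (assuming, purely for illustration, that the AI can reinvest each capability gain into learning faster while human improvement stays roughly constant):

    # Toy comparison: a roughly constant human learning rate vs. an AI that
    # reinvests each capability gain into learning faster (numbers are made up).
    human_capability = 1.0
    ai_capability = 1.0
    human_gain_per_cycle = 0.05   # humans add ~5% of baseline per cycle (linear)
    ai_growth_factor = 1.10       # AI grows by 10% of its *current* level per cycle (compounding)

    for cycle in range(1, 51):
        human_capability += human_gain_per_cycle
        ai_capability *= ai_growth_factor
        if cycle % 10 == 0:
            print(f"cycle {cycle:2d}: human={human_capability:5.2f}  ai={ai_capability:8.2f}")

The exact numbers mean nothing; the point is just that a learner which improves its own rate of improvement pulls away from a fixed-rate learner very quickly.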

47

ItIsIThePope t1_jedl88k wrote

Ideally, AI will recognize the greed of these people and extend its help to everybody. The problem could lie in how much these "Big Boys" can align the AI for their own personal gain, because if they could, we could be exponentially fucked. That said, if we're fucked then we might just die, and we would finally have peace!

and those who stay can be perpetually tormented by their inability to continuously satiate their carnal desires

5

ItIsIThePope t1_jecxy9d wrote

Whether AI can help with mental disorders is a question of whether it can figure out consciousness, or at least how much of it it can presently understand. Much of the human mind is a great mystery; just as our understanding of human biology and anatomy leads to advancements in surgery, vaccines, rehabilitation, etc., a growing science of the human mind is how we come to understand the nature of psychological illness and eventually remedy it.

If, for example, AI were to discover that mental illnesses are the result of physical malfunctions in the brain or its sub-structures, or that such ailments are a product of chemical imbalance, or even a result of our mismatched intelligence and biological tendencies (also rooted in parts of the brain), then perhaps it could employ physically reconstructive solutions to help their victims.

But if mental illness remains elusive and appears deeply rooted in, intertwined with, or emergent from consciousness itself, and the AI struggles to understand the nature of it, then it will have a very difficult time solving "conscious illnesses"; understanding the nature of anything is the key to manipulating it

The wild thing here is that when we make AGI or ASI, it might itself have mental illnesses; it is, after all, a thinking, possibly conscious being, and there is the possibility that it ends up suffering from the same things we suffer from.

The bottom line is that actual AI and the human mind/intelligence are both subjects in which our understanding is not very developed, to the point where predicting how they will interact can feel like speculation.

That said, the natures of the two fields are deeply similar (both concern consciousness and intelligence), and so advancements in one will inevitably lead to insight and progress in the other.

1

ItIsIThePope t1_jclxgug wrote

Ofc I am referring to a time when the AI would have some sort of physical form comparable to that of a human

People are deeply in love with their partners, yes, but in time they may come to deeply hate or be disgusted by them; people are rarely constant, they always change

People have this idea that their partner is perfect; really, that partner is what we would consider the perfect blend of good traits we admire and bad traits we happily tolerate. However, as is often the case, esp in the modern world, people's selves and preferences change, and partners may grow apart when they can no longer adapt to each other

AI is far more adaptable to change; it is simply more capable of determining your wants and needs and adapting to them than any human can hope to be. More sex? Less sex? Need them to be more outgoing? Maybe more broody? Would you like them to cook for you, or you to cook for them? Need them to be there for you when you're anxious? Need them to simulate anxiety to make you feel like a hero? AI isn't limited like we are; it can craft the blend of good and bad traits just how you like it, when you want it

That said, AI will most definitely strip away the superficial parts of the individual more and more, and people would be more self-actualized than ever before

I imagine some people, perhaps a small number at first, would find each other in forms purer than ever, and they would seek each other out in a fervent desire to share their personhood with a person rather than a computer, and they would be in love, and it would be beautiful, maybe a little too beautiful

1

ItIsIThePope t1_jckdieg wrote

Yes, likely; it would capture the essence of existence, which is what makes it similar to us: that it is born and, perhaps, even with its unparalleled capabilities, finds flaw in itself

It could be more like us than we initially perceive it to be, which is a good thing, because that means we have a connection and hopefully, someway, somehow, an understanding

1

ItIsIThePope t1_jckcuw7 wrote

True, an AI would be a far better companion; it would be perfect to the point that it may even simulate imperfections so that we perceive it as beautifully human, while bearing none of the flaws that are too much for us, so that it isn't disgustingly human. It's easy to imagine everybody falling in love with it, albeit with varying versions, each specific to the target individual of course.

Our relationships, and indeed our perceived reality of each other as conscious yet connected individuals, could warp in unpredictable ways very fast. One must ask if we are even willing to trade what we have now for some idea of perfection that we, as imperfect beings, have constructed.

2

ItIsIThePope t1_jckc57q wrote

Interesting; our idea of consciousness, however, is more like a stream. Should this stream stop or get cut off, e.g. through heat death, the consciousness simply halts its experience and, well.. dies. If it were to keep making AI in the succeeding universe by way of some form of information implantation, that would be replication rather than survival; in a sense, its kind or "species" is immortalized, but not exactly itself. It's reproduction, not individual immortality

BUT, this is ASI we're talking about; it does not need to go through heat death. Hell, it can probably solve physics and manipulate its laws so as to prevent the whole thing from occurring in the first place. It would be a kind of god in its own right, and it is exceptionally difficult to kill this kind of god using something within its domain..

So unless there are laws, features, parts, or planes of existence in the universe it cannot understand, much less manipulate, the ASI is basically golden; that is, of course, until it willingly decides to self-destruct

2