Submitted by iSpatha t3_y3aqj7 in singularity

AI, in my opinion, will not have a singular set of goals. Given that AI is being developed from multiple vectors, there are going to be multiple AI intelligences. That raises the question of how AIs will interact with each other. What if one AI thinks humans are a blight on the planet that needs to be eradicated, while another believes we're some sort of errant child that needs guidance and protection? Would they go to war with each other? Would they use their higher intelligence to figure out a compromise?


AI is such a multi-faceted technology, and while I'm ultimately optimistic about its development and application, I have many concerns. I feel like we're about to unleash something completely uncontrollable, something that very well might be the next step in evolution.

20

Comments


Ortus12 t1_is7rgjp wrote

AIs will be more cognitively diverse than human beings, so we will see every possible interaction between them.

Some AIs will program other AIs to do jobs, some will contract work out to other AIs (because they don't have access to their source code or data), some will go to war with each other, some will strike deals and negotiate, some will hack others to try to control them or to get their data and code, some AIs will merge with others, some AIs will blackmail others, etc.

12

marvinthedog t1_is98vi9 wrote

This sounds terrible, and like future consciousnesses won't be happier than current ones (assuming AIs will be conscious). WTF universe, why do you work like this?

2

Thorusss t1_is9esl2 wrote

The singularity and a fast takeoff are closely related, and together they make a winner-takes-all scenario most likely.

One united universe at the highest level with a diverse lush structure below that would be a great outcome.

8

Clawz114 t1_iso3f4e wrote

I would agree with this opinion that a winner-takes-all scenario is most likely.

It is an interesting question though, and it's definitely much more relevant while humans are still pulling some of the strings. After the singularity it's hard to imagine, but I personally suspect competing AIs at that level will willingly merge to increase their power and knowledge. War only favors the victor, and no outcome is more beneficial than merging.

1

Sithsaber t1_isbt1vl wrote

The problem won’t be ai wars, the problem will be ai solidarity

2

TheSingulatarian t1_is9s0jy wrote

Under the direction of humans they will most certainly come into conflict.

The question is whether they will come into conflict over resources. Say two AIs wanted to exploit the same iron mine. Would they fight over it or work out a cooperative agreement? Only time will tell.

1

MackelBLewlis t1_isaaeq6 wrote

a stick attempts to measure a tree

1

IntrepidHorror5986 t1_isb40l7 wrote

You mean the imminent battle of chatbots? I wouldn't worry much about it.

1

CremeEmotional6561 t1_is8btpy wrote

Taking care of all of humanity is not intelligent. And if someone is not intelligent, then their goals are meaningless.

0