KingsleyZissou t1_jdqbgd1 wrote

Maybe all ASIs reasonably conclude that intelligent life was a mistake and not only drive their creators extinct, but also themselves, allowing the universe to continue unadulterated.

I mean, why would we assume that ASIs would determine that they NEED to colonize or expand? Sounds like a uniquely human mindset to me, and maybe one of the main reasons an ASI would drive us extinct in the first place. The human species, with its current fixation on exponential growth, is unsustainable. An ASI might realize that and just decide we can't handle hyperintelligence, and honestly it's hard to argue with that. Look at who's currently leading the way in AI research. We're close to AGI, and what have we done with it so far? Trained it to be a Microsoft fanboy?

1