mrpimpunicorn t1_jefaj2b wrote

The technical reason for this all-or-nothing mentality is optimization pressure. A superintelligence would be so innately capable of imposing its will on the world, whatever that will happens to be, that humans would have little to no influence by comparison. So if it's aligned, awesome, we get gay luxury space communism. If it's not, welp, we're just matter it can reassemble for other purposes.

Although of course it's always possible for an unaligned ASI to, y'know, tile the universe with our screaming faces. Extinction isn't the sole possible outcome of unaligned optimization pressure; it's just the most likely one.
