Eleganos OP t1_jd03z69 wrote
Reply to comment by xott in A technical, non-moralist breakdown of why the rich will not, and cannot, kill off the poor via a robot army. by Eleganos
I've addressed another commenter regarding that scenario.
Basically, a singular actor turns it from 'the rich will kill us all' into 'a madman who happens to be rich/powerful will murder us all'.
While the odds of it succeeding and devastating humanity go up by an order of magnitude compared to most other scenarios, there are eight billion humans. That means there are 8 one-in-a-billion scenarios, 8,000 one-in-a-million scenarios, and 8,000,000 one-in-a-thousand scenarios standing between us and our potential killer.
And I mean that as in 'between them and the number 0'.
Statistically speaking, something is bound to happen that would see a good chunk of humanity survive their attempt, and more than likely kill them in retribution, or just outlive them.
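The arithmetic behind "something is bound to happen" can be sketched with a quick probability calculation. This is a rough illustration, assuming each scenario is an independent chance for the plan to be disrupted (a big simplification, since real-world events are correlated):

```python
# Sketch of the "8 billion obstacles" argument. Assumes each scenario
# is an independent chance of disruption, which is a simplification.
def prob_at_least_one(p: float, n: int) -> float:
    """Probability that at least one of n independent events,
    each with probability p, occurs: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

# 8,000,000 independent one-in-a-thousand chances of disruption:
print(prob_at_least_one(1e-3, 8_000_000))  # effectively 1.0
```

Under the independence assumption, even extremely unlikely individual disruptions become near-certain in aggregate at this scale, which is the crux of the statistical argument above.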
Still an apocalyptic scenario, but not an extinction scenario, and our species could well rebuild via AGI after all was said and done.
I'm not going to pretend that that outcome is a certainty though. I just side with statistical probability and Murphy's law for it.