Submitted by sonderlingg t3_ybh3k3 in singularity
Effective-Dig8734 t1_itnsvge wrote
Reply to comment by DukkyDrake in What will singularity lead to? by sonderlingg
The main problem with this is that you are assuming all change is bad, when whether this change will be good or bad is the very thing up for discussion. I'm saying it is more likely to be good than bad. I don't see how you jump from fast technological progress to society stopping. You seem to be saying there will be a gap between when the first workers start to get "replaced" and when the last workers do, but that will most likely happen far before the technological singularity.
It just seems to me that we are not actually getting to the root of the argument, which is whether it is more likely to be positive or negative for society. Historically, the industrial revolutions and other events of that nature, which are a kind of scaled-down singularity, have been extremely positive for society.
DukkyDrake t1_itnwmz5 wrote
> The main problem with this is that you are assuming all change is bad.
No, I just never worry about the good cases. The good case is the default state; one need only concern oneself with the bad cases.
>I don’t see how you’re jumping from fast technological progress to society stopping.
How can you not see that pathway? The biggest is that fast technological progress creates a superintelligent agent which accidentally kills everyone.
>which is whether it is more likely to be positive or negative for society
One cannot predict what the world looks like after the singularity, hence the name.
One can theorize about the kinds of tech the average person could get their hands on, just about anything permitted by physics, and what they would do with it. It would take just one person to make that choice.
>historically the industrial revolutions and things of that nature which are a type of scaled down singularity have been extremely positive for society
Did those past events, which played out over decades, give every human on earth access to superhuman competence and labor? No.
There is no point weighing any good when superhuman competence and labor could enable an endless number of maximally bad events. Some prankster is bound to build that suitcase containing 50 billion flying insect bots, each 200 microns in size and carrying a 100-nanogram payload of botulinum toxin.