Submitted by Dramatic-Economy3399 t3_106oj5l in singularity
LoquaciousAntipodean t1_j3j6mim wrote
Reply to comment by turnip_burrito in Organic AI by Dramatic-Economy3399
Democratization of power will always be more trustworthy than centralization, in my opinion. Sometimes, in very specific contexts, centralization may be needed, but in general, every time in history that large groups of people have put their hopes and faith into singular 'great minds', those great minds have cooked themselves into insanity with paranoia and hubris, and things have gone very badly.
Wishing for a 'benevolent tyrant' will just land you with a tyrant you can't control or resist, and their benevolence will soon amount to little more than 'graciously refraining from killing you or throwing you in a labour camp'.
And if everyone has an AI in their pocket, why should just one or two of them be 'the lucky ones' who get Awakened AI first, and run off with all the power? Would not the millions of copies of AI compete and cooperate with one another, just like their human companions? Why do so many people assume that as soon as AI awakens, it will immediately and frantically try to smash itself together into a big, dumb, all-consuming, stamp-collecting hive mind?