Submitted by GalacticLabyrinth88 t3_zwx9mf in singularity
I've been lurking around this sub lately in light of the controversies surrounding AI this year. While I agree that AI will bring several benefits to human civilization, I have grown increasingly irritated at posts from people who naively claim the Singularity will usher in some kind of utopia, or that their jobs will be safe from the coming wave of technological unemployment.
With all due respect, I don't think these pro-AI supporters understand the bigger picture: if AI can in theory do anything a human can, but infinitely better and faster, there is nothing anyone will be able to do to prevent themselves from being replaced. The very people extolling the virtues of AI and bragging about how they will stay safe as programmers, business managers, scientists, or whatnot, will eventually be thrown under the bus just like the rest of us. We will ALL become irrelevant. Obsolete. Useless. If not in 10 years, then in 20, 30, 40, or even 50. And we will have no right to complain, no right to bemoan our situation, because we will have brought this whole mess upon ourselves.
We still have the option to avoid being replaced, but the choice of whether or not to continue with AI is ultimately ours to make.
I don't think people here understand what it will mean for 99.9% of the human populace to be dependent on UBI/the government while the owners of AI (the 1%) become even richer and more powerful than before, and monopolize control over all resources, industries, etc. AI is not going to lead to things getting better. If current trends are any indicator, AI will worsen inequality rather than alleviate it, and create a two-tier techno-feudalist class system, one that will benefit the rich only temporarily before AI makes them irrelevant alongside the poor.
In other words: an authoritarian cyberocracy or noocracy.
Worshipping AI is like being one of Stalin's generals before the Great Purge, or one of Hitler's early brownshirt commanders. Sure, you'll benefit in the short term, and you might be promised a better world. But suppose an AI or the Singularity decides it no longer needs you, no longer needs humanity (because its interests will not be aligned with ours, or we will have served our purpose), no longer needs the rich to tell it what to do, no longer needs anything at all thanks to its exponentially increasing intelligence and capabilities, further exacerbated by recursive self-improvement. Guess where you will end up? Eliminated or imprisoned, like useful lackeys all throughout history.
Ceding control of everything to AI just does not sound like a good idea, and this isn't even me fearmongering because of movies. Elon Musk, Stephen Hawking, and numerous AI and robotics experts have been warning those in favor of AI for years to STOP developing automation technologies. It is one thing to develop a technology; it is another thing entirely to recognize its possible dangers and then use it wisely or put restraints on it.