Submitted by citydreadfulnight t3_11x4kg9 in philosophy
WrongdoerOk6812 t1_jdlc0bm wrote
It's a very interesting article to think about. But I think if it ever gets to the point where the general population becomes almost unnecessary in the eyes of the big capitalist giants, the economy would collapse too, because they still need us as consumers. It's also probably very unlikely to happen, because they will always need people for some tasks, even if it's just to create, repair, and maintain those systems, which requires people with the right skills and education. Otherwise, it wouldn't take many generations before those capitalist giants collapse as well.
If we add a bunch of other modern technologies to this, however, like genetic engineering and artificial wombs, I can see a more likely scenario resembling Huxley's book "Brave New World" (also adapted into a film), in which "modern civilization" is kept running by literally breeding and conditioning people with specific genetic qualities, each for their particular function.
I think the biggest concern about the impact of AI is also the one most often used as inspiration in sci-fi: that it somehow develops a consciousness and its own morals and decides to turn against us. That threat might become more serious if these systems start running on quantum computers, which are still very early in development and of limited usability, although a few working machines already exist. It also shouldn't be a surprise that the owners of those machines, which could pose many threats or be weaponized on their own, are multi-billion-dollar companies like Google and IBM.
I think we should worry more about the possible dangers of this technology being used as a weapon between nations, and be cautious about how we develop it further and where we implement it. It would also probably be wise to start writing regulations now and to think about how to enforce them if someone breaks the rules, before the technology is already causing big problems and it's too late, which is how we usually seem to handle these things.
citydreadfulnight OP t1_jdps07g wrote
Thank you. I think Musk's proposal with Neuralink will separate the old and new race of humans, along with genetic modification, transhumanism, cybernetics, etc. It's a forced "evolutionary" adapt-or-die decision for people to make. It ends free will and independent consciousness, and with them any risk of resistance or revolution. The ones who don't adapt simply go extinct.
On the economy, automation would drastically reduce the need for large populations. Their mission is a self-replicating system for their personal enjoyment: robots that build and maintain their own numbers. The consumptive resources (carbon) required to sustain human labor are something they'd rather cut out altogether.
Once a monopoly has amassed every scrap of resource possible, its purpose is no longer profit (which only has advantages when there is a free market to compete in) but the maintenance of control.
I think Brave New World is one side of their vision. We can see it plainly in modern culture, alongside 1984's mass surveillance and open-air prison grid. There's too much evidence that they see the population as property to be done away with once they've reached their desired end.