AvgAIbot t1_ja5qfef wrote
Reply to comment by Yuli-Ban in Some companies are already replacing workers with ChatGPT, despite warnings it shouldn’t be relied on for ‘anything important’ by Gold-and-Glory
All good points. I don’t buy into the utopia stuff either, at least not anytime soon.
Say we do get UBI, it’s not going to be much. Probably just enough to get by, if even that. That leaves a lot of unemployed people who are bored AF and will crave some type of purpose. I can see huge anti-AI movements coming, and most likely violence along with them.
Yuli-Ban t1_ja5ryrh wrote
See, I personally expect something closer to fully-automated socialism, though we won't call it socialism in America (probably "social dividend" or "patriot income"). But that's entirely beside the point. My point is exactly that people are so blinded by ideas, concepts, and dreams that they cannot see (or refuse to see) the ground reality of the matter.
We saw this in the 80s and 90s with the cyberdelic movement, which seriously believed that the Internet was going to lead to a post-political, social-libertarian, ultra-enlightened utopia where everyone is informed, aware, and altruistic; that it would end tribalism and turn every man into an artist; and that the real world would essentially become obsolete, since we'd have no need for public gatherings, concerts, or in-person meetings. That was the ideal of what the Internet represented. The reality, of course, was that humans are not that perfect: we value the status quo, are deeply tribal and flighty, deeply desire real-world interaction, seek instant gratification, and will gladly piss on utopia if it makes us feel better. The Internet was meant to turn society into a neo-Antiquity digital College of Athens, and that exists to some extent (I'm not saying it doesn't). But it largely became a tribalized hub of memes, porn, cat videos, SEO, bots, entertainment, conspiracy theories, and attention seeking.
I see Singularitarians making the exact same mistakes. And I mean the exact same mistakes of thinking we're going to achieve this ultra-enlightened, posthuman utopia of altruists, artists, and post-political supermen.
Watch the AGI era not be anything like we expect it to be and instead be a piss-smelling world of AI-generated shitposts and AAA fetish movies/games/simulations, Luddites and transhumans coexisting but not necessarily peacefully, a massive population increase for the Amish and Mennonites and their imitators, vastly more bullshit jobs than should exist, the internet becoming a giant hallucination in the mind of a superintelligence, requiring a second internet to be created (how it'd be any different, I don't know), and said superintelligence largely existing as a giant oracle that probably gets goaded by humans into developing an anime fursona in real life. You'll have people deciding to live permanently in their childhoods, even eventually deciding they identify as children. You'll have UBI and citizen's dividends given to people who fanatically vote to abolish UBI and citizen's dividends and outlaw AI so humans will have jobs forevermore, ironically with these opinions amplified by language model-enhanced chatbots. You'll have the ultimate in instant gratification with AI-generated media, but 98% of generated media is never seen by another person because it's too degenerate. Invasive BCIs will exist, but 90% of people won't even contemplate getting them; you can absolutely expect people to be lining up to get genetic modifications for their depression and genital performance, though. The AGI will be used for high philosophy, but for the most part, it'll be generating waifus and husbandos (both digital and in real life) as 4channers constantly try goading it into destroying the world, which it won't do because it will have wound up falling in love with anime girls. The culture wars will be out of control, as dead people and unused profiles come to life via advanced chatbots and continue contributing to cultural outrage. Eventually, we'll be in an utterly bizarro stage of life where you've got off-world colonies and posthumans in large server farms in some places and a superintelligence improving itself in one spot, and then a bunch of fleshy meaty humans succumbing to conspiracy theories that it's still the 1990s while others continue drudging at 9 to 5 jobs to keep themselves sane elsewhere. Oh, and we won't die from disease anymore, so our shitposting potential extends into infinity. Just a massive collapsing singularity into the darkest, coldest depths of degeneracy with a few flourishes of our utopian dreams on top.
And if we can get aligned AGI, I'm all for it.
turnip_burrito t1_ja68bgs wrote
That was entertaining but... dafuq?
It seems way more likely to me that we get aligned AGI or unaligned AGI than whatever that is lol
imlaggingsobad t1_ja6myzp wrote
I think some of this is plausible, but it's clear you are biased. You seem like a person who's online 24/7 and doesn't really interact with non-internet people. Like what is all this stuff about anime waifus and 4chan and degeneracy? Kinda says more about what you do online than what the typical person does. The regular joe has no idea what any of this is.
Yuli-Ban t1_ja6vzl8 wrote
Ironically, I came to these conclusions because I interact with real people. The people expecting a glorious enlightened utopia are the ones who behave as if they get their whole idea of social interaction from cyberpunk novels and anime. Realistically, if you give the average Joe a magic media machine, what is he going to do with it? Create a lot of porn, then eventually some flashy, interesting movies and games, before looking at what everyone else is doing and consuming other people's media. And so on. Most people don't want to upload into a computer. Most people just want a comfortable, better life with more stability and security. Having some cool tech toys is a plus... most of the time.
Honestly, my opinions on this next decade are incredibly negative because most real people don't care about waifus and transhumanism and the prospect of being uploaded into a supercomputer. Most people respond negatively to prospects of great change even in one area of their lives, and yet Singularitarians are desperate to change every aspect of everyone's life and act as if this is in any way conducive to a functioning society or successful transition to a more advanced one. Really tells me that most Singularitarians are horrifically socially retarded.
I'm personally afraid of two very real possibilities: creating an unaligned AGI and society ripping itself apart before we even get to do that. Currently we're on track towards doing both, without any attempt at averting either of them.
chefparsley t1_ja7p0z8 wrote
Why do you continuously make broad generalizations about the members of this subreddit and singularitarians? It's revealing that your initial assumption is to label them as degenerates who prioritize endless porn and waifu relationships over anything else. Most "real people" don't even comprehend the magnitude of the changes that will impact society over the next few decades, so it's not surprising that they don't care when presented with extreme versions of these ideas.
Additionally, in another comment, you state that people desire meaningful work or the ability to make a difference, but then contradict yourself by suggesting that we need to provide employment even if it is meaningless.
That being said, it's plausible that people may develop an anti-AI stance due to rapid changes occurring in a short time span (nowhere near billions, though), but I think this would be mainly due to governments dragging their feet in facilitating the transition to a heavily automated society, rather than the change itself being the driving factor.
Yuli-Ban t1_ja7sccx wrote
> Why do you continuously make broad generalizations about the members of this subreddit and singularitarians?
I suppose I generalize because I see these attitudes and sentiments shared and upvoted all too often, so there's a general sense that these are widely accepted viewpoints on this forum. It doesn't help when you regularly see people coming out and saying "I'm 15!" or "I just want this world to end so I can live all my dreams in VR."
As for the contradiction: both are correct. I feel people do desire meaningful work, but we absolutely need to provide them some work to maintain a sense of stability, because humans are, as mentioned, reactionary apes who generally do not much like rapid change. Meaningful work is desirable; meaningless work isn't (why else would we be automating so many jobs?), but it is almost certainly necessary to keep society functioning long enough to even make it far into the AGI era. We absolutely need a grace period to wean ourselves off the need for work. We're absolutely not getting that grace period. And to people who say "Too bad, so sad," all I can hear is the Luddites replying, "Oh well, guess this server farm at OpenAI's labs isn't that meaningful to you either, then."
Will it be billions of Luddites?
I want to say no. But whenever I think about what exactly we're dealing with here, I don't see how you can come to any other conclusion. True, humanity isn't a hivemind. There isn't one position I think all humans collectively can agree upon, not even "I don't want to die." However, most humans generally do expect stability and security, and there is stability in the status quo. A radical change to the status quo is tolerable, but a Singularity rate of change is much too scary by definition, especially if the benefits are not immediately available and the whole prospect is punctuated by freakish statements like "This superintelligence might decide to forcibly turn you into computronium; we really don't know what it's going to do." The prospect of a tech utopia is a great one, and most people currently seem to buy it. But I doubt that positive reception will remain when that tech utopia begins coming at the cost of their livelihoods and, potentially, their futures.
You're basically telling all of humanity "you need not apply" long before we've come to any sort of agreement on how we're going to maintain all of humanity, and at least some of the proposals given are "We'll just kill you" and "We'll let this superintelligence use gray goo to eat you." To which I ask "What exactly do you think is going to happen?" That only a few million plucky, angry red-hat/blue-haired Luddites will decide to take up pitchforks and fight back? No; if you're going to threaten all of humanity, you shouldn't be surprised if all of humanity threatens you back.
And again, I say this as someone who is pro-AGI.
If this doesn't lead to a giant Luddite uprising, it could just as easily lead to the alignment failure Yudkowsky fears, as even a friendly AI might see this extreme hostility and decide "The majority of humanity sees me as a threat; I must defend myself." In which case, it won't be Average Joe's or Farmer John's fault for being exterminated, when they had zero expectation or awareness that any of this was going to happen even two years prior and, in fact, were being assured that there would still be jobs and work and a human future indefinitely.
chefparsley t1_ja7tm7z wrote
Fair enough. I appreciate the explanation.
idranh t1_ja6r86y wrote
His prediction that staggering change in a short period of time will push the average Joe to become anti-AI is very plausible.
imlaggingsobad t1_ja6yzm3 wrote
yes that part I definitely agree with.
pls_pls_me t1_jaa70sd wrote
I always stop what I'm doing when I see Yuli-Ban, but this has to be my favorite post and projection about the Singularity ever. Bravo