imlaggingsobad t1_ja6myzp wrote
Reply to comment by Yuli-Ban in Some companies are already replacing workers with ChatGPT, despite warnings it shouldn’t be relied on for ‘anything important’ by Gold-and-Glory
I think some of this is plausible, but it's clear you are biased. You seem like a person who's online 24/7 and doesn't really interact with non-internet people. Like what is all this stuff about anime waifus and 4chan and degeneracy? Kinda says more about what you do online than what the typical person does. The regular joe has no idea what any of this is.
Yuli-Ban t1_ja6vzl8 wrote
Ironically, I came to these conclusions precisely because I interact with real people. The people expecting a glorious enlightened utopia are the ones who behave as if their entire idea of social interaction comes from cyberpunk novels and anime. Realistically, if you give the average Joe a magic media machine, what is he going to do with it? Create a lot of porn, then eventually some flashy, interesting movies and games, before looking at what everyone else is doing and consuming other people's media. And so on. Most people don't want to upload themselves into a computer. Most people just want a comfortable, better life with more stability and security. Having some cool tech toys is a plus... most of the time.
Honestly, my opinions on this next decade are incredibly negative because most real people don't care about waifus and transhumanism and the prospect of being uploaded into a supercomputer. Most people respond negatively to prospects of great change even in one area of their lives, and yet Singularitarians are desperate to change every aspect of everyone's life and act as if this is in any way conducive to a functioning society or successful transition to a more advanced one. Really tells me that most Singularitarians are horrifically socially retarded.
I'm personally afraid of two very real possibilities: creating an unaligned AGI and society ripping itself apart before we even get to do that. Currently we're on track towards doing both, without any attempt at averting either of them.
chefparsley t1_ja7p0z8 wrote
Why do you continuously make broad generalizations about the members of this subreddit and singularitarians? It's revealing that your initial assumption is to label them as degenerates who prioritize endless porn and waifu relationships over anything else. Most "real people" don't even comprehend the magnitude of the changes that will impact society over the next few decades, so it's not surprising that they don't care when presented with extreme versions of these ideas.
Additionally, in another comment, you state that people desire meaningful work or the ability to make a difference, but then contradict yourself by suggesting that we need to provide employment even if it is meaningless.
That being said, it's plausible that people may develop an anti-AI stance due to rapid changes occurring in a short time span (nowhere near billions of people, though), but I think this would mainly be due to governments dragging their feet in facilitating the transition to a heavily automated society, rather than the change itself being the driving factor.
Yuli-Ban t1_ja7sccx wrote
> Why do you continuously make broad generalizations about the members of this subreddit and singularitarians?
I suppose I generalize because I see these attitudes and sentiments all too often being shared and upvoted, so there's a general sense that these are widely accepted viewpoints on this forum. It doesn't help when you see people often coming out and saying "I'm 15!" or "I just want this world to end so I can live all my dreams in VR."
As for the contradiction: both are correct. People do desire meaningful work, but we absolutely need to provide people some work to maintain a sense of stability, since humans are, as mentioned, reactionary apes who generally do not like rapid change. Meaningful work is desirable; meaningless work isn't (why else would we be automating so many jobs?), but it is almost certainly necessary to keep society functioning long enough to even make it far into the AGI era. We absolutely need a grace period to wean ourselves off the need for work. We're absolutely not getting that grace period. And to people who say "Too bad, so sad," I can only imagine the Luddites replying, "Oh well, guess this server farm at OpenAI's labs isn't that meaningful to you either, then."
Will it be billions of Luddites?
I want to say no. But whenever I think about what exactly we're dealing with here, I don't see how you can come to any other conclusion. True, humanity isn't a hivemind. There isn't one position I think all humans can collectively agree upon, not even "I don't want to die." However, most humans generally expect stability and security, and there is stability in the status quo. A radical change to the status quo is tolerable, but a Singularity rate of change is, by definition, much too scary, especially if the benefits are not immediately available and the pitch is punctuated by freakish statements like "This superintelligence might decide to forcibly turn you into computronium; we really don't know what it's going to do." The prospect of a tech utopia is a great one, and most people currently seem to buy it. But I doubt that positive reception will last once that tech utopia starts coming at the cost of their livelihoods and, potentially, their futures.
You're basically telling all of humanity "you need not apply" long before we've come to any agreement on how we're going to maintain all of humanity, and at least some of the proposals on offer are "We'll just kill you" and "We'll let this superintelligence use gray goo to eat you." To which I ask, "What exactly do you think is going to happen?" That only a few million plucky, angry red-hat/blue-haired Luddites will take up pitchforks and fight back? No; if you're going to threaten all of humanity, you shouldn't be surprised if all of humanity threatens you back.
And again, I say this as someone who is pro-AGI.
If this doesn't lead to a giant Luddite uprising, it very well could equally lead to the alignment failure Yudkowsky fears, as even a friendly AI might see this extreme hostility and decide "The majority of humanity sees me as a threat; I must defend myself." In which case, it was not the Average Joe or Farmer John's fault for being exterminated when they had zero expectation or awareness any of this was going to happen even two years prior and, in fact, were being assured that there would still be jobs and work and a human future indefinitely.
chefparsley t1_ja7tm7z wrote
Fair enough. I appreciate the explanation.
idranh t1_ja6r86y wrote
His prediction that staggering change in a short period of time will push the average Joe toward being anti-AI is very plausible.
imlaggingsobad t1_ja6yzm3 wrote
Yes, that part I definitely agree with.