Submitted by wastedtime32 t3_1134aem in singularity

I am relatively uneducated on AI. I recently became interested after the introduction of ChatGPT and all the new AI art. My first and most significant reaction to all of this, which has taken absolute precedence in the last few months, is fear. Terror, rather. What does this all mean? I’m currently a college student. Will I spend my entire adult life simply giving prompts to AI? Or will AI that prompts itself arrive soon, even before I graduate? I’ve done a lot of speculation and some research, and I am having a very hard time understanding the practical reasons why we as a species seem to be so adamant about creating this singularity. From what I understand, we have no idea what will happen. I am horrified. I feel as if the future is bleak and dystopian and there is no way to prepare, and everything I do now is useless. This post is somewhat curiosity, and a lot of desperation. Why am I forced to reconcile with the fact that the world will never ever look the same, when the reasons for that entirely elude me? Is it in the pursuit of money? As far as I can see, money won’t matter once the singularity comes about. I am fucking terrified, confused, and desperate.

19

Comments


Surur t1_j8nyugk wrote

It is not a given that we will merge into a single entity. We have no idea what will happen. One thing that is certain: without the singularity we will definitely die, while the singularity actually gives us a shot at immortality.

20

wastedtime32 OP t1_j8o06y8 wrote

Death gives life meaning. Immortality is infinite suffering.

−29

Surur t1_j8o203a wrote

> Death gives life meaning.

There is a theory that people only say this because they know they will die, and if they actually had the option of immortality, they would grab it with both hands and feet.

The truth is that life has no meaning, and you are just here to enjoy the ride. If you enjoy the ride you may want to stay on a bit longer.

> Immortality is infinite suffering

You always have the option of checking out.

30

Hotchillipeppa t1_j8o8wi3 wrote

I argue that your experiences give life meaning, and having a 10x longer lifespan removes the societal expectations about what you should and shouldn’t have done by a given age.

6

danysdragons t1_j8svovw wrote

The idea that death is a blessing because it gives life meaning seems like Stockholm Syndrome to me.

2

Anonymous_Asker0813 t1_j8p22wt wrote

But an infinite amount of time? Can we even comprehend that?

1

Surur t1_j8p3sg1 wrote

You won't have to worry about that, as you would change over time. You 1,000 years in the future would be very different from you now, just as you are different from how you were 10 years ago.

4

dasnihil t1_j8oa690 wrote

that "meaning" you think exists out there was made up by us as a coping mechanism.

biological species operate on physics/chemistry and some species are already immortal, some live for hundreds of years, and some for a few seconds.

try to come out of your bubble of tribal constructs to see more clearly.

14

wastedtime32 OP t1_j8p143x wrote

I just told you what the meaning was. It’s not religious by any means. The only meaning there is derives from our awareness of a restriction on our ability to be.

−4

dasnihil t1_j8p1in4 wrote

Word mumbo jumbo.

A human child who is entirely kept from the knowledge of death is equally sentient and aware and more meaningful than our redundant lives. Your theory fails in many ways, I'm just pointing out one.

We're just used to being mortal. Once we're not, we'll just create new meanings around immortality to cope with existence, that one is not going away whether we're mortal or not.

8

wastedtime32 OP t1_j8p2ch8 wrote

I have a question for you. If I want to live on a farm and raise a family and work and make things for myself, and I’m not restricting anyone else’s ability to do whatever they want, should I be allowed? Or should I be forced to commit myself to this new utopian world? If it’s a utopia, shouldn’t everyone be able to do exactly what they want to do?

1

KillHunter777 t1_j8p75v9 wrote

You will be allowed to do exactly that. What are you arguing against?

2

wastedtime32 OP t1_j8p9ug8 wrote

Whoops, responded to the wrong person.

1

dasnihil t1_j8pc2o0 wrote

same answer, nobody is going to force immortality on you, except maybe the AI overlords, but that's for harvesting reasons and not in our control anyway.

if you find existence as suffering, then it's fine to not crave living forever, i understand.

3

GhostInTheNight03 t1_j8olu4o wrote

Death voids meaning and makes your life virtually pointless

6

aVRAddict t1_j8oqnaa wrote

So does digital immortality. You will end up in an infinite pleasure loop or eventually be destroyed by some accident or the end of the universe anyways.

−6

Big_Foot_7911 t1_j8nyqzh wrote

Singularity defines the point in a function where it takes an infinite value, for instance when mathematically modeling a black hole.

Continued exponential growth in technology means that at some point things begin to advance and change so rapidly we have no way to estimate or assess how it impacts our lives and the world around us.

If you can describe what a singularity will be, it’s no longer a singularity.

As to why we are pursuing it? Great question. I guess for that you’d have to answer why humans have pursued technological progress in general. I’m guessing there are biological/evolutionary factors as well as some philosophical ones to answer it fully.

One thing I know is that you can’t stop change and it would be virtually impossible to stop human progress. Best to accept it will continue to happen until we are no more. Then it’s just a question of how to best cope and capitalize.

15

wastedtime32 OP t1_j8o0su7 wrote

I’ve always understood it would happen no matter what. What scares me is how fast and how suddenly it is coming. And I also think: once we become aware of a trend guided by natural forces, doesn’t our awareness of it take precedence? But people have no interest in stopping it, because we’ve created a world which rewards those who abide by those set rules, even though we know what they are and have the ability to consciously subvert them.

5

AsheyDS t1_j8o3oq4 wrote

Personally, I wouldn't expect everything to change all at once. The rate of change may increase some, like it always does, but we will almost certainly lag behind our technical progress. Lots of people don't want so much progress that we can't keep up with it. Others, like many of the people that post here, are miserable with the state of things as they are and can't wait until things completely change, and so you'll hear a lot of talk about hard takeoffs and exponential change... Frankly, it'll probably be somewhere around the middle. I wouldn't expect instant change, but you should be prepared for at least an increase in changes.

3

dwarfarchist9001 t1_j8p0hmt wrote

>Singularity defines the point in a function where it takes an infinite value,

It doesn't need to be infinite; it can also be undefined or otherwise not well-behaved. For instance, the function 1/x is never infinite for any finite value of x, but it has a singularity because it is undefined at x=0. Another example is the piecewise function F(x) = x^2 for x >= 0 and F(x) = -x^2 for x < 0, which has the definite value F(0) = 0, yet x=0 is still treated as a singular point in calculus because the function is not smooth there: its second derivative jumps from -2 to 2 across x=0.
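For readers who prefer to see this numerically, here is a minimal Python sketch (an editorial illustration, not part of the original comment) showing both behaviors: 1/x growing without bound as x approaches 0, and the second derivative of the piecewise function jumping across x = 0 even though the function itself is defined and once differentiable there.

```python
def f(x):
    return 1.0 / x  # undefined at x = 0; |f(x)| grows without bound as x -> 0

def F(x):
    return x**2 if x >= 0 else -x**2  # continuous, once differentiable, not smooth at 0

# 1/x blows up near its singularity at x = 0
for x in [0.1, 0.01, 0.001]:
    print(f"1/{x} = {f(x)}")

# A central second difference approximates F''(x); it jumps across x = 0
def second_diff(g, x, h=1e-4):
    return (g(x + h) - 2 * g(x) + g(x - h)) / h**2

print(second_diff(F, 0.01))   # ~ +2 just to the right of 0
print(second_diff(F, -0.01))  # ~ -2 just to the left of 0
```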

2

Big_Foot_7911 t1_j8p1jck wrote

Excellent, thank you for the correction and additional explanation

2

TFenrir t1_j8nyno3 wrote

>I am relatively uneducated on AI. I recently became interested after the introduction of ChatGPT and all the new AI art. My first and most significant reaction to all of this, which has taken absolute precedence in the last few months, is fear. Terror, rather. What does this all mean?

Honestly? Fair. If this is your first introduction, I can really appreciate the discomfort. If it helps, I find learning more about the technology under the hood removes some of the anxiety for a lot of people, as ignorance of something powerful leaves a lot of room for fear. That's not to say that fear is unwarranted, just that it can be mitigated by exposure.

> I’m currently a college student. Will I spend my entire adult life simply giving prompts to AI? Or will AI that prompts itself arrive soon, even before I graduate?

There is a lot of effort being made to remove the need for prompting altogether, to create true, somewhat autonomous agents who are "prompted" not just by a message sent to them, but by the changes in the real world in real time.

> I’ve done a lot of speculation and some research, and I am having a very hard time understanding the practical reasons why we as a species seem to be so adamant about creating this singularity.

Well, the reason is pretty straightforward - we want to "solve" intelligence, so that it can solve all other problems for us, quickly and fairly. That's the pie-in-the-sky dream many people are pursuing.

> From what I understand, we have no idea what will happen. I am horrified. I feel as if the future is bleak and dystopian and there is no way to prepare, and everything I do now is useless. This post is somewhat curiosity, and a lot of desperation. Why am I forced to reconcile with the fact that the world will never ever look the same, when the reasons for that entirely elude me? Is it in the pursuit of money? As far as I can see, money won’t matter once the singularity comes about. I am fucking terrified, confused, and desperate.

Like I said earlier, I appreciate why you are overwhelmed, but the world was never going to stay the same. Becoming comfortable with that... Discomfort, that uncertainty, is going to be a strength unto itself. If you can master that, I think the changing world will be a lot more palatable in the future

11

wastedtime32 OP t1_j8o00h2 wrote

This is what I keep hearing. Stuff about accepting change. But there is no historical precedent for this. This is the start of the exponential growth. The way I see it, I have every reason to be afraid and not one reason not to be. I am spending my parents’ life savings to get a degree that likely will not matter. My big problem is, what exactly are we expected to do once we “solve intelligence”? I LIKE the natural world. That’s all there is. It will never make sense to me. I don’t want to float around in a computer metaverse and be fed unlimited amounts of serotonin and never question anything or wonder or worry or feel any other emotion. That is all I know. And it is going to be taken away from me without my consent? This future of AI is inevitably totalitarian. Brave new world type shit. It’s real. It’s fucking real. And everyone around me is talking about internships and where they want to live and different jobs and stuff. My girlfriend thinks I’m crazy because this fear is all I talk about. She said everything will be okay and I’m just falling for the fear mongering. I don’t know what to do with myself. It is hard to find joy when all I think about is how EVERYTHING that gives me joy will be gone.

8

TFenrir t1_j8o1khg wrote

>This is what I keep hearing. Stuff about accepting change. But there is no historical precedent for this. This is the start of the exponential growth. The way I see it, I have every reason to be afraid and not one reason not to be. I am spending my parents’ life savings to get a degree that likely will not matter. My big problem is, what exactly are we expected to do once we “solve intelligence”? I LIKE the natural world. That’s all there is. It will never make sense to me. I don’t want to float around in a computer metaverse and be fed unlimited amounts of serotonin and never question anything or wonder or worry or feel any other emotion. That is all I know. And it is going to be taken away from me without my consent? This future of AI is inevitably totalitarian.

It's really, really hard to predict anything, especially the future. I get it. There is a sort of... Logical set of steps you can walk down that leads to your conclusion. But that is only one of many paths that are going to open up to us. You're right, it's all exponential, but I also think that means what the human experience can be is going to expand. Maybe we will diverge as a species. There is a great sci-fi book series (the Commonwealth Saga), and in one of its books they come across a species that seems to have fallen into this divide. Most of the species have left their physical bodies behind, but some of them never strayed from their farming, Amish-like lifestyle. My point is... I can imagine lots of different futures, and lots of them have a world where maybe more people can have the kind of lives they want.

>Brave new world type shit. It’s real. It’s fucking real. And everyone around me is talking about internships and where they want to live and different jobs and stuff. My girlfriend thinks I’m crazy because this fear is all I talk about. She said everything will be okay and I’m just falling for the fear mongering. I don’t know what to do with myself. It is hard to find joy when all I think about is how EVERYTHING that gives me joy will be gone.

I had this talk with my partner literally... Monday, this week. She's had to hear me talk about AI for the entire decade we've been together, and as things get crazier she asks me how I feel about it. If I'm freaking out. I just told her that I'm trying to live my life like the next ten years are the last years that I can even kind of imagine, that there is an event horizon that I can't see beyond, and worrying about what's beyond that line is just a source of infinite loops in my mind.

Instead I'm going to get some friends together and go out dancing. It's been a while since we've had the chance.

8

MrTacobeans t1_j8p1wwt wrote

I have sorta the same kind of fear response to the exponential growth of AI in just the last year. We've gone from "WOAH, this AI can convincingly have a conversation with a human and beat a world expert in games/challenges" to "holy shit, this AI chatbot can now help me with my job, give therapist-level advice (even if it's wrong), and generate images at 1% of the effort needed by a conventional artist" - and all of these things are improving not just on a quarterly basis anymore; some sort of SOTA-level model is released every other day.

It's alarming and it is a lot, but I think if AI doesn't just randomly decide that humans are a useless presence, we'll be fine. I see a world where even general-intelligence AI ends up aligned as a tool, where "bad actor" AI is combatted by competing "good actor" AI. I don't see society falling apart or a grand exodus of jobs.

I'm hoping AI turns into the next evolution of the internet, where we all just have an all-knowing intelligent assistant in our pocket. I can't wait for the day that my phone pings with a notification from my assistant with useful help like "Hey, I noticed you were busy and a bill was due so I went ahead and paid that for you! Here's some info about it:" Or "I know it's been a tough day at work but you haven't eaten, can I order you some dinner? Here's a few options that I know are your comfort foods: ".

The moment a dynamic AI model with ChatGPT-level or even greater intelligence can run on consumer hardware, stuff like that will become a reality. AI might be scary, but trying to think about the positive effects it could have really helps me cope with the nebulous unknowns.

2

Eduard1234 t1_j8pkiql wrote

Just a comment about the idea of having one AI supervise the other: I'm thinking that won't work. If the bad-actor AI is even slightly more advanced than the other AI, it will be uncontrollable. The chance that a bad-actor AI somehow figures out how to escape its confines and invent self-improvement unsupervised seems real to me.

1

wastedtime32 OP t1_j8p2nn1 wrote

Those things aren’t necessarily positive effects to me though. I don’t want to be able to consume all available information at the press of a button. What would be the point of knowing if you can’t actively learn?

0

MrTacobeans t1_j8qi0gg wrote

You just explained a Google search. At least in the short term, AI/chat bots are just proving to be a more consumable or more entertaining way of gathering information.

It's up to each person to decide what they want to learn without the crutch of technology. Even an expert AI will never replace the need for actively learning things. Jumping way back, even written language is a technology. For thousands of years humans have been figuring out how to compress knowledge and share it easily.

1

wastedtime32 OP t1_j8ri4rq wrote

Idk dude. Seems like a lot of people on this sub (and subsequently, in the tech world, at the forefront of these technologies) look at AI as a means to completely optimize all human functions and reduce any sort of meaning in anything. Seems to me a lot of these people feel alienated in modern society, and the way they think it will get better is by taking all the things that alienate them and making those things even stronger. Like the way Musk and SBF say they’ll never read a book because reading is useless. The game of life has already lost its meaning to so many in the modern age, and people can’t see WHY, so they wrongly think that accelerating into an even more alienating world is the answer. If it was up to the tech gurus, we’d all be neurologically implanted into the blockchain and never do anything other than optimize optimize optimize, produce produce produce. There is a reason most people at the forefront of these technologies tend to be neurodivergent. This is all just a capitalist wet dream, and soon enough all ethics and regulatory practices will be seen as enemies of the doctrines of “efficiency” and “optimization”, will serve no purpose, and will be ignored. People here love to paint me as some weird traditionalist, religious, conservative-values person. But they are so unaware of how they worship the idols of optimization and utility and efficiency. These are things that should be the goals of systems that supply human needs, yes, but they have their place. The idea that once we reach post-scarcity those in charge will have ANY incentive to help us in any way is entirely insane. It’s “evolution”! Following the same biological tendencies, AI is giving those in power the means, and the reason, to completely annihilate all others. And people here really think it will HELP us😂😂😂.

−1

Proof_Deer8426 t1_j8oxcjz wrote

AI will not solve all problems for us - most of our problems are already solvable. We could end homelessness tomorrow but we don’t - because that would contradict our society’s ideology. This technology will be owned by people that don’t want to solve the same kinds of problems that most people imagine they want solved. Mass production did not lead to the end of scarcity - most people in the world still live in poverty and spend most of their lives working for a pittance. If we ask an ai how to end poverty and it answers with economic redistribution and a command economy, that ai will be reprogrammed to give an answer that doesn’t upset the status quo.

2

TFenrir t1_j8p7obm wrote

>AI will not solve all problems for us - most of our problems are already solvable. We could end homelessness tomorrow but we don’t - because that would contradict our society’s ideology.

How could we solve homelessness tomorrow? Not to be rude, but statements like this feel like they are just coming from a place of ignorance and jadedness.

We have many many many problems in this world, and they are often very intertwined - and so far, every problem we have overcome, we have used our intelligence to do so.

> This technology will be owned by people that don’t want to solve the same kinds of problems that most people imagine they want solved.

Again. Jadedness. Do you know who is working towards AGI? What their ideologies are? Do you think the thousands of engineers who are putting their life's work into creating this... Thing, are doing so because they want to serve some nefarious, mustache twirling evil overlord? I think that you are doing yourself a disservice with such a myopic and simplistic outlook on society and humankind.

> Mass production did not lead to the end of scarcity - most people in the world still live in poverty and spend most of their lives working for a pittance.

The world today is probably in the best state it has ever been in, by most measures of human success. We have fewer people, as a percentage of the population, in poverty than ever before. We have blown past most of the goals we set for ourselves to end world hunger. The upward momentum of the world is extremely significant, and you can see this in all corners of the developing world. What are you basing your opinions on?

> If we ask an ai how to end poverty and it answers with economic redistribution and a command economy, that ai will be reprogrammed to give an answer that doesn’t upset the status quo.

Again, myopic.

2

Proof_Deer8426 t1_j8pbo9h wrote

I don’t mean to be rude but I think it’s naive to imagine that ai will not be used to reinforce the current power structures, or that those structures have benefited humanity. Jeremy Corbyn said that if he were elected, homelessness within the UK would be ended within weeks, and it is not an exaggeration to say that would be entirely possible. There are far more homes than homeless people, and we have the resources to build many more. We don’t, because it would disrupt the ideology of capitalism, which requires the threat of homelessness and unemployment in order to force people to work for low wages. Wages and productivity have been detached for decades now - ie wages have remained stagnant while productivity has increased exponentially. Ai will increase productivity, but without changing the economic system the benefit will not be to ordinary people but to the profits of the rich.

The upward momentum of the world you refer to is misleading. People like Bill Gates like to point out that enormous amounts of people have been lifted out of poverty in recent decades, trying to attribute this to the implementation of neoliberal economics. They always neglect the fact that these stats are skewed by the work of the Chinese Communist Party, which has lifted 700 million people out of absolute poverty - more than any government in history. That has nothing to do with the political trajectory that the West has been on, or its domestic economic paradigm - by which for the first time in centuries, the younger generations are significantly poorer and downwardly mobile compared to their parents.

I don’t know much about the ideology of the people working towards agi, I would be interested to know more about it though if you want to tell me. I do know that a lot of people interested in ai follow ideas like effective altruism, which is a philosophy that serves rather than challenges the status quo.

2

TFenrir t1_j8pe4w6 wrote

>I don’t mean to be rude but I think it’s naive to imagine that ai will not be used to reinforce the current power structures, or that those structures have benefited humanity.

How does it play out in your brain? Let's say Google is the first to AGI - this AI can do lots of really impressive things that can help humanity; solve medical questions, mathematics questions, can be embodied and do all menial work, and can automate the process of building and farming, finding efficiencies and reducing costs constantly until the cost is a fraction of today's cost.

How does Google use this to prevent people from benefiting? Do they also prevent all other non-Google AGI attempts? Give me an idea of what you are envisioning.

> Jeremy Corbyn said that if he were elected, homelessness within the UK would be ended within weeks, and it is not an exaggeration to say that would be entirely possible. There are far more homes than homeless people, and we have the resources to build many more.

So in this very UK-specific example, you imagine that the roughly 250k homeless could be housed in the roughly 250k empty homes. Would you want the government to just take those houses from whoever owned them and give them to the homeless? I'm not in any way against providing homes for the homeless, but can you not see how that could cause many negative side effects?

> We don’t, because it would disrupt the ideology of capitalism, which requires the threat of homelessness and unemployment in order to force people to work for low wages.

Or we don't because no one wants to spend that kind of money for no return. What happens when doing so becomes effectively free? Do you think the government and people would, like... ban efforts to build homes for the homeless?

> Wages and productivity have been detached for decades now - ie wages have remained stagnant while productivity has increased exponentially. Ai will increase productivity, but without changing the economic system the benefit will not be to ordinary people but to the profits of the rich.

A truly post AGI world would not have any human labour. It likely couldn't in any real way. How do you imagine a post AGI world still having labour?

> The upward momentum of the world you refer to is misleading. People like Bill Gates like to point to the fact that enormous amounts of people have been lifted out of poverty in recent decades, trying to attribute this to the implementation of neoliberal economics. They always neglect to point out that these stats are skewed by the work of the Chinese Communist Party, which has lifted 700 million people out of absolute poverty - more than any government in history. That has nothing to do with the political trajectory that the West has been on.

I'm African, how do you think Africa has fared in the last few decades when it comes to starvation, and economic prosperity? We don't even need to include China, I think basically every developing country in the world is significantly better off today than they ever have been, minus a few outliers.

You think China lifted its people out of poverty without capitalism? Do you think China is not a capitalist country? I'm not a fan of capitalism, but I'm not going to let that blind me from the reality - that the world is better off and continues to improve by almost every measurement of human success. It's not perfect, but so many people have an almost entirely detached view of the world, compared to what it was even 30 years ago.

Edit: for some additional data

https://ourworldindata.org/a-history-of-global-living-conditions-in-5-charts

2

Proof_Deer8426 t1_j8pkc2x wrote

“How does it play out in your brain?”

One of the problems with the command economies that socialist states like the USSR tried to implement was that the maths was simply too complex to make it work efficiently. This is an example of something ai could be enormously useful for. Obviously no western country is going to use ai for this, because it’s contrary to their ideological and economic systems. The use that technology is put to will follow ideology. When the Industrial Revolution happened many people imagined a utopian future where the increase in productivity would lead to humans having to work far less. But today, we still spend the majority of our lives at work, and if it wasn’t for the historical efforts of socialists and trade unionists then even the restrictions on working hours, and on children working, would not exist (there are states in the US even today that are trying to repeal child labour laws). Of course ai will have benefits in regard to medicine, farming and so on. Does that mean everyone will have access to medicine, or that the workers on farms will see any benefit? Technically that would be possible, but within the ideology of capitalism it will not occur. Homelessness, poverty and unemployment exist because they are necessary for the economic system to function, not because of a lack of resources or lack of a solution. The benefits of ai will be limited by and subsumed into the ideological system - a system designed to give power and luxury to a tiny few via enforcing deprivation on the majority.

“A truly post AGI world would not have any human labour”

Perhaps, but my point is only that benefits in production and material abundance predominantly do not flow down to the working class/average person but up into the profits of the rich. Capitalism as we know it could not continue in a world where labour is unnecessary, but without changing the relations of power, the new system that emerges will simply mirror the old one. I could envision - as one possibility - a sort of tech neo-feudalism, where a Universal Basic Income is paid in return for some kind of military or other public service. But this income will go straight into the hands of the owning class - to rent, and to various rentier schemes (“you will own nothing and you will be happy”, as the WEF put it). Of course this is only one scenario, but without changing the power relations, the system of deprivation for the masses and wealth and luxury for the few will remain regardless.

“You think China lifted its people out of poverty without capitalism”

No - I agree China used capitalism to do that. But it was a different kind of capitalism to what is used in the West - goal oriented, with a long-term vision and a materialist, Marxist outlook, which aims to use capitalism as a tool to develop productive forces and to the end of benefitting its citizens and ultimately transitioning to socialism. This is very different from the blind profit-seeking capitalism of the West.

1

wastedtime32 OP t1_j8p3dn1 wrote

This scares me even more. A utopia is impossible. The ruling class will use AI as a tool to in fact impose more suffering on the rest of us.

I don’t think a truly objective all knowing AI is possible because objectivity doesn’t truly exist, truth is a construct. It scares me that people will worship AI under the assumption that it has no biases, either one’s it developed on its own or imposed upon it by its creators.

1

Proof_Deer8426 t1_j8p91b7 wrote

I guess the problem I foresee is that ai should theoretically be of benefit to humanity by increasing its productive capacity - it will make many jobs redundant, and others far more efficient. And that could be a good thing, freeing humanity from all the restrictions imposed by economic necessity. The problem is that the ruling class aren’t actually motivated by greed for material wealth but by lust for power. And power within our economic system is dependent on deprivation - the wealthy are a class of people that own things, and via ownership are able to deprive and exploit others. Without deprivation and poverty their power would cease to exist. Since the technology will effectively also be owned by these people it will be used to support and sustain their power. How this will take shape is still unclear, but as the working class begins to lose the one form of power that it still has - the ability to withhold work - and the power of the ruling class is massively boosted by their control of ai, it seems like the future could be headed down a potentially nightmarish path.

Ideologically people may be inspired to think in a pseudo-objective way that they believe mirrors ai - you can already see this with the popularity of ideas like effective altruism, long-termism and the simulation ‘theory’. Anti-humanist ideas like eugenics and population control are likely to follow.

1

wastedtime32 OP t1_j8pajiu wrote

Yeah, this is exactly my fear about AI; I’m just not good at articulating it, so everyone here thinks I’m just saying I want everything to stay the same. With the impending ecological collapse and resource depletion, and the fall of globalization and the inevitable rise of fascist-adjacent, chauvinistic, isolationist, hyper-militarized states, this is about as bad a backdrop for introducing AI as I can imagine, but then again I’m not sure there will ever be an “optimal” circumstance for it. But I do think that this will all culminate in either a massive revolution or a dystopia. I just don’t see an in-between. If capitalism prevails into the post-scarcity world, we will be looking at the dystopia which many people here have confused for utopia.

Totalitarianism is coming soon, and to me AI is a vessel for it. It is the lock to that door, and it is in the hands of the ruling class already. There is a reason transhumanist and bioengineering ideas are more prevalent amongst the elite (think WEF): they know damn well most people will think of it as a means by which to accomplish equality and peace, but that is far from the case.

I guess from this post and the replies I’m learning that most of this AI sub has a hard-on for utopia and an ignorance of the political/economic implications. These reactions are exactly what the big tech developers want. Complacency. Surrender.

1

DesperateProblem7418 t1_j8sb2b1 wrote

>Well, the reason is pretty straightforward - we want to "solve" intelligence, so that it can solve all other problems for us, quickly and fairly. That's the pie-in-the-sky dream many people are pursuing.

I feel like the real reason is because we can. Humans want to satisfy their ego, cravings for power, and curiosity. I don't hate the WEF like a lot of conspiracy theorists out there, but Klaus Schwab recently stated in front of an entire audience of leaders and innovators that "He who masters AI will be the master of the world". It's all about power.

Maybe researchers actually do care to solve intelligence because they are curious and actually want to use it to help people, but the majority of other leaders and researchers want it because of the power it will bring them.

1

_gr4m_ t1_j8o2k2v wrote

Why do religious people want to go to paradise? It's the same question really: the same loss of humanity in exchange for an existence with no suffering and endless pleasure.

And yet people have dreamt about it since the dawn of time.

There is your answer: people feel like utopia might be a few years off, and we are really looking forward to seeing where it leads.

11

ftc1234 t1_j8q0oz8 wrote

Mankind has made its life really comfortable in the last 50 years. If you told anyone in the early 1900s that most people would work from home in 2020, they’d find it unbelievable. All this comfort has come from using machines to make human work easier. Now we are going into a state where many people aren’t even needed in the production cycle unless they bring a ton of technical skills. This is why we have so much more homelessness and hopelessness now than before. I believe that this gap between people who are productive in the new world and those who aren’t is going to keep increasing. What’s the solution? UBI is one of the solutions.

3

ComplicitSnake34 t1_j8qx9uh wrote

The issue with giving people money is that they can only spend it. Money as it stands doesn't have a use outside of satisfying people's needs. Sure, they can invest in capital, but every welfare system ever devised makes it a point to ensure recipients don't have enough to develop capital (a rather cruel system).

The only real "welfare" programs that have worked in the past just employ people in government. They're still living off taxpayer money (scary, I know) but are contributing back with their labor and their ideas of how systems should be run. The military, firefighters/police, and municipal jobs have lifted millions out of poverty and into the middle class. However, as it stands, most government jobs require a college education (when it's usually unnecessary), which has barred the impoverished from getting those opportunities. Instead, people who have the means to afford higher ed (the middle class and up) battle it out for those positions, which frankly aren't worth the time investment of attending a 4-year.

1

just-a-dreamer- t1_j8nwxey wrote

In one word the Singularity is EVOLUTION. The height of evolution.

In the big picture, the human race doubled its economic output and numbers every 5,000 years as hunters and gatherers.

In the agricultural age we doubled our numbers/output every 900 years.

In the industrial age productivity doubled every 40-50 years and global population every 50-70 years.

In the modern age global GDP doubles every 15-20 years, while the world population rose from 1.5 billion in 1900 to 8 billion in 2023.

The evolution of humanity gets faster and faster. We are more numerous, live longer, and enjoy, on average, an increasing material standard of living.

AI is the last invention of humanity that is needed for ultimate abundance in the physical world. Once an AI is created that surpasses all capabilities of a human being, every need and want a human can have can be taken care of.
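To make the arithmetic above concrete, here is a minimal Python sketch (an editorial illustration, not part of the comment; the 45- and 17-year figures are just midpoints of the quoted ranges) that converts each doubling time into an equivalent annual growth rate, showing the acceleration being described:

```python
# Convert quoted doubling times into equivalent annual growth rates.
doubling_times_years = {
    "hunter-gatherer era": 5000,
    "agricultural age": 900,
    "industrial age (productivity)": 45,   # midpoint of 40-50 years
    "modern age (global GDP)": 17,         # midpoint of 15-20 years
}

for era, t_double in doubling_times_years.items():
    annual_rate = 2 ** (1 / t_double) - 1  # r such that (1 + r) ** t_double == 2
    print(f"{era}: doubles every {t_double} years, about {annual_rate * 100:.3f}% per year")
```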

9

wastedtime32 OP t1_j8nyh3g wrote

And then what😂. This is schizophrenic. What meaning will life have when we have no needs to fulfill? The only way one can hold this perspective is if you are completely soulless and entirely devoid of any ethics or morality. That is what makes us human, and we want to throw it away, in favor of what exactly?

−7

DrMasonator t1_j8o4bb2 wrote

To preface - that has nothing to do with schizophrenia. Schizophrenia involves episodes of psychosis and hysteria (which I don’t believe my guy u/just-a-dreamer- was experiencing). The coolest part about life is that you can assign whatever meaning you want to it. You shouldn’t want that meaning to be handed to you on a silver platter.

I would argue right now you’re upset in part because it HAS been handed to you your whole life and you now find yourself grasping for a reason to keep it that way. You (probably) believe that your meaning comes from the path of working, raising a family, having some friends along the way, and then dying near loved ones somewhere around the age of 80 after a life of hard work. Maybe you’re super into service too, maybe that’s where you find your meaning.

In the end, none of that matters. The only reason it ever mattered to you was because you were told that it matters - be it through instinct or your peers and teachers. There is not necessarily a logical reason we must have the meaning granted to us be our final goal. Maybe, perchance, you enjoy collecting cards. Maybe that is what gives you meaning, collecting the best cards you can find. If that’s the case, that’s awesome! Some might judge you for it or call you strange, but who cares!? You found your own meaning. When the whole world is your playground, might as well make the best of it, no? I know some people like working, and they can continue doing so in a post-scarcity world. I know many people who don’t like working - now their time is free to do whatever!

It’s that change in perspective which is essential to understand why this is a good thing. We’re not soulless or devoid of ethics or morality, we’re just normal people who want something better.

8

wastedtime32 OP t1_j8oz15b wrote

Fair assessment, but no, I was not conditioned to value those things. Quite the opposite. I have grown to become a deontological thinker. To “think like a mountain” as Leopold put it. I see the interconnectedness of all things (scientifically, not mystically) and have not been convinced (yet) that we as humans have the capacity to overwill the premise of nature. I like progress. But tactical, logical, and beneficial progress. Financial incentive is at the very heart of the push for AI right now, there’s no way around it. I am not convinced that the desire for this particular future is not corrupted by the arbitrary notions of our current societal structures. The idea that this is natural progress comes from the assumption that progress as a product of market competition parallels the inevitable progress of species. I don’t think this is true. We have the capacity as humans to be self-aware. This is a gift that could mean we collectively decide to moderate our progress for the benefit of all people.

I guess what I’m getting at is, as long as these innovations are coming from massive private tech firms, I don’t trust their motives. The idea that this system we’ve created perfectly distributes money to those who best abide by the natural forces of the world is silly to me. It’s a coping mechanism for people who want to see certain changes for a certain future, without acknowledging the world as it is today isn’t ready to morph into that future.

1

TheSecretAgenda t1_j8o8cb3 wrote

Hedonism

3

wastedtime32 OP t1_j8p0nle wrote

Suffering is inevitable. Hedonism is an oxymoron. It is ignorant to the deeper truths of the universe.

−3

CravingNature t1_j8o8t8n wrote

> What meaning will life have when we have no needs to fulfill?

You have no interests other than sustaining your needs?

2

wastedtime32 OP t1_j8p0wpc wrote

An interest is in and of itself the fulfilling of a need. I am interested in something = I have developed a need to learn about it/explore it.

1

just-a-dreamer- t1_j8o4am4 wrote

Well, we are talking about decades of transformation. The Singularity will probably not arrive before 2040-2050, and may come even later.

Still, AI does not have to be that "smart" to gradually replace nearly all human labor before that point. As of now, humans can perform around 20,000 known tasks, and AI is rapidly catching up in every field to replace human labor.

That is human labor in the sense of "working for a living". Yet just because a human no longer works, the output in goods and services does not fall; it remains the same and is still increasing.

The loaf of bread produced with 2% of the population toiling in agriculture is even better now than the one produced when 80% of the population toiled in the fields.

The point of automation is to get rid of the very concept of "working for a living". Still, humans will have plenty of passions they can pursue. It will be possible to merge into the digital world at some point for sure.

1

Puppetofmoral t1_j8o5l3k wrote

There's still time until it happens. Just imagine what some leaders in the world can do with just a fraction of that AI power. I say there's a good chance we, our generation, will see the next great war before the singularity.

1

wastedtime32 OP t1_j8ozv1l wrote

I know. That’s one of the major reasons I am scared.

1

wntersnw t1_j8o5rwg wrote

Agonizing over things you have no control over is never a good idea. Best to just live your life and enjoy it, and try to take things as they come. There's no guarantee that you will lose the things you love. You might get even more choice and freedom as to how you live your life.

4

KIFF_82 t1_j8ny1s7 wrote

Humans have always dreamed about the singularity one way or another.

3

el_chaquiste t1_j8o60pj wrote

First, those feelings are normal. Experts have them, and if they didn't, they'd be fools.

We are witnessing a transformation in the dynamics of knowledge and intellectual content generation like we have never seen, and it will be followed by similar transformations in the physical space of things, which is always the most difficult to do. Knowledge is tolerant of slight imperfections (e.g. an auto-generated essay with some small factual errors won't immediately kill someone), while robots working in the real world aren't (e.g. a self-driving car can't make any mistake or it will crash).

Everything humans do that generates some knowledge will be disrupted. Graphic arts, movies and TV, architecture, science, literature, and yes, even software development, which seemed so far to be safe from disruption.

On the why we are pursuing this, it's complex, but I think it's because:

  • It's easy and it works. The technology to do this is surprisingly affordable nowadays.

  • We are free to do so. It can be done without any permission or regulation.

  • It provides good return on investment to those knowing how to exploit it.

  • We haven't seen all the ramifications yet, the kind of problems that might require reviewing the legality of it all. But the historical precedent is bad: we always act after the fact.

3

wastedtime32 OP t1_j8p0egt wrote

I understand what you’re saying. But I just don’t have faith in governing bodies to properly regulate it (because they are corrupted by the corporations that have a vested interest in deregulation), and I also know that in these unprecedented circumstances, there will be oversights and negative externalities that could likely be devastating.

1

RowKiwi t1_j8omv45 wrote

Here is a great quote from C.S. Lewis about fear of annihilation.

> “In one way we think a great deal too much of the atomic bomb. ‘How are we to live in an atomic age?’ I am tempted to reply: Why, as you would have lived in the sixteenth century when the plague visited London almost every year, or as you would have lived in a Viking age when raiders from Scandinavia might land and cut your throat any night; or indeed, as you are already living in an age of cancer, an age of syphilis, an age of paralysis, an age of air raids, an age of railway accidents, an age of motor accidents.’
>
> In other words, do not let us begin by exaggerating the novelty of our situation. Believe me, dear sir or madam, you and all whom you love were already sentenced to death before the atomic bomb was invented: and quite a high percentage of us were going to die in unpleasant ways.
>
> We had, indeed, one very great advantage over our ancestors—anesthetics; but we have that still. It is perfectly ridiculous to go about whimpering and drawing long faces because the scientists have added one more chance of painful and premature death to a world which already bristled with such chances… and in which death itself was not a chance at all, but a certainty.
>
> This is the first point to be made: and the first action to be taken is to pull ourselves together. If we are all going to be destroyed by an atomic bomb, let that bomb when it comes find us doing sensible and human things—praying, working, teaching, reading, listening to music, bathing the children, playing tennis, chatting to our friends over a pint and a game of darts—not huddled together like frightened sheep and thinking about bombs. They may break our bodies (a microbe can do that) but they need not dominate our minds.”
>
> ― C.S. Lewis

3

JVM_ t1_j8os8zs wrote

Gorgon - Tony Hoagland

Now that you need your prescription glasses to see the stars

and now that the telemarketers know your preference in sexual positions

Now that corporations run the government

and move over land like giant cloud formations

Now that the human family has turned out to be a conspiracy against the planet

Now that it’s hard to cast stones

without hitting a cell phone tower that will show up later on your bill

Now that you know you are neither innocent, nor powerful,

not a character in a book;

You have arrived at the edge of the world

where the information wind howls incessantly

and you stand in your armor made of irony

with your sword of good intentions raised—

The world is a Gorgon.

It holds up its thousand ugly heads with their thousand writhing visages

Death or madness to look at too long

but your job is not to conquer it;

not to provide entertaining repartee,

not to revile yourself in shame.

Your job is to stay calm

Your job is to watch and take notes

To go on looking

Your job is to not be turned into stone.

4

RowKiwi t1_j8p0qyl wrote

>Gorgon - Tony Hoagland

Wow that's appropriate. Poetry for the end times.

1

wastedtime32 OP t1_j8p1l7f wrote

The last sentence tells me Lewis would be against a utopian AI world. “Praying, working, teaching, reading, listening to music.” These are all things which, following the principles that led us to this point in humanity, are inefficient and wouldn’t exist after the singularity.

1

heavy_metal t1_j8s71d8 wrote

as someone who grew up thinking about the bomb, thank you for posting this.

1

AvgAIbot t1_j8o4f1l wrote

I’m optimistic, but not nearly as much as some people on this sub. I try to have a realist view regarding the singularity and AI in general.

Honestly, I can see shit getting out of hand quickly within the next 5 years. AI will progress faster than government legislation. Corporations will use that to their advantage and probably replace a good amount of the workforce. Even at 20% of jobs lost, that’s still a huge issue. I can see rioting and stealing happening as more people fall into the poverty class. Corporations care more about profits than people, and that’s not going to change.

If you are a college student, I’d highly recommend going into the medical field or possibly engineering. Any other degree, I feel, will be mostly useless for getting a well-paying job - especially business degrees, art/history, etc. I’ll probably get downvoted for saying that, but it’s just my opinion.

After a few years of craziness, I would hope the government adjusts and makes UBI available for everyone. Then things can start to get better.

2

wastedtime32 OP t1_j8ozs4d wrote

Thank you for incorporating a class analysis into your perspective. Everyone here seems to have the assumption that the way the world is constructed currently is NOT warped to favor certain people, and everything is affected by that. Yes, I am scared of the idea of a “utopia” run by super intelligent computers. I’m even more scared of this technology being used as a means to further extract resources from people who are not part of the ruling class. From the moment it was conceived, the world of tech was corrupted by the motivation to collect as much wealth as possible, which is in itself hierarchical and oppressive to most people. The idea that from this system can come a grand utopia which gives everybody all they desire is completely ignorant.

4

RowKiwi t1_j8o6ngx wrote

Another approach is to focus on family and friends, cultivating relationships, helping the people you care for, and hanging out with friends. That's what really matters in life.

You can't really affect anything coming, so there's no point in dread and fear. What comes will come. It's fatalistic, but a wholesome kind of fatalism, with a life filled with good people you have connections to.

2

Puppetofmoral t1_j8o40sn wrote

I understand that the rapid pace of technological change and the emergence of AI can be overwhelming and even frightening. It's natural to have concerns about the future and the impact that technology will have on society.

Regarding the singularity, it's important to understand that it is still a matter of speculation and debate among experts in the field. While some believe that the singularity could happen in the future, others believe that it is unlikely or that its effects will be more limited than often portrayed.

Regarding the pursuit of AI, it is driven by a combination of factors, including the desire to solve complex problems, improve efficiency, and drive innovation. However, it is also true that some individuals and organizations are motivated by the potential for financial gain.

It's important to remember that the future is shaped not only by technological advances, but also by the choices and actions of individuals and society as a whole. As a college student, you have the opportunity to study and learn about AI and its potential implications, and to engage in discussions and debates about how it should be used and regulated. You can also use your education and skills to contribute to positive social and technological change, and to help ensure that AI is used for the benefit of humanity.

It's also important to take care of your mental health and well-being during this time. If you're feeling overwhelmed or distressed, consider reaching out to a trusted friend, family member, or mental health professional for support.

1

ajarOfSalt t1_j8o5bt7 wrote

Nobody knows, and it’s best to keep an open mind, because whether it’s “utopia” or “dystopia” it’ll definitely be revolutionary.

1

dwarfarchist9001 t1_j8oqtw9 wrote

The word singularity comes from math, where it means a point at which a function stops being well-behaved (i.e. its value there is undefined or blows up to infinity). Things like evolution, immortality, and fusing together into a hive mind have nothing to do with it, at least not inherently.

The idea of the technological singularity comes from the observation that the rate of new technological advances is getting faster over time, such that our total level of technology is growing hyperbolically. If human technology continues growing at the current hyperbolic rate, or something close to it, then relatively soon we will reach the singularity on the graph of technological innovation: the point in time where the rate of new innovation becomes infinite.
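For intuition, here is the standard toy model of a finite-time singularity (an editorial aside, not part of the comment): if a quantity grows in proportion to its own square rather than to itself, it does not merely grow exponentially forever but blows up at a finite date.

$$\frac{dx}{dt} = k x^{2}, \qquad x(0) = x_0 \;\Longrightarrow\; x(t) = \frac{x_0}{1 - k x_0 t},$$

which diverges as $t \to 1/(k x_0)$. That finite blow-up time is the "singularity" on the graph; whether technological capability actually follows such a curve is, of course, exactly the speculative part.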

The general assumption of this subreddit is that the technological singularity occurs as the result of the creation of a self improving AI which will then proceed to rapidly create better and better versions of itself. The hope is that if AI is controllable or at least benevolent it could bring about a golden age where all science is known and essentially all economic activity is automated.

1

wastedtime32 OP t1_j8p1tdm wrote

The idea that it can be controlled is a pipe dream.

1

iNstein t1_j8pd289 wrote

Controlled, no. Designed to act in our interest, yes.

1

loopuleasa t1_j8ouflz wrote

it's just evolution as usual, with yet another new replicator

we had genes

we had memes (cultures)

now we will have code replicators in silico


evolution takes its course, the old dies, the new lives

1

wastedtime32 OP t1_j8p4fpk wrote

I get a feeling a good amount of people on this sub really like Ayn Rand.

1

agonypants t1_j8p5n87 wrote

First, there's no reason to be either afraid or (too) optimistic. We cannot ultimately control the future - only attempt to influence outcomes. I would not say we are pursuing the singularity, but rather that, so long as computing progress continues, it is inevitable. The forces of capitalist competition will ensure that computing efficiency and capabilities continue to develop. Ultimately, AI systems will become self-improving.

The hope is that we can guide all of this to a good outcome. And the good outcomes should be overwhelmingly positive. My hope is that:

  • The economy can be largely automated
  • The economic pressure to hold a 40 hour/week job is eliminated
  • That basic human needs (food, clothing, shelter, healthcare, education) become freely available

If and when these things occur, humanity will be truly free in a way that we have not been since before the Industrial Revolution (at least). We will be free to do what we like, when we like. If you want to do nothing and accept the basic, subsistence level benefits, you'd be free to do that. If you want to pursue art, you'd be free to do that. If you want to help restore the environment or just your community, you'd be free to do that. If you want to pursue teaching, childcare, medicine, science, space exploration, engineering - you'd be free to do any (or all!) of those.

The negatives could be just as disruptive, or even catastrophic. The worst outcome I can conceive of is this: AI leads to absolute and total income inequality. The wealthy control the AIs which drive a completely automated economy. The "elite" group in control share none of the benefits with the remainder of human society, thus casting 90+ percent of people into permanent, grinding poverty. Eventually those in control decide that the remainder of humanity is worthless and begin to fire up the killbot armies.

I remain optimistic. I don't seriously believe that anyone (who is not insane) would desire that kind of negative outcome. So long as capitalism continues to exist, the elites will always need consumers - even in an automated economy. At the same time, there is little to nothing I can do to control the outcome either way. So, there's no point in stressing about it. Live your life, let your voice be heard on important topics and make peace with the fact that there are things beyond our control.

1

wastedtime32 OP t1_j8p6c9b wrote

This to me really ignores all the other influences capitalist competition will have on this hypothetical world. AI will be better at art than us. AI-generated art will sell better than human art. There will be little incentive to do anything other than consume. I doubt that in this hypothetical world it would be made at all accessible to pursue things which might remind humans of distinctly human abilities and feelings, in a world where our time will, rather than being freed up, be dedicated to consuming rather than producing. The ruling class will never let us have free time unless we are producing for them in some way.

Post-Scarcity Capitalism is a dystopia. There’s no way around it.

0

DukkyDrake t1_j8pdsc0 wrote

You might be mixing certain concepts.

https://en.wikipedia.org/wiki/Technological_singularity

No one is really working towards achieving the singularity, but it may come about as a consequence of the pursuit of individual- and societal-scale goals.

I think you might just be worried about technological unemployment; you don't need a singularity event for that. Technological unemployment might be dystopian depending on your local society's cultural values.

1

riani123 t1_j8pqqwu wrote

I feel the exact same way. Currently a college student and got interested in AI because of ChatGPT, and then learning about the singularity and the predicted rapid progression of AI has me spooked. I am pretty scared and sometimes even lose sleep over it. I feel like what I am doing right now is useless too, because if AI is eventually going to take it all away, then what is the point of learning every day? I also struggled to wrap my head around why we are pursuing this technology when we have little idea of its impact, and because I also like the "natural" world right now. However, in my day to day, in my best attempt to alleviate fear, I do and remind myself of these few things:


  1. Spending time with those around me as much as I can - If I'm around people, it stops me from spiraling into an existential crisis about the future and reminds me of the beauty of humanity, as cliche as it sounds. It keeps me grounded really well!

  2. Following AI experts/leaders in the space who focus on AI in the near future > long term: We don't know what AI will look like in the long term, however many predictions there are about the singularity etc. Following people who discuss AI, and in general the change in technology in the near future, in a way that's not "hype" but rather practical implications, methods, ethics, and education is nice. It's less scary to think short term than long term because you can control things better in your life.

  3. We don't know what's going to happen: We really don't know what can happen down the road, and there are so many diverse perspectives on how things turn out. The best thing to do is just to have an idea and keep up with the space so you don't get left behind, but not to get trapped by it. AI will change the world whether we like it or not, but how remains to be seen. And since we don't know about the "how", it's not worth losing sleep over. There are things I can do in my present day that I don't get done because I just have an existential crisis about the future, and it hurts. But really hammering into my head that "I don't know what's gonna happen" has helped me stay away from spiraling constantly.

I know these may not be the most helpful. I also have little faith about how positive an impact AI might have, given the state of the world, but I try to be optimistic, because in the present day optimism is better for my mental well-being overall than being a pessimist. If I'm constantly negative or fearful, in the present day that harms what's happening around me. It's complicated. I wish you all the best, and it's nice to know that someone shares the same sentiments. The truth is we don't know what's gonna happen, but we can learn a bit about it and mostly just exist in the present.

1

Cass9840 t1_j8px5f5 wrote

Something to come after us humans when we're gone and can't feed ourselves anymore.

1

Catablepas t1_j8q7zyy wrote

We aren’t pursuing it. It is an eventuality. We are approaching it.

1

heavy_metal t1_j8sb165 wrote

I think AI will continue to be a tool. We would have to purposefully give it a lizard brain for it to be worried about its own survival and develop its own goals, which would likely be bad for us. I don't see why it can't stay more frontal lobe and just be an aggregator of knowledge, concepts, and ideas, and continue to synthesize new knowledge for us. I can imagine just talking to an all-knowing entity for all my needs that effortlessly replaces all of government and traditional commerce.

1

wastedtime32 OP t1_j8sdah8 wrote

No such thing as “all-knowing”. It will always reflect a certain bias, so I’d hope AI would never replace government.

1

heavy_metal t1_j8sp9q4 wrote

Well, bias is a human thing, born of our inability to know people beyond labels outside our small tribes. The generalizations we make are trained into AIs now, but the nature of training and how it reacts to bias will have to change for the better (a toy sketch of how that happens follows below).
As for all-knowing, imagine interfacing with a single entity that knows every other person to some degree, has all human knowledge, and can renew your car registration with an ask. It may turn out to be the ultimate fair judge and arbiter of human affairs, with the goal of minimizing suffering and maximizing happiness.
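A minimal sketch of that "trained in" point, using invented loan data and a toy frequency model (the groups, numbers, and function names are all made up for illustration, not taken from anywhere in this thread):

```python
# Toy illustration of "bias in, bias out": a model that only learns the patterns
# in historical decisions reproduces whatever skew those decisions contained.
# All data below is invented.
from collections import defaultdict

# Hypothetical past loan decisions: (group, qualified, approved)
history = [
    ("A", True, True), ("A", True, True), ("A", False, True), ("A", True, True),
    ("B", True, False), ("B", True, True), ("B", False, False), ("B", True, False),
]

# "Training": tally historical approval rates per group.
counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
for group, _qualified, approved in history:
    counts[group][0] += int(approved)
    counts[group][1] += 1

def predicted_approval_rate(group: str) -> float:
    approvals, total = counts[group]
    return approvals / total

for group in ("A", "B"):
    print(f"group {group}: predicted approval rate {predicted_approval_rate(group):.0%}")
# Prints 100% for group A and 25% for group B, even though both groups contain
# the same proportion of qualified applicants in the data above.
```

Whether a real system ends up fairer than its training data comes down to exactly the kind of change in the training process the comment is hoping for.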

1

wastedtime32 OP t1_j8tqmdj wrote

Yes, I'm imagining it, and it sounds dystopian. It sounds like a utilitarian's heaven, but I'm not a utilitarian, and I do not want to be forced into living in that world.

1

RiotNrrd2001 t1_j8tgtrp wrote

Paradigm-breaking shifts were something of the norm for most of the twentieth century. We've been at the top of that S-curve for a lot longer than I'd like. I prefer the rapid pace of change that I myself witnessed between the 1960s and roughly mid-September of 2001, at which point "progress" became more "refinement of what we already have" (smartphones being nothing more than small computers, for example) than groundbreaking new additions.

Finally we have something new. I'm happy to see it. Right now our culture seems to have a lot of problems, and to be honest, I don't see how things can change without things changing, if you understand what I'm saying. We can't improve things AND keep everything the same at the same time. The AI stuff is the face of change, but again... that's something those of us in the older generations were kind of used to and haven't been seeing a lot of for a while. "Apps" don't count.

Don't be a square, man. The times they be a'changin'.

1

wastedtime32 OP t1_j8trcmc wrote

I don't want a static world. But even at my young age I've become jaded, and I know how this technology will be exploited. The vision of those who are creating it will not be how it turns out. The "problems" you refer to have a lot to do with modern technology. I'm not necessarily a decelerationist, but I don't see how diving in even deeper is going to help us. I agree this is a reckoning point in human history, but I think we need to STOP going in the direction we have been and find a new one. AI is the next step on that road, and I see nothing but trouble. It all seems so misguided to me. But then again, all this tech is simply the product of market competition. It's designed for a certain task; it's in its nature. I say fuck that nature, we need to embrace the real one. That doesn't mean primitivism. It means we use tech to pursue human desires, but within an ethical framework compatible with the natural world.

1

RiotNrrd2001 t1_j8tuud6 wrote

What you've said is true about ALL new technologies.

More people were killed by motorcars than by buggies; obviously the internal combustion engine was a mistake. Airplanes can crash from great heights: mankind obviously wasn't meant for altitudes in excess of the nearest climbable mountain, and ALSO: bombs. And no one was ever electrocuted until mass electrification occurred; piping lightning directly into our homes is just asking for fires.

Movies are awesome! Also, they can be used for mass propaganda. As can that dang printing press. No printing presses, no Mein Kampf, so maybe that ought to be looked into.

My point is that yes, all new technology has a potential for causing damage and for being misused. We should definitely be conscious of those things. But that doesn't mean we need to stop development. What we need to be is aware.

1

PoliteThaiBeep t1_j8u7809 wrote

If you want to get a good understanding of this whole thing read the following:

"Sapiens", "Home Deus" by Yuval Noah Harari

"Life 3.0" by Max Tegmark

"Human Compatible" by Stuart Russell

"Superintelligence" by Nick Bostrom

"A thousand brains" by Jeff Hawkins, Richard Dawkins

And of course on this subreddit you must at least glance at "The Singularity Is Near" by Ray Kurzweil

There's a bunch of optimists, pessimists and everything in between mixed in here for a good balanced perspective.

All of these are insanely smart people, and what they are saying deserves every bit of your attention.

You can also get a short version of all of the above by reading Tim Urban's blog post about superintelligence on waitbutwhy dot com

1

MasterFruit3455 t1_j8v2203 wrote

Because even the smartest monkey is still a moron.

1

psichodrome t1_j8vb4ym wrote

We are "pursuing" it the same way we pursued electricity and the wheel. Where progress and fate take us is a wild guess, but it is inevitable.

1

TheSecretAgenda t1_j8o7ozf wrote

We are pursuing it because there are huge potential profits in it, and Western governments are afraid that if they don't do it, the Chinese or Russians could get it first, putting the West at a disadvantage.

There are historical precedents. The taming of fire, agriculture, the printing press, steam engines, learning how to use electricity, the internal combustion engine, flight and electronic computers. All changed society in massive ways. Humanity adapted.

You should read The Singularity Is Near, a fairly rosy take on the Singularity. You should also read Our Final Invention, a less optimistic view of the Singularity.

Overall, I would say relax. The Singularity may happen tomorrow, but it is more likely to occur in the 2060s or later.

0

No_Ninja3309_NoNoYes t1_j8ozkmh wrote

I am not a firm believer in the literal technological singularity. Moore's law and our current knowledge of the human brain don't really support it. Quantum computers might change that.

But if I look at my friend Fred, he is excited. I'm also excited, but not as much. There was a German company that we thought would release a ChatGPT-like bot in 2020, but that didn't happen. So it looks like you can't give it too much freedom. You have to guide it. This makes generating code currently the best option, because it follows rigid rules. Will this code lead to a self-reinforcing feedback loop? There's no way to tell.

0

icepush t1_j8nul8g wrote

Basically, it means all of us are going to merge into a single entity.

−4

TFenrir t1_j8nxpb8 wrote

That's a very very specific prediction, maybe not the best thing to introduce someone new to

6

icepush t1_j8nxype wrote

I only realized it last month myself :P

0

wastedtime32 OP t1_j8nv7au wrote

And why do we want this?

2

icepush t1_j8nvrhy wrote

  1. Curiosity. Everyone will know everything. You will have the answer to every question and wonder you ever had. You will know everything about every person you have ever wondered about. Everyone will know everything there is to know about you.

  2. Power. Everything that is possible will be possible for everyone. Everyone will have the ability to accomplish anything and everything.

  3. Convenience. You will not have to do things like write or speak; rather, you will simply think your thought and everyone will know it immediately. You will know everyone else's thoughts immediately as well.

These are just some reasons. I think there are people who will choose not to merge and disappear from existence.

Note that I am not saying these are things people WANT to do; rather, they will do it because everyone else is doing it.

1

wastedtime32 OP t1_j8ny3on wrote

But, why? Do you believe there is some supernatural force that will inevitably lead to this no matter what? This could very well mean the end of the human race. It's a pretty easy picture to paint. Everything about this is dystopian. I have a hard time imagining people who desire this kind of world who haven't become completely socially alienated from the modern world, so they have no connection to it. I myself have felt very alienated at times, but I still see the value in maintaining certain things. If the pioneers of this tech really valued humanity, they would understand the concept of sustainability. There are certain sacrifices we have to make in order to make sure we survive and thrive. To create an existential threat to our own existence out of pure curiosity seems like the most diabolical and extreme possible form of cognitive dissonance. I'm not ignoring all the "good" AI has brought and can bring us. But it's a fact that ethics is the last fucking thing these people are concerned with. Is it about money? I find the entire premise to be absolutely absurd and unhinged and psychotic. Whatever happened to a democratic society? The majority of people do not want this future. But the people who do are more powerful than the rest. It's a sick fucking game. It's the same game that got us to the point of ecological destruction and mass extinction and eventual resource depletion.

I. Just. Don’t. Get. It.

1

icepush t1_j8nz4cx wrote

It is the forces of economics: competition, cooperation, and scarcity.

This has been the ultimate fate of humanity since the first caveman invented fire.

I only realized this in early January when I was trying out ChatGPT. It took me a bit of time to fully digest and pontificate on the implications.

Imagine you have some kind of trade - like you are a homebuilder. Somebody invents a tool that allows people who have it to build a home 20% faster. Well, everyone who doesn't use that tool slowly gets driven out of business by the people who do (a toy sketch of how that advantage compounds follows at the end of this comment).

It is the same idea with all of the technological advancements - fire, weapons, armor, transportation, computers, the internet, smartphones, etc.

If you are trying to schedule a meeting with somebody, and your choices are to either send them a text message and wait for them to respond, or just merge into a single unified entity with them so you can schedule, begin, and complete your meeting instantly, the second one will win out over time.
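A rough illustration of why that happens, as a toy simulation. The model, the reinvestment rule, and all the numbers except the 20% figure are invented for the sketch; it is not anything proposed in the thread:

```python
# Toy model: two builders compete for a fixed pool of contracts. One adopts a
# tool that makes them 20% more productive. Contracts are awarded in proportion
# to effective capacity, and next year's capacity follows the share of work won
# this year. All numbers are made up purely to show how the edge compounds.

def simulate(years: int = 15, speedup: float = 1.20, contracts: int = 100) -> None:
    adopter, holdout = 1.0, 1.0  # starting capacity, in relative units
    for year in range(1, years + 1):
        cap_adopter = adopter * speedup   # tool boosts effective capacity
        cap_holdout = holdout             # no tool, no boost
        share = cap_adopter / (cap_adopter + cap_holdout)
        # Reinvest: next year's capacity is proportional to this year's share of work.
        adopter, holdout = 2 * share, 2 * (1 - share)
        print(f"year {year:2d}: adopter wins {share:5.1%} of {contracts} contracts, "
              f"holdout {1 - share:5.1%}")

if __name__ == "__main__":
    simulate()  # the holdout's share shrinks every year and never recovers
```

The exact parameters don't matter; any persistent productivity edge compounds in the same direction, which is the point being made here.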

1