
crua9 t1_iz9xt9h wrote

^ This

See, back in the day people were literally anti-writing because they thought it would make people forgetful. People were anti-car, anti-TV, and so on. People were against books, computers, the internet, and now crypto. And like with everything, there are people who are anti-AI.

Do they have worries? Yes. But are they founded? Not really. They think AI will kill us, but they never ask why. Do you go out of your way to kill random bugs and germs for no reason? Same here.

Now, should people worry about it taking their jobs? Yes. But that is a good thing. There needed to be an economic shift. It hasn't happened yet, but there need to be changes. Too many hard-working people can't really survive on what they have, and they've basically turned into lifetime slaves. AI is likely to fix this.

46

KidKilobyte t1_izc5evg wrote

Nick Bostrom would like a word with you.

I will include the following quote from Contact:

We pose no threat to them. It would be like us going out of our way to destroy a few microbes on some ant hill in Africa.

Interesting analogy. And how guilty would we feel if we went and destroyed a few microbes on an ant hill in Africa?

7

mootcat t1_izh3vne wrote

It isn't that our demise is particularly desired; it's that it would ultimately be an inconsequential side effect of an AI exponentially scaling toward an objective.

Max Tegmark (I believe) compares it to us worrying about destroying an ant colony while constructing a highway. It isn't even a consideration.

1

crua9 t1_izhiomb wrote

Here is a back and forth:

Person A: Max Tegmark (I believe) compares it to us worrying about destroying an ant colony while constructing a highway. It isn't even a consideration.

Person B: Do you like your fridge, or should we go back to ice boxes? Keep in mind fridges save lives because you can store medical supplies in them.

Person A: wants fridges over ice boxes

Person B: The ice trade was one of the biggest industries in the world. What killed it was the fridge.

So pick: kill one of the biggest industries humanity had ever known, but in return countless people get access to medicine, food can reach more places, and so on. Or keep that industry and all the people working in it, but have everyone who is alive today thanks to the fridge be dead.

There are always outcomes to every choice, sometimes good and sometimes bad. But a simple risk assessment shows way more lives saved and way more good coming with AGI. And like the fridge, even if you delay it, it will still come out at some point.

1

mootcat t1_izhwgxu wrote

Are you not aware of the existential risk that AGI/superintelligence poses?

I'm obviously pro-AI, but it's also the greatest risk to humanity and to all of life.

1

crua9 t1_izhwoib wrote

Yeah, it is a risk. But this is my viewpoint:

  1. It makes our lives better (good)
  2. It doesn't really change anything (whatever)
  3. It makes things worse (well, I guess now is a good time to die)
  4. It kills us all (we all die one day anyway, and it isn't like my life is getting better)

1