BigZaddyZ3 t1_j8ggt7j wrote

No, it truly doesn't… you're basically saying that we should risk 100% of humanity being wiped out in order to possibly save the 0.84% of humans who are gonna die of completely natural causes each year.

2

SoylentRox t1_j8ghast wrote

I am saying it's an acceptable risk to take a 0.5 percent chance of being wiped out if it lets us completely eliminate deaths from natural causes 1 year earlier.

Which is going to happen: someone will cure aging (assuming humans are still alive and still able to accomplish things). But doing it probably requires beyond-human ability.

2

BigZaddyZ3 t1_j8giqee wrote

But again, if a misaligned AGI wipes out humanity as a whole, curing aging is then rendered irrelevant… so it's actually not worth the risk, logically. (And aging is far from the only cause of death, btw.)

3

SoylentRox t1_j8gj6go wrote

It's the cause of 90 percent of deaths. But obviously I implicitly meant treatment for all non-instant deaths, plus rapid development of cortical stacks or similar mind-copying technology, so that at least friends and loved ones wouldn't lose those killed instantly.

And again, I said relative risk. I would be willing to accept an increased risk of all of humanity dying, up to a 0.80 percent higher chance, if it meant AGI 1 year sooner. 10 years sooner? 8 percent extra risk is acceptable, and so on.

Note I consider both outcomes "natural": humans dying of natural causes and a superior intelligence killing everyone. So all that matters is the relative risk.
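To make the arithmetic concrete, here's a rough sketch of the break-even calculation; the 8 billion population figure and the assumption that the risks trade off linearly in expected lives are my simplifications, not established facts:

```python
# Rough break-even arithmetic for "X percent extinction risk per year of
# earlier AGI". Assumptions (mine, for illustration): world population of
# 8 billion, a 0.84% annual death rate from natural causes, and risks
# that trade off linearly in expected lives.

POPULATION = 8_000_000_000
NATURAL_DEATH_RATE = 0.0084  # fraction of humanity dying per year

def expected_deaths(years_earlier: float, extinction_risk: float) -> tuple[float, float]:
    """Expected deaths avoided by earlier AGI vs. expected deaths from the extra risk."""
    deaths_avoided = POPULATION * NATURAL_DEATH_RATE * years_earlier
    deaths_from_risk = POPULATION * extinction_risk
    return deaths_avoided, deaths_from_risk

# AGI 1 year sooner at 0.8 percent extra extinction risk:
avoided, risked = expected_deaths(1, 0.008)
print(f"avoided: {avoided:.3g}, risked: {risked:.3g}")
# avoided: 6.72e+07, risked: 6.4e+07  -> roughly break-even
```

At roughly equal expected deaths the tradeoff is a wash, which is the sense in which ~0.8 percent extra risk per year of acceleration is the break-even point.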

1

BigZaddyZ3 t1_j8gjv3o wrote

What if AGI isn't the panacea for human life that you seem to assume it is? What if AGI actually marks the end of the human experiment? You seem to be under the assumption that AGI automatically = utopia for humanity. It doesn't. I mean yeah, it could, but there's just as much chance that it could create a dystopia as well. If rushing is the thing that leads us to a dystopia instead, will it still be worth it?

5

SoylentRox t1_j8gk6d1 wrote

How dystopic? An unfair world, but where everyone gets universal health care and food and so on? One that's not super great, like the video games with lots of habitation pods and nutrient paste? Or an S-risk (a suffering risk)?

Note I don't "think it is". I know there is a range of good and bad outcomes, and "we all die" or "we live but are tortured" fit in that area of "bad outcomes". I am just explaining the percentage of bad outcomes that would be acceptable.

Delaying things until the bad-outcome risk is 0 is also a bad outcome: the risk never actually reaches 0, and every year of delay, another 0.84 percent of humanity dies of natural causes.

1

BigZaddyZ3 t1_j8gl7j2 wrote

> Delaying things until the bad-outcome risk is 0 is also a bad outcome.

Lmao, what? That isn't remotely true, actually. That's basically like saying "double-checking to make sure things don't go wrong will make things go wrong". Uh, I'm not sure I see the logic there. But it's clear that you aren't gonna change your mind on this, so, whatever. Agree to disagree.

3

SoylentRox t1_j8gx69g wrote

Right. I know I am correct and simply don't think you have a valid point of view.

Anyways, it doesn't matter. Neither of us controls this. What is REALLY going to happen is an accelerating race, where AGI gets built basically the first moment it's possible at all. And this may turn into outright warfare. The easiest way to deal with a hostile AI is to build your own controllable AI and bomb the hostile one.

0

BigZaddyZ3 t1_j8gy1hz wrote

> Right. I know I am correct and simply don't think you have a valid point of view.

Lol, nice try, pal, but I'm afraid you're mistaken.

> Anyways, it doesn't matter. Neither of us controls this. What is REALLY going to happen is an accelerating race, where AGI gets built basically the first moment it's possible at all. And this may turn into outright warfare. The easiest way to deal with a hostile AI is to build your own controllable AI and bomb the hostile one.

Finally, something we can agree on at least.

2

SoylentRox t1_j8gydo4 wrote

>Finally, something we can agree on at least.

Yeah. It's quite grim, actually, if you think about what even just sorta-useful AGI would allow you to do. By "sorta useful" I mean "good enough to automate jobs that ordinary people do, but not everything": so mining, trucking, manufacturing, and so on.

It would be revolutionary for warfare. The reason you can't win a world war today is that you can't dig enough bunkers to house your entire population in separate bunkers (limiting the damage any one nuke can do), and you can't build enough anti-missile systems to stop most of the nuclear bombardment from getting through. Automated labor changes both of those.

And then, well, you fight the whole world. And win. "Merely" having AI able to do ordinary people's tasks gives you essentially exponential production capacity; the only real limit is how much land you have once the entire country is covered in factories.
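As a toy illustration of "exponential until land-limited" (every number here is invented for the sketch, not a real estimate):

```python
# Toy model: production capacity doubles each year as factories build
# more factories, until it hits a hard ceiling set by available land.
# All numbers are made up.

capacity = 1.0          # initial capacity, arbitrary units
growth_factor = 2.0     # doubling per year
land_cap = 1_000_000.0  # ceiling imposed by land area

for year in range(1, 50):
    capacity = min(capacity * growth_factor, land_cap)
    if capacity >= land_cap:
        print(f"land-limited after {year} years")  # log2(1e6) ~= 19.9 -> 20 years
        break
```

The point of the toy model is just that under doubling, even a million-fold ceiling is reached in about 20 cycles.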

Note that by "you" I don't necessarily mean the USA. With weapons like this, anyone can be a winner.

2