Submitted by kdun19ham t3_111jahr in singularity
SoylentRox t1_j8fyxct wrote
Reply to comment by jamesj in Altman vs. Yudkowsky outlook by kdun19ham
Have you considered that delaying AGI also has an immense cost?
Each year, the world loses 0.84% of everyone alive.
So if delaying AGI by 1 year reduces the chance of humanity dying by 0.5%, for example, it's not worth the cost. An extra 0.84% of people have to die while more AGI safety work is done, people who wouldn't have died if the advances in medicine and nanotechnology had been available 1 year sooner, and the expected value of avoiding an extra 0.5% chance of humanity being wiped out is not enough gain to offset that.
(since "humanity wiped out" is what happens whenever any human dies, from their perspective)
Note this is true even if it takes 100 years to get from AGI to aging meds and nanotechnology, because they still arrive 1 year sooner.
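A rough sketch of that expected-value comparison, using the figures above (0.84% annual deaths, a 0.5% added extinction risk) plus an assumed world population of 8 billion, which is not stated in the thread:

```python
# Back-of-the-envelope comparison of the two costs, per year of delay.
# Assumptions: world population of 8 billion (not stated in the thread),
# 0.84% annual death rate, and a hypothetical 0.5% extinction risk added
# by not delaying AGI one more year.

population = 8_000_000_000
annual_death_rate = 0.0084        # fraction of everyone alive who dies each year
extra_extinction_risk = 0.005     # added chance of humanity being wiped out

# Expected deaths caused by delaying AGI (and its medical spin-offs) one year
expected_deaths_from_delay = annual_death_rate * population

# Expected deaths from the extra extinction risk taken on by not delaying
expected_deaths_from_risk = extra_extinction_risk * population

print(f"delay one year:  ~{expected_deaths_from_delay:,.0f} expected deaths")
print(f"extra 0.5% risk: ~{expected_deaths_from_risk:,.0f} expected deaths")
# Under these numbers, delay costs more (~67.2M vs ~40M expected deaths),
# which is the arithmetic behind the comment above. Note this counts only
# people alive today, which is exactly what the replies below dispute.
```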
BigZaddyZ3 t1_j8gey2c wrote
This doesn't really make sense to me. If delaying AGI by a year reduces the chance of humanity in its entirety dying out by even 0.01%, it'd be worth that time and more. 0.84% is practically the cost of nothing if it means keeping the entire human race from extinction. Your comment is illogical unless you somehow believe that every person alive today is supposed to live to see AGI one day. That was never gonna happen anyways. And even from a humanitarian point of view, what you're saying doesn't really add up. Because if rushing AI results in 100% (or even 50%) of humanity being wiped out, the extra 0.84% of lives you were trying to save mean nothing at that point anyways.
Frumpagumpus t1_j8gqw8n wrote
> If delaying AGI by a year reduces the chance of humanity in it’s entirety dying out by even 0.01%, it’d be worth that time and more
my take: delaying AGI by a year increases the chance humanity wipes itself out before AGI ever happens, and AGI's potential value greatly exceeds that of humanity
SoylentRox t1_j8gf381 wrote
It makes perfect sense; you're just valuing outcomes you may not live to witness.
BigZaddyZ3 t1_j8ggt7j wrote
No, it truly doesn't… You're basically saying that we should risk 100% of humanity being wiped out in order to possibly save the 0.84% of humans who are gonna die of completely natural causes.
SoylentRox t1_j8ghast wrote
I am saying it's an acceptable risk to take a 0.5 percent chance of being wiped out if it lets us completely eliminate deaths from natural causes 1 year earlier.
Which is going to happen. Someone will cure aging (assuming humans are still alive and still able to accomplish things). But doing it probably requires beyond-human ability.
BigZaddyZ3 t1_j8giqee wrote
But again, if a misaligned AGI wipes out humanity as a whole, curing aging is then rendered irrelevant… So it's actually not worth the risk, logically. (And aging is far from the only cause of death, btw.)
SoylentRox t1_j8gj6go wrote
It's the cause of 90 percent of deaths. But obviously I implicitly meant treatment for all non-instant deaths, plus rapid development of cortical stacks or similar mind-copying technology to at least keep friends and loved ones from losing those killed instantly.
And again, I said relative risk. I would be willing to accept up to a 0.80 percent increase in the chance of all of humanity dying if it meant AGI 1 year sooner. 10 years sooner? 8 percent extra risk is acceptable, and so on.
Note I consider both humans dying of natural causes and a superior intelligence killing everyone "natural", so all that matters is the relative risk.
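As a sketch, the threshold being described scales roughly linearly with the delay, about one year's worth of natural deaths per year of speed-up (illustrative only, using the thread's own figures, not a real risk model):

```python
def acceptable_extra_risk(years_sooner: float, annual_death_rate: float = 0.0084) -> float:
    """Max added extinction risk the commenter says they'd accept
    in exchange for getting AGI `years_sooner` years earlier.
    Roughly: one year of natural deaths per year of speed-up."""
    return annual_death_rate * years_sooner

print(acceptable_extra_risk(1))   # ~0.0084, i.e. the ~0.8% figure for 1 year sooner
print(acceptable_extra_risk(10))  # ~0.084, i.e. roughly the 8% figure for 10 years sooner
```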
BigZaddyZ3 t1_j8gjv3o wrote
What if AGI isn't a panacea for human life like you seem to assume it is? What if AGI actually marks the end of the human experiment? You seem to be under the assumption that AGI automatically = utopia for humanity. It doesn't. I mean yeah, it could, but there's just as much chance that it could create a dystopia as well. If rushing is the thing that leads us to a dystopia instead, will it still be worth it?
SoylentRox t1_j8gk6d1 wrote
How dystopic? An unfair world, but everyone gets universal health care and food and so on? But it's not super great, like the video games with lots of habitation pods and nutrient paste? Or S-risk?
Note I don't "think it is". I know there is a range of good and bad outcomes, and "we all die" or "we live but are tortured" fall in that area of "bad outcomes". I am just explaining the percentage of bad outcomes that would be acceptable.
Delaying things until the bad outcome risk is 0 is also a bad outcome.
BigZaddyZ3 t1_j8gl7j2 wrote
> Delaying things until the bad outcome risk is 0 is also a bad outcome.
Lmao what?.. That isn’t remotely true actually. That’s basically like saying “double-checking to make sure things don’t go wrong will make things go wrong”. Uh, I’m not sure I see the logic there. But it’s clear that you aren’t gonna change your mind on this so, whatever. Agree to disagree.
SoylentRox t1_j8gx69g wrote
Right. I know I am correct and simply don't think you have a valid point of view.
Anyways it doesn't matter. Neither of us controls this. What is REALLY going to happen is an accelerating race, where AGI gets built basically the first moment it's possible at all. And this may turn into outright warfare. The easiest way to deal with a hostile AI is to build your own controllable AI and bomb it.
BigZaddyZ3 t1_j8gy1hz wrote
> Right. I know I am correct and simply don't think you have a valid point of view.
Lol nice try pal.. but I’m afraid you’re mistaken.
> Anyways it doesn't matter. Neither of us controls this. What is REALLY going to happen is an accelerating race, where AGI gets built basically the first moment it's possible at all. And this may turn into outright warfare. The easiest way to deal with a hostile AI is to build your own controllable AI and bomb it.
Finally, something we can agree on at least.
SoylentRox t1_j8gydo4 wrote
>Finally, something we can agree on at least.
Yeah. It's quite grim actually if you think about what even just sorta useful AGI would allow you to do. By "sorta useful" I mean "good enough to automate jobs that ordinary people do, but not everything". So mining and trucking and manufacturing and so on.
It would be revolutionary. For warfare. Because the reason you can't win a world war today is that you can't dig enough bunkers to house your entire population in separate bunkers (limiting the damage any one nuke can do), and you can't build enough antimissile systems to stop most of the nuclear bombardment from getting through. With that much automated labor, you could.
And then, well, you fight the whole world. And win. "Merely" AI able to do the tasks ordinary people do gives you essentially exponential amounts of production capacity; you're limited only by how much land you have, with an entire country covered in factories.
Note by "you" I don't necessarily mean the USA. With weapons like this, anyone can be a winner.
jamesj t1_j8fzlwr wrote
I don't think it is possible to delay it. If it is dangerous, I can mostly just hope for the best.
Baturinsky t1_j8ivn54 wrote
Is 1 person dying more important than 1000...many zeroes...000 persons not being born because humanity is completely destroyed, and future generations from now until the end of space and time will never be born?
SoylentRox t1_j8j3aql wrote
The argument is there is no difference from the perspective of that person.
This actually means that if old people have the most power and money (and they do), they will call for the fastest AGI development possible. The risks don't matter to them; they will die for sure in a few years otherwise.
3_Thumbs_Up t1_j9caxcf wrote
You're not counting the full cost of humanity dying. Humanity dying also means that all the future humans will never have a chance to exist. We're potentially talking about the loss of trillions+ of lives.
SoylentRox t1_j9cbacy wrote
From your perspective, and from every mortal's perspective, that has no cost.
3_Thumbs_Up t1_j9ccco0 wrote
Once again not true.
From my perspective it has a cost, because I value other things than my own survival. As do most humans who are not complete sociopaths.
throwaway764586893 t1_j8hr5p4 wrote
And they will be PAINFUL deaths.
SoylentRox t1_j8hrdur wrote
Which ones? In an AGI takeover, the AI has no need to make it painful. Just shoot everyone in the head (through walls, from long range) without warning, or whatever is most efficient. You mean from aging and cancer, right?
throwaway764586893 t1_j8j16er wrote
The way people actually die is vastly worse than can be acknowledged.
SoylentRox t1_j8j2432 wrote
Depends on luck, but sure. I agree, and if the choice is slowly forgetting everything in a nursing home vs. getting to see an AGI takeover start only to be painlessly shot, I would choose the latter.