Comments
Dnuts t1_j9zx7fj wrote
This hurt my brain to watch. Definitely worth it.
Scarlet_pot2 t1_j9xxvx5 wrote
Maybe we could use AI in the future to make life that could survive in other places in the solar system. Mists of Jupiter, methane seas of Titan, dusty lands of Mars.
HurricaneHenry t1_j9zsier wrote
I can picture myself amidst the methane seas of Titan.
LevelWriting t1_ja3r02w wrote
Why are you so evil?? You would create sentient beings just so they could live in these harsh places while you live comfy on Earth?
Scarlet_pot2 t1_ja675fa wrote
If you design them well it could be a comfy place to them.
CowBelleh t1_j9xxlo7 wrote
Would be cool to somehow CRISPR-edit our brains to have beyond genius-level intelligence.
kfractal t1_ja1mfbi wrote
Being able to simulate lots of these interacting with each other and the real world is still part of the problem. But it's a step toward better understanding nonetheless.
californiarepublik t1_j9xcn7v wrote
What could go wrong?
FWIW this is literally one of Yudkowsky’s scenarios for how AI could end us.
astrologicrat t1_j9xs9ff wrote
Don't put too much stock in Yudkowsky's hallucinations. He has no understanding of biology and no unique talent for predicting the outcome of AI development. Any time he talks about the subject, it's a Rube Goldberg machine of fantasies.
The reality is that once a computer hits AGI that's sufficiently more intelligent than humans, there is basically an uncountable number of ways it can end humanity. Yudkowsky likes to bloviate about a few hand-picked bizarre examples so that people remember him when they discuss AGI.
Guess it's working.
ActuatorMaterial2846 t1_j9yauge wrote
Yeah, I think people took that comment about 'instantly killing us by releasing a poison in the atmosphere' a bit too seriously. Maybe because it was so specific, idk.
But he does have a point that we should be concerned about an autonomous entity smarter than humans in all cognitive ability. An entity that has no known desire apart from a core function to improve and adapt to its environment.
Such an entity would most certainly begin competing with us for resources. So, his emphasis on alignment is correct, and he is probably not overstating the difficulty in achieving that.
Everything else he says is a bit too doomer with little to back it up.
3deal t1_j9wwq0h wrote
And this is how humanity goes extinct.
darkness3322 t1_j9xd6oh wrote
It is how humanity will jump to the next level.
3deal t1_j9yqlu4 wrote
The first thing humans made after discovering the physics of the atom was an atomic bomb.
darkness3322 t1_ja0es7p wrote
You really can't see the potential of this technology? We're literally talking about becoming something more, something that will raise questions about whether we can still call ourselves Homo sapiens or whether we should already apply a new name to our new evolutionary state...
3deal t1_ja0hrm8 wrote
It's called eugenics, like the Nazis wanted to do.
Mortal-Region t1_j9wzepc wrote
Kurzgesagt put out a pretty good video about proteins recently. A nice, big-picture overview with great visualizations.