astrologicrat t1_j9xs9ff wrote
Reply to comment by californiarepublik in We are in the early days of AI used as tool for biological design. It’s potential to design new proteins + DNA sequences from the building blocks of life is astonishing. by MichaelTen
Don't put too much stock in Yudkowsky's hallucinations. He has no understanding of biology and no unique talent for predicting the outcome of AI development. Any time he talks about the subject, it's a Rube Goldberg machine of fantasies.
The reality is that once a computer reaches AGI sufficiently more intelligent than humans, there are essentially countless ways it could end humanity. Yudkowsky likes to bloviate about a few hand-picked bizarre examples so that people remember him when they discuss AGI.
Guess it's working.