AGI Ruin: A List of Lethalities by Eliezer Yudkowsky -- "We need to get alignment right on the first critical try" Submitted by Unfrozen__Caveman t3_1271oun on March 30, 2023 at 10:39 PM in singularity 41 comments 19
Embarrassed_Bat6101 t1_jeftpal wrote on March 31, 2023 at 6:24 PM This guy seems like a total ass and it didn’t seem, to me anyway, that he made a good case for why AI would actually kill everyone. 1