SkyeandJett t1_jecqspf wrote
Reply to comment by agonypants in AGI Ruin: A List of Lethalities by Eliezer Yudkowsky -- "We need to get alignment right on the first critical try" by Unfrozen__Caveman
For sure. It's extremely disappointing that Time would lend public credence to a literal cult leader. His theories are based on completely outdated ideas about how we would develop AI, and LLMs are nothing like what he describes.