
vivehelpme t1_j8hiksi wrote

Yudkowsky and the lesswrong community can be described as a science-fiction cargo cult, and that's putting it nicely.

They aren't experts in or developers of ML tools. They take loosely affiliated literary themes and transplant them onto reality, then invent a long series of pointless vocabulary, long tirades, and grinding essays that circle back on themselves with ever-denser neo-philosophical content. It's a religion based on texts that most resemble zen koans, but that are interpreted as fundamentalist scripture retelling the exact sequence of future events.

I think the cargo cults would probably take offense at being compared to them.

3

bildramer t1_j8hvo8e wrote

Every single time someone criticises Yudkowsky's work, it's nothing substantive. I'm not exaggerating. It's either meta-level bulverism like this, or arguments that apply equally well to merely large machines as to intelligent ones, or deeply unimaginative people who couldn't foresee things like ChatGPT jailbreaks, or people with rosy ideas about AI being "naturally" safe that contradict behaviors we've already observed. You have to handhold them through arguments that Yudkowsky, Bostrom and others were already refuting back in the 2010s. I haven't seen any criticism anywhere I'd call even passable, let alone solid.

Even ignoring that, this doesn't land as a criticism. He didn't start from literary themes; he started from philosophical exploration. He's disappointed in academic philosophy, for good reasons, as are many other people. One prominent idea of his is "if you can fully explain something about human cognition, you should be able to write a program that does it", which is useful for getting rid of a lot of non-explanations in philosophy, psychology, and related fields. He's trying to make predictions more testable, not less. He doesn't have an exact sequence of future events, and never claimed to. Finally, most people in his alleged "cult" disagree with him and think he's cringy.

3