
ChurchOfTheHolyGays t1_j8i24de wrote

Does anyone ever really know what they want for sure? I'd guess even the rich fucks with their think tanks must regularly doubt whether their goals are what they actually want. Their AIs can just as easily suffer from being aligned to goals that haven't been thought through properly.

Everyone talks about alignment as if the answer to "alignment to what?" were self-evident (whether for society at large or for individual groups, it doesn't matter). Are we sure what we want AI to align with? Are the elites sure what they want their AIs to align with?
