Submitted by yottawa t3_127ojcy in singularity
sillprutt t1_jefss3x wrote
Reply to comment by brown2green in Sam Altman's tweet about the pause letter and alignment by yottawa
Whose values are more important, yours or SV's? Who decides which humans' values are the best to align towards?
Are they my values? What if my values are detrimental to everyone else's wellbeing?
There is no way we can make everyone happy. Do we try to make as many people as possible happy? When is it justified to align an AI to the detriment of some? At what %?
AsthmaBeyondBorders t1_jegx580 wrote
About 1% of the general population are psychopaths; among corporate C-suite executives, it's about 12%. It's their values that take priority as of today.