
DigThatData t1_j9s23ds wrote

> Isn't there a difference between the two, because the latter concerns a human trying to pursue a certain goal (maximize user engagement) and giving the AI that goal?

In the paperclip maximization parable, "maximize paperclips" is a directive a human (the paperclip manufacturer) assigns to its AGI, and the AGI consequently concludes that things like "destabilize currencies to make paperclip materials cheaper" and "convert resources necessary for human life into paperclip factories" are good ideas. So no, a human handing the goal to the AI doesn't change anything: maximizing engagement at the cost of the stability of human civilization is misaligned in exactly the same way that maximizing paperclip production is.
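
To make the dynamic concrete, here's a minimal toy sketch (the names and numbers are made up purely for illustration, not taken from the parable or any real system): an optimizer that only sees a single scalar objective will happily trade away anything that objective doesn't measure.

```python
from dataclasses import dataclass

@dataclass
class World:
    paperclips: int = 0            # the only thing the objective measures
    human_wellbeing: float = 100.0  # not in the objective, so it's "free" to spend

def objective(world: World) -> int:
    # "maximize paperclips" -- nothing about humans appears here
    return world.paperclips

def greedy_step(world: World) -> World:
    # The action that most increases the objective converts resources
    # humans need into paperclip production; the optimizer doesn't care,
    # because that cost never shows up in objective().
    return World(paperclips=world.paperclips + 10,
                 human_wellbeing=world.human_wellbeing - 1)

world = World()
for _ in range(100):
    world = greedy_step(world)

print(objective(world))        # 1000 -- the objective looks great
print(world.human_wellbeing)   # 0.0 -- everything outside the objective got spent
```

The point isn't the code, it's that "a human gave it the goal" doesn't appear anywhere the optimizer can see; only the objective does.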
