Submitted by CookiesDeathCookies t3_z15xdt in singularity
World_May_Wobble t1_ixb9xt0 wrote
Reply to comment by Falkusa in How much time until it happens? by CookiesDeathCookies
It is anthropocentric, which might even be warranted. For example, if the AGI that takes off ends up being an emulated human mind, human psychology is totally relevant.
It really all depends on the contingencies of how the engineers navigate the practically infinite space of possible minds. It won't be a blank slate; it'll have some innate character. The mind we pull out of the urn will depend on the engineering decisions smart people make. If they want a more human mind, they can probably get something that, if nothing else, acts human. But for purely economic reasons, they'll probably want the thing to be decidedly inhuman.