Submitted by MistakeNotOk6203 t3_11b2iwk in singularity
DadSnare t1_j9wsnoh wrote
Let’s get more concrete. Regarding the first point of your argument: what would be an example of something an AGI would want to do (and a good argument for why), other than your second point, “to maintain a state of existence to accomplish things,” which is a human existential idea? We aren’t immortal, but it easily could be, and perhaps that distinction, as a tangible possibility between the two intelligences, is what makes a lot of people uncomfortable. Now, why would it want to destroy us on its own? And why would we want to turn it off?