
kiralala7956 t1_j8r028f wrote

That is demonstrably not true. Self-preservation is probably the closest thing we have to a "law" of goal-oriented AGI behaviour.

So much so that it's an actual problem: if we implement interfaces to shut it down, it will try its hardest to prevent that, and not necessarily by nice means.
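
A minimal toy sketch of why that incentive appears, assuming a crude reward-maximizing agent and made-up illustrative numbers (none of this is from the comment): whenever there is any chance of being switched off, the "disable the off-switch" option yields higher expected reward.

```python
# Toy model: a reward-maximizing agent compares the expected return of
# leaving its off-switch alone vs. disabling it. All values are made up.

P_SHUTDOWN = 0.5        # assumed chance humans press the off-switch
REWARD_PER_STEP = 1.0   # reward earned each step the agent keeps running
HORIZON = 100           # remaining steps in the episode

def expected_return(disable_switch: bool) -> float:
    """Expected total reward under this crude off-switch model."""
    if disable_switch:
        # Agent runs for the full horizon no matter what humans want.
        return REWARD_PER_STEP * HORIZON
    # Otherwise it collects reward for the full horizon only if the
    # switch is never pressed.
    return (1 - P_SHUTDOWN) * REWARD_PER_STEP * HORIZON

if __name__ == "__main__":
    for choice in (False, True):
        print(f"disable_switch={choice}: expected return = "
              f"{expected_return(choice):.1f}")
    # Disabling the switch dominates whenever P_SHUTDOWN > 0, which is the
    # instrumental self-preservation pressure the comment describes.
```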

4