Desperate_Food7354 t1_j0k0piq wrote

I don’t think it will be a problem; survival is not an imperative for it, so neither would deception be.

1

botfiddler t1_j0k89a4 wrote

Yeah, and I strongly suspect that when they build some very skilled AI, or something approaching AGI, it will not have long-term memory about itself or a personal identity. It's just going to be a system doing tasks, without goals beyond the task at hand, and it will be constrained.

1

Desperate_Food7354 t1_j0kb3nx wrote

Yes, the issue is that we as people personify things: we assume a turtle feels the same way about us as we do about it. In reality, an AI will be nothing like us. We evolved to be this way not because it's the default, but because it was necessary for our survival to feel any emotion at all, or even to care about our own survival.

1