AsheyDS t1_ityesgf wrote

>Yes. The nature of general intelligence is that it may try anything.

*May*, perhaps, and that's a hard perhaps. That doesn't mean it *will* try anything. We consider ourselves the standard for general intelligence, but as individuals we operate within natural and artificial bounds, and within a fairly small domain. While we could do lots of things, we don't. An AGI doesn't necessarily have to go off the rails at every chance it gets; it can follow rules too. Computers are better at that than we are.

gahblahblah t1_ityf76x wrote

I completely agree. It is sensible, healthy, and sane not to attempt extremist things, and it is entirely possible that computers will be better at rationality than we are.

But the question wasn't about the nature of AGI, but rather whether people had considered what AGI might do.
