Heap_Good_Firewater t1_j9drr1u wrote
Artificial general intelligence likely could not be constrained by rules if it were more intelligent than a human.
This is because we likely wouldn't understand exactly how such an advanced system functions, since it would have to be designed mostly by another AI.
A super AI probably wouldn't kill us on purpose; it would do so by disregarding our interests, just as we disregard the interests of insects when they conflict with our own.
I am just parroting talking points I have heard from experts, but they sound reasonable to me.