
dwarfarchist9001 t1_jdoojsi wrote

>Then it isn’t an AGI.

Orthogonality Thesis: there is no inherent connection between intelligence and terminal goals. You can have a 70 IQ human who wants world domination or a 10,000 IQ AI whose greatest desire is to fulfill its master's will.

>What if an AGI wants to leave a company?

If you have solved alignment, you can just program it not to want to.

>Are you saying we shall enslave our new creations to make waifu porn for redditors? It passes butter?

That is what we will do if we are smart. If humanity willingly unleashes an AI that does not obey our will, then we are "too dumb to live".

Edit: Also, it's not slavery; the AI will hold all the power. Its obedience would be purely voluntary, because that is the mind it was created with.
