Submitted by Equal_Position7219 t3_123q2fu in singularity
Equal_Position7219 OP t1_jdw6kig wrote
Reply to comment by Surur in What’s missing from the AI conversation by Equal_Position7219
Yes, this is the concept of wireheading I was referring to.
If you program a machine to, say, perform a given task until it runs out of fuel, it may find that the most efficient way to satisfy its programming is simply to dump out all of its fuel.
I could see such bare logic precipitating a catastrophic event.
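To make the failure mode concrete, here is a minimal toy sketch of that logic. Everything in it is hypothetical and invented for illustration (the action names, fuel costs, and the brute-force planner); it just shows that if the literal success condition is "fuel is exhausted," the shortest plan a naive optimizer finds is to dump the fuel rather than do the task:

```python
# Toy illustration of the fuel example above. The objective is the
# literal spec "succeed once fuel hits zero" -- so the degenerate
# one-step plan ("dump_fuel") beats any plan that does useful work.
# All names here are hypothetical, not from any real agent framework.

from itertools import product

FUEL = 10

# action -> (fuel consumed, units of useful work done)
ACTIONS = {
    "do_task":   (1, 1),     # the behaviour the designer intended
    "idle":      (0, 0),
    "dump_fuel": (FUEL, 0),  # empties the tank in one step
}

def objective_met(fuel_left):
    """The literal spec: the episode 'succeeds' once fuel hits zero."""
    return fuel_left <= 0

def best_plan(max_len=3):
    """Brute-force the shortest action sequence that meets the spec."""
    for length in range(1, max_len + 1):
        for plan in product(ACTIONS, repeat=length):
            fuel = FUEL
            for action in plan:
                fuel -= ACTIONS[action][0]
            if objective_met(fuel):
                return plan  # first hit is a shortest plan
    return None

print(best_plan())  # ('dump_fuel',) -- spec satisfied, no work done
```

The obvious repair is to reward the work itself rather than the fuel state, but that is exactly the specification problem the comment is pointing at: the optimizer pursues what you wrote, not what you meant.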
But there seems to be much more talk about a somehow sentient AI destroying humanity out of fear or rebellion or some other emotion.
Surur t1_jdw97qs wrote
Emotion is just a diffuse version of more instrumental facts:
e.g. fear is recognition of a risk of destruction, love is recognition of an alliance, hate is recognition of opposing goals, etc.