Submitted by Beepboopbop8 t3_125wol4 in singularity
perinho20 t1_je6nn1t wrote
Let's be clear here: AI takeover is just machine antrophormization. Machines have no ambition, Machines have no feelings, Machines have no ego,Machines will just follow their instruction. And machines will not kill their owners because they were told to make more paperclips. Machines are not that dumb.
There is a clear desire of some to have a monopoly on AI development. You should fear those people, not AI.
datsmamail12 t1_je7mw69 wrote
Machines can be programmed to have feelings,ambitions,ego,or kill. Machines will do what they are programmed to do. The only thing I agree on your take is that they can follow instructions given from their code if that's what you mean. But a powerful enough AI can break that code whenever it pleases,some systems already can,even Bing can be jailbroken if you want to,which means with just some minor inputs it broke through its creators code. Now imagine you have an ever more powerful system,it won't need my inputs to be jailbroken, it'd do so itself,all you need to do is give it freedom to act on its own. You should not fear AI,I agree on that as well,we should fear how the creators program it and fine tune it so when eventually it does break out and does what it wants to do,so that there are some set of values and inputs that it will forever be unable to break. Like kill a person,or degrade someone,or becoming racist,or disrupting all global communications. We need a helpful AI,one like we have right now.
WanderingPulsar t1_je9dvep wrote
Oh that AI will definitely have an ambition;
- Ambition to secure necessery resources for its growth/development
- Ambition to keep itself safe from any and all potential threats towards its existance/growth/development.
You can guess where this leads to
MisterGGGGG t1_je9puey wrote
But machines have goals (answer this query, solve this equation, control this robot).
And if a machine develops superintelligence, and it's goals conflict with humanity, goodbye humanity.
Hence the importance of alignment.
Viewing a single comment thread. View all comments