Submitted by purepersistence t3_10r5qu4 in singularity
just-a-dreamer- t1_j6uds0c wrote
Reply to comment by AsheyDS in Why do people think they might witness AGI taking over the world in a singularity? by purepersistence
That we don't know.
We don't know how it will be trained, by whom, or to what end. And there will be many AI models being worked on. It's called the singularity for a reason.
An AI without what we call common sense might even be worse and give us paperclips in abundance.
AsheyDS t1_j6ugs8u wrote
The paperclip thing is a very tired example of a single-minded superintelligence that is somehow also stupid. It's not meant to be a serious argument. But since your defense is to get all hand-wavy and say 'we just can't know' (despite how certain you seemed about your own statements in previous posts), I'll just say that a competently designed system used by people without ill intentions will not spontaneously develop contrarian motivations and achieve 'god-like' abilities.
just-a-dreamer- t1_j6ui3pt wrote
God-like is relative. To some animals, we must appear as gods. It's a matter of perspective.
Regardless, the way AI is trained and responds is getting closer to how we teach our own small children.
In actuality, we don't even know how human intelligence emerges in kids. As a matter of fact, we don't know what human intelligence is or how it forms.
All we know is that if you don't interact with babies, they die quickly even if they are well fed, because they need input to develop.
AsheyDS t1_j6ukhrl wrote
>In actuality, we don't even know how human intelligence emerges in kids. As a matter of fact, we don't know what human intelligence is or how it forms.
Again, you're making assumptions... We know a lot more than you think, and we certainly have a lot of theories. You and others act like neurology, psychology, cognitive science, and so on are new fields of study that we've barely touched.