Submitted by innovate_rye t3_11byfd3 in singularity
Darustc4 t1_ja0lp3d wrote
Results will not reflect reality, since many will vote 'yes' without knowing the real meaning (see all the posts asking what comes after the singularity).
FC4945 t1_ja2alma wrote
There's certainly no way to know what ASI will truly be like, since our monkey brains can't possibly conceive of it. Nor can we imagine what we will be like once we multiply our intelligence a billionfold. That's why it takes its name from black holes: you can't see what happens beyond the event horizon of a black hole.
innovate_rye OP t1_ja0mfa0 wrote
Yeah, I know. I'll look. I was just curious whether a large margin would say no.
iNstein t1_ja14c1c wrote
Stupid people THINK they are smart. I have already spotted two posts here with the wrong definition, and no doubt they voted yes.
innovate_rye OP t1_ja14qby wrote
My definition is an AI that can recursively and exponentially improve itself, to the point that human intelligence is no longer qualified to make decisions.
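A minimal toy sketch of that "recursively and exponentially improve itself" idea, in Python. Every constant here (starting capability, improvement rate, cycle cap) is invented purely for illustration and is not anything from the thread or a model of real AI:

```python
# Toy sketch of recursive self-improvement: each cycle, the system's gain
# compounds on its current capability, so growth is exponential.
# All numbers below are hypothetical, chosen only to show the shape of the curve.

capability = 0.5   # assumed starting capability (human level normalized to 1.0)
rate = 0.5         # assumed per-cycle self-improvement factor

for cycle in range(1, 21):
    capability *= 1 + rate   # improvement compounds on itself: exponential growth
    if capability > 1.0:     # past this point, the definition above says we can't keep up
        print(f"cycle {cycle}: {capability:.1f}x human baseline")
```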
Dyedoe t1_ja1z072 wrote
That’s not a definition of singularity but it is almost certainly what will cause singularity… I guess you’re right about some people on this sub not knowing what singularity is.
innovate_rye OP t1_ja21ysd wrote
Well, I'm not going to say "when AI is smarter than all humans ever", "when AI becomes superintelligent", or "the moment AI becomes so smart we just don't know what will happen." Those are lame answers.