Submitted by Effective-Dig8734 t3_ykhpch in singularity
Kaarssteun t1_iuta63g wrote
To surpass human-level intelligence you need human-level intelligence. No getting around the definitions of the singularity
Effective-Dig8734 OP t1_iutaf08 wrote
I’ll clarify then: by singularity I’m talking about the point at which technological progress becomes uncontrollable and incalculable, resulting in massive changes to human society
ArgentStonecutter t1_iuti59y wrote
That already happened.
Effective-Dig8734 OP t1_iuti7gz wrote
Not quite
ArgentStonecutter t1_iutihni wrote
It’s happened multiple times starting with the Neolithic revolution and farming.
Effective-Dig8734 OP t1_iutitcr wrote
Not true. Those are examples of rapid technological progress (bar the Neolithic revolution), but not quite on the level of the singularity
ArgentStonecutter t1_iutl70u wrote
They produced a society unimaginable to the people who lived before. This is a commonplace phenomenon. It's just an ordinary technological revolution. There are people alive today who have lived through one.
The singularity is not just more future shock. It's a change of state to one where the minds behind society are fundamentally different, more powerful, than mortal man.
Effective-Dig8734 OP t1_iutlzxy wrote
That’s not true. It’s likely that’s what it will take to reach a singularity, but it is not the singularity itself. We could increase human intelligence 100x, but if the rate of technological progress doesn’t change, then we are not in a singularity. A singularity is like a technological revolution, just on steroids
ArgentStonecutter t1_iutqr81 wrote
The singularity is not just a faster technological revolution.
It's a change in the minds that drive evolution.
https://edoras.sdsu.edu/~vinge/misc/singularity.html
If the minds that run society are no different from ours it may lead places better (for us) than a singularity, but it will still be an imaginable society.
Those minds may be biological brains, digital ones, or a mix of the two. But the distinguishing characteristic of the singularity is that they are not merely human. And the resulting society will not be one that humans can understand.
Effective-Dig8734 OP t1_iutv2jf wrote
You’re just wrong, my guy. You found the one source on the entirety of Google that agrees with you and are trying to act like it’s the final arbiter of truth on the matter. We usually pair the singularity with an increase in intelligence because that’s the most realistic route to the singularity, but it is not the singularity itself. My description is more similar to how von Neumann described it, how Kurzweil describes it, how pretty much anyone describes it. It’s called the technological singularity, not the intelligence singularity
Edit: even then after reading the first few paragraphs it seems like the author agrees with me more
“I argue in this paper that we are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence.”
ArgentStonecutter t1_iutyjvd wrote
This isn't just a random source; it's the primary source on the singularity in the modern sense. Vinge has been the primary promoter of the concept since the seventies, well before Kurzweil shifted from building computer companies to writing speculative fiction.
And the sentence you quoted doesn't say what you claim.
"The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence."
"Creation of entities with greater than human intelligence" is the important point.
Effective-Dig8734 OP t1_iuu0zxk wrote
I don’t agree that he is the primary promoter; in fact, I’d never heard of him before now.
Also, what is the “change” he is referring to? The change is the singularity, and he just thinks advanced intelligence is how we will achieve it. It’s not necessary: you can have a technological singularity without higher intelligence, and you can have higher intelligence without the singularity
ArgentStonecutter t1_iuucx83 wrote
Sorry you’re out of touch, not my fault.
Effective-Dig8734 OP t1_iuud4so wrote
Do you have anything else to say?
ArgentStonecutter t1_iuue3l4 wrote
Did you finish reading the paper?
Effective-Dig8734 OP t1_iuugag7 wrote
Yeah, and can you clarify your claim? Is it that the singularity is the invention of a superhuman intelligence, that a superhuman intelligence is necessary, or what?
Edit: because in the original comment I responded to, the poster said “To surpass human-level intelligence you need human-level intelligence. No getting around the definitions of the singularity,” implying the definition of the singularity is surpassing human-level intelligence. Which (if we assume this paper is the be-all and end-all) isn’t supported by this paper.
ArgentStonecutter t1_iuuzmah wrote
The singularity is the result of human society being under the control of a superhuman intelligence, whether biological, artificial, or mixed. This includes rapid technological advancement and the society's evolution and goals being literally incomprehensible to mere normal humans of today.
The singularity is a post-human era. Unless the singularity is avoided (e.g., by Drexlerian confinement, or perhaps by Egan's argument that superhuman intelligence isn't a real thing and as such no singularity develops), physical extinction of humanity is not the worst possibility. Mere human-level intelligences could be treated as simple computational devices, like traffic light sensors.
The paper spends some time on the possibility of avoiding the singularity, and it's all in terms of preventing the unlimited development of superintelligence. That is what distinguishes a singularity from mere rapid technological growth.
Quealdlor t1_iuvt07q wrote
We certainly could achieve the Singularity with human augmentation, amplifying human intelligence directly.
Striking_Exchange659 t1_iutcj75 wrote
What about human-level swarm intelligence?