Submitted by Effective-Dig8734 t3_ykhpch in singularity
[removed]
Yes. You can achieve a singularity through nanotechnology or genetic engineering; there are probably other ways as well.
Using those to amplify human intelligence?
Correct.
To surpass human-level intelligence you need human-level intelligence. There's no getting around the definition of the singularity.
I'll clarify, then: by singularity I mean the point at which technological progress becomes uncontrollable and incalculable, resulting in massive changes to human society.
That already happened.
Not quite
It’s happened multiple times starting with the Neolithic revolution and farming.
Not true. Those are examples of rapid technological progress (barring the Neolithic revolution), but not quite on the level of the singularity.
They produced a society unimaginable to the people who lived before. This is a commonplace phenomenon. It's just ordinary technological revolution. There are people alive today who have lived through one.
The singularity is not just more future shock. It's a change of state to one where the minds behind society are fundamentally different, more powerful, than mortal man.
That's not true. It's likely that's what it will take to reach a singularity, but it is not the singularity itself. We could increase human intelligence 100x, but if the rate of technological progress doesn't change, then we are not in a singularity. A singularity is like a technological revolution, just on steroids.
The singularity is not just a faster technological revolution.
It's a change in the minds that drive evolution.
https://edoras.sdsu.edu/~vinge/misc/singularity.html
If the minds that run society are no different from ours, it may lead somewhere better (for us) than a singularity, but it will still be an imaginable society.
Those minds may be biological brains, digital ones, or a mix of the two. But the distinguishing characteristic of the singularity is that they are not merely human. And the resulting society will not be one that humans can understand.
You're just wrong, my guy. You found the one source on the entirety of Google that agrees with you and are trying to act like it's the final arbiter of truth on the matter. We usually pair the singularity with an increase in intelligence because that's the most realistic route to the singularity; however, it is not the singularity itself. My description is more similar to how von Neumann described it, how Kurzweil describes it, how pretty much anyone describes it. It's called the technological singularity, not the intelligence singularity.
Edit: even then, after reading the first few paragraphs, it seems like the author agrees with me more.
“ I argue in this paper that we are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence.”
This isn't just a random source; it is the primary source on the singularity in the modern sense. Vinge has been the primary promoter of the concept since the seventies, well before Kurzweil shifted from creating computer companies to writing speculative fiction.
And the sentence you quoted doesn't say what you claim.
"The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence."
"Creation of entities with greater than human intelligence" is the important point.
I don't agree that he is the primary promoter; in fact, I've never heard of him before now.
Also, what is the “change” he is referring to? The change is the singularity, and he just thinks advanced intelligence is how we will achieve it. It's not necessary: you can have a technological singularity without higher intelligence, and you can have higher intelligence without the singularity.
Sorry you’re out of touch, not my fault.
Do you have anything else to say?
Did you finish reading the paper?
Yeah, and can you clarify your claim: is it that the singularity is the invention of a superhuman intelligence, that a superhuman intelligence is necessary, or what?
Edit: in the original comment I responded to, the poster said, “To surpass human-level intelligence you need human-level intelligence. No getting around the definitions of the singularity,” implying that the definition of the singularity is surpassing human-level intelligence, which (if we assume this paper is the be-all and end-all) isn't supported by this paper.
The singularity is the result of human society being under the control of a superhuman intelligence, whether biological, artificial, or mixed. This includes rapid technological advancement and the society's evolution and goals being literally incomprehensible to mere normal humans of today.
The singularity is a post-human era. Unless the singularity is avoided (e.g., by Drexlerian confinement, or perhaps by Egan's argument that superhuman intelligence isn't a real thing and as such no singularity develops), physical extinction of humanity is not the worst possibility. Mere human-level intelligences could be treated as simple computational devices, like traffic light sensors.
The paper spends some time on the possibility of avoiding the singularity, and it's all in terms of preventing the unlimited development of superintelligence. That is what distinguishes a singularity from mere rapid technological growth.
We certainly could achieve Singularity with human augmentation, amplifying human intelligence directly.
What about human-level swarm intelligence?
I think it's possible, but at the end of the day, augmented human intelligence would be way more parts AI than human.
[deleted]
How many of these can be automated? https://www.bls.gov/emp/tables/emp-by-detailed-occupation.htm
You mean reach the singularity through narrow AI? Yeah, absolutely. If we could make a narrow AI whose only function is to iteratively maximize its own intelligence, I don't see why that wouldn't be possible in principle. An AGI would obviously be born at some point in that iterative cycle, but I don't believe humans have to make the AGI directly.
I do, however, think it's significantly more likely that humans will directly create AGI, probably with the assistance of narrow AIs rather than narrow AIs doing it solely by themselves, and that AGI will then be able to recursively self-improve and lead to the singularity.
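As a rough illustration of that iterative cycle, here's a minimal, purely hypothetical Python sketch. The names `propose_improvement` and `evaluate` are stand-ins for whatever search and scoring mechanism a real system would use to modify itself; nothing here reflects an actual AI system.

```python
import random

# Toy model of a recursive self-improvement loop (illustrative only).
# "capability" stands in for intelligence; propose_improvement() stands in
# for whatever process a real system would use to find better versions of itself.

def propose_improvement(capability: float) -> float:
    # Assume a more capable system tends to find larger improvements.
    return random.uniform(0.0, 0.1) * capability

def evaluate(old: float, new: float) -> bool:
    # Keep a change only if it actually increases capability.
    return new > old

def self_improvement_loop(capability: float = 1.0, steps: int = 50) -> float:
    for step in range(steps):
        candidate = capability + propose_improvement(capability)
        if evaluate(capability, candidate):
            capability = candidate
        print(f"step {step:2d}: capability = {capability:.3f}")
    return capability

if __name__ == "__main__":
    self_improvement_loop()
```

Because each accepted step's gain scales with the current capability, the growth compounds; that compounding is the "iterative cycle" the comment above describes.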
No, I think we need one. There are, in theory, infinitely many different ways to get there. It looks like right now we're starting by building many narrow AIs. Eventually there'll be some super-tool that is a combination of the best of them.
AGI is a tool that will spawn many narrow AIs, because having a platform where people can train a single neural network to do any task is a great way to get to the singularity: every human can be used to direct and create more AIs.
Even human-level intelligence is usually focused on a specific task. But tasks are sometimes made of many things, so generalizing is an important skill.
But what exactly is general intelligence? Do humans even have it? If by AGI you mean a human-like intelligence, then yes, we need one (or many) that is not inside a normal human, which we can enslave without any consequences, and that eventually becomes better than a human.
Of course.
Depends entirely on your definition of Singularity. I think we probably could.
To be honest, I would much prefer a Singularity caused by greatly amplified humans to one caused by ASI. That's the human perspective. I would prefer to be much smarter myself than to have an AI helper. But an AI helper is much preferable to the current situation.
Yes. Automate every job worth automating; make shelter and breakfast, lunch, and dinner free. Anyone who wants some luxuries goes into R&D. A few billion people doing R&D versus under 10 million: that should get us there.