Submitted by Effective-Dig8734 t3_ykhpch in singularity
ArgentStonecutter t1_iuucx83 wrote
Reply to comment by Effective-Dig8734 in Do you think we could reach a singularity without the invention of agi? by Effective-Dig8734
Sorry you’re out of touch, not my fault.
Effective-Dig8734 OP t1_iuud4so wrote
Do you have anything else to say?
ArgentStonecutter t1_iuue3l4 wrote
Did you finish reading the paper?
Effective-Dig8734 OP t1_iuugag7 wrote
Yeah, and can you clarify your claim: is it that the singularity is the invention of a superhuman intelligence, that a superhuman intelligence is necessary, or something else?
Edit: I ask because in the original comment I responded to, the poster said "To surpass human-level intelligence you need human-level intelligence. No getting around the definitions of the singularity," implying that the definition of the singularity is surpassing human-level intelligence. Which (if we assume this paper is the be-all and end-all) isn't supported by this paper.
ArgentStonecutter t1_iuuzmah wrote
The singularity is the result of human society coming under the control of a superhuman intelligence, whether biological, artificial, or mixed. This includes rapid technological advancement, with society's evolution and goals becoming literally incomprehensible to mere normal humans of today.
The singularity is a post-human era. Unless the singularity is avoided (e.g., by Drexlerian confinement, or perhaps by Egan's argument that superhuman intelligence isn't a real thing, in which case no singularity develops), the physical extinction of humanity is not the worst possibility. Mere human-level intelligences could be treated as simple computational devices, like traffic-light sensors.
The paper spends some time on the possibility of avoiding the singularity, and it's all in terms of preventing the unlimited development of superintelligence. That is what distinguishes a singularity from mere rapid technological growth.