Effective-Dig8734 OP t1_iuu0zxk wrote

I don't agree that he is the primary promoter; in fact, I've never heard of him before now.

Also, what is the "change" he is referring to? The change is the singularity, and he just thinks advanced intelligence is how we will achieve it. It's not necessary: you can have a technological singularity without higher intelligence, and you can have higher intelligence without the singularity.

1

ArgentStonecutter t1_iuucx83 wrote

Sorry you’re out of touch, not my fault.

1

Effective-Dig8734 OP t1_iuud4so wrote

Do you have anything else to say?

1

ArgentStonecutter t1_iuue3l4 wrote

Did you finish reading the paper?

1

Effective-Dig8734 OP t1_iuugag7 wrote

Yeah, and can you clarify your claim? Is it that the singularity is the invention of a superhuman intelligence, that a superhuman intelligence is necessary, or something else?

Edit: because in the original comment I responded to, the poster said, "To surpass human-level intelligence you need human-level intelligence. No getting around the definitions of the singularity," implying the definition of the singularity is surpassing human-level intelligence, which (if we assume this paper is the be-all and end-all) isn't supported by this paper.

1

ArgentStonecutter t1_iuuzmah wrote

The singularity is the result of human society being under the control of a superhuman intelligence, whether biological, artificial, or mixed. This includes rapid technological advancement, and society's evolution and goals becoming literally incomprehensible to mere normal humans of today.

The singularity is a post-human era. Unless the singularity is avoided (e.g., by Drexlerian confinement, or perhaps by Egan's argument that superhuman intelligence isn't a real thing and so no singularity develops), the physical extinction of humanity is not the worst possibility. Mere human-level intelligences could be treated as simple computational devices, like traffic light sensors.

The paper spends some time on the possibility of avoiding the singularity, and it's all in terms of preventing the unlimited development of superintelligence. That is what distinguishes a singularity from mere rapid technological growth.

1