Submitted by [deleted] t3_116ehts in singularity
I think hooking a human up to an exocortex and creating a new superhuman intelligence carries a far higher risk of producing a hostile and dangerous entity than simply building a regular AI. It is incredibly hard to predict how an individual's identity and desires will manifest when his or her intelligence is amplified a thousandfold. To make an analogy, it's like trying to predict how an adult will behave by looking at his or her personality as a toddler. If truly terrifying super-minds ever come to exist, I am almost certain they will be mostly of human origin.
A good rule of thumb for any post-singularity civilization would be to be very careful about letting any mind that has resulted from human augmentation control significant infrastructure around beings of lesser intelligence. That role should be strictly reserved for ego-less AIs with a negligible chance of going rogue.
joseph_dewey t1_j966dm9 wrote
This is a very good point, and I've never heard people discuss this before.
Basically, human intelligence augmentation will let anyone who wants to become a supervillain do so.