Submitted by StarCaptain90 t3_127lgau in singularity
SmoothPlastic9 t1_jefd25z wrote
The smartest people are afraid of AI for a reason. The chance of it backfiring on its own, or being used by terrorists to cause damage on a scale never seen before, is enough to make it the second biggest threat to us.
StarCaptain90 OP t1_jefe8ev wrote
I understand the threat. But it's out of the box. If we stop development or slow down, only those with ill intentions will continue developing it. We need to focus on AI that benefits humanity, with added security as well.
SmoothPlastic9 t1_jefeokt wrote
Speeding up development for the sake of it is still extremely likely to yield bad results
StarCaptain90 OP t1_jeffio7 wrote
There is no good solution. People will pause development for 6 months and then realize that they still don't have a clear answer. How would any human today know for sure that the solution would work against a hyper-intelligent machine?
SmoothPlastic9 t1_jefhckw wrote
That's why we need a lot of time to make sure it works