Submitted by Yuli-Ban t3_119ynxv in singularity
Darustc4 t1_j9owvhx wrote
Reply to comment by GenoHuman in If only you knew how bad things really are by Yuli-Ban
Why? Why are you sure he is wrong/cringe/misguided?
hapliniste t1_j9p1a4p wrote
Because it doesn't seem like he knows anything about the technology, but he's preaching quasi-prophetic messages about it.
Darustc4 t1_j9p21rt wrote
IMO it is the best thing to do. Promote fear of AI so that the people realize it is dangerous and we buy some time to get alignment work in.
I am an AI safety researcher and let me tell you, it's not looking great: AI is getting stupidly powerful incredibly quickly, and we are nowhere close to getting these systems to be safe/aligned.
FOlahey t1_j9oyjos wrote
Yeah, OP is just suggesting that Singularity could hugely benefit the world (if we align our values to collaborate) with things like conquering biological death. Or it could go belly up (if we don’t align and stay competitive) and a few entities will just disrupt the economy and people will lose their livelihood without any significant gains for the masses.
I'm hoping AI paints a picture of the true nature of interaction between the self and the universe through the senses and brain chemistry. Then we can get clearer answers about reality. Conclusive answers about neuroscience, psychiatry, Autism, shamanism, and drugs. Then maybe we can all get on the same page about mental health/religion and advance the human race together.