Submitted by Darustc4 t3_126lncd in singularity
Nous_AI t1_jea6plh wrote
Reply to comment by CertainMiddle2382 in Pausing AI Developments Isn't Enough. We Need to Shut it All Down by Eliezer Yudkowsky by Darustc4
If we completely disregarded ethics, I believe we would have passed the point of Singularity already. The rate at which we get there is of little importance. Consciousness is the most powerful force in the universe, and I believe we are being reckless, far more reckless than we ever were with nuclear power. You fail to see the ramifications.
CertainMiddle2382 t1_jeb6e8i wrote
We are all mortal anyway.
What is the worst-case scenario?
The Singularity starts and turns the whole universe into computronium?
If it’s just that, so be it.
Maybe it will be thankful and build a nice new universe for us afterwards…
BigZaddyZ3 t1_jebbwqs wrote
Not everyone has so little appreciation for their own life and the lives of others, luckily. If you're suicidal and wanna gamble with your own life, go for it. But don't project your death wish onto everyone else, buddy.
iakov_transhumanist t1_jebk8mu wrote
We will die of aging if no intelligence solves aging.
BigZaddyZ3 t1_jebkngn wrote
Some of us will die of aging, you mean. Also, there's no guarantee that we actually need a superintelligent AI to help us with that.