Comments
Science_is_Greatness t1_iy1t0uz wrote
"I am inevitable" - Thanos
AttackOnPunchMan t1_ixyma3t wrote
Well, I do desire it. Just saying.
Kolinnor t1_ixyj79u wrote
"The folks on LessWrong are interesting philosophers, but not always very rational. Transhumanists seem to think AGI and SAI are wonderful potential creations."
Wow. That's a really shitty take on LessWrong. And there's a paywall?
HeinrichTheWolf_17 t1_ixz5am7 wrote
Doesn’t matter, it’s happening whether you like it or not.
Honest_Science t1_ixymimp wrote
That is exactly the wording the gorillas used when they saw us coming.
Brangible t1_ixzjjpp wrote
It's not up to individuals. Economies of scale and the superorganisms driving it, which cannot shrink or die, are what decide it. Clearly, general and superintelligent AI is going to happen unless some very big disasters happen across the planet.
Professional-Song216 t1_iy0kdxz wrote
Human intelligence is NOT sufficient
alphabet_order_bot t1_iy0kf4l wrote
Would you look at that, all of the words in your comment are in alphabetical order.
I have checked 1,193,759,217 comments, and only 232,911 of them were in alphabetical order.
HeinrichTheWolf_17 t1_iy25q9z wrote
You are an absolute chad
rlanham1963 t1_ixzi4ai wrote
It really doesn't matter. The question is, is it feasible? If it is feasible, it will be built regardless of public opinion. Once built, you can't un-invent it.
sticky_symbols t1_iy2nswt wrote
Those doing AGI safety tend to agree. They also agree that it will likely happen anyway, based on economic and political forces.
[deleted] t1_ixybguz wrote
Not desirable, yet inevitable :(
"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." -- Ian Malcolm, Jurassic park
HeinrichTheWolf_17 t1_ixz5hih wrote
I disagree; it’s desirable. Humanity isn’t really doing a good job of running this planet so far. Humans are driven by greed, hierarchy and tribalism.
Humans hate to admit it, but we are still apes.
[deleted] t1_ixz5txq wrote
You should look up "The alignment problem".
HeinrichTheWolf_17 t1_ixz720g wrote
I worry greatly about the alignment problem; I worry that human beings will create an existential crisis in order to stay in power.
The sooner humans aren’t running the planet, the better.
sticky_symbols t1_iy2norl wrote
I mean, sure, if you're okay with a world in which everything is turned into that AI's favorite thing. That sounds like the end of every good possibility to me.
[deleted] t1_ixz7cjw wrote
You are dangerous.
HeinrichTheWolf_17 t1_ixza3re wrote
Thanks for that, I needed a good laugh today.
Take a minute to look around the world and see what humanity has done. Our governments and rich overlords treat their people like expendable cattle forced to live paycheck to paycheck. Amazon workers can barely take a piss. Police harass the homeless just for being poor. Powerful old men take swaths of young men away from their families to go die or get crippled in a pointless war (look no further than Putin). The environment is crumbling at an accelerating rate, and the U.S., Chinese and Russian governments are all adamant about using coal to save on costs, at the expense of causing another mass extinction, not just of animal life but of billions of humans living in hotter regions. People’s individual rights over their own bodies are being revoked at the hands of men with religious convictions. It’s all about controlling other people.
Agree or not with Q’s antics aboard the Enterprise, he was right about humanity being a savage child race. Humanity can surely do better, but being a reactionary isn’t how a species gets there. AGI is coming, and the best course of action is to merge with it. I believe shedding our selfish biological tendencies will serve us far better than the world we have now.
Brangible t1_ixykp5o wrote
They know all this already, and they're still racing to create our new gods. You'd need a Sarah Connor plan carried out 2000 times to end it.
arisalexis t1_ixyfsw2 wrote
But another adjective is even more important: it is inevitable.