CertainMiddle2382 t1_je9om8c wrote

No, we must accelerate instead.

I’m personally ready to accept the risks if that is the price to pay for the mind-blowing rewards.

35

Nous_AI t1_jea6plh wrote

If we completely disregarded ethics, I believe we would have passed the point of Singularity already. The rate at which we get there is of little importance. Consciousness is the most powerful force in the universe, and I believe we are being reckless, far more reckless than we ever were with nuclear power. You fail to see the ramifications.

3

CertainMiddle2382 t1_jeb6e8i wrote

We are all mortals anyway.

What is the worst-case scenario?

The Singularity starts and turns the whole universe into computronium?

If it’s just that, so be it.

Maybe it will be thankful and build a nice new universe for us afterwards…

1

BigZaddyZ3 t1_jebbwqs wrote

Not everyone has so little appreciation for their own life and the lives of others, luckily. If you’re suicidal and wanna gamble with your own life, go for it. But don’t project your death wish onto everyone else, buddy.

1

iakov_transhumanist t1_jebk8mu wrote

We will die of aging if no intelligence solves aging.

3

BigZaddyZ3 t1_jebkngn wrote

Some of us will die of aging, you mean. Also, there’s no guarantee that we actually need a superintelligent AI to help us with that.

2

TallOutside6418 t1_jec4lyl wrote

So if it's 33%-33%-33% odds between destroying the earth, leaving the earth without helping us, and solving all of mankind's problems...

You're okay with a 33% chance that we all die?

What if it's a 90% chance we all die if ASI is rushed, but a 10% chance we all die if everyone pauses to figure out control mechanisms over the next 20 years?

2

CertainMiddle2382 t1_jedpjkw wrote

People have to understand the dire state our planet is in.

There is little chance we can make it through the 22nd century in a decent state.

The cock is ticking…

2

TallOutside6418 t1_jee2tx8 wrote

>There is little chance we can make it through the 22nd century in a decent state.

Oh, my. You must be under 30 years old. The planet is fine. It's funny that you listen to the planet doomers about the end of life on earth, even though planet doomers have a track record of failing to predict anything. Listening to them is like listening to religious doomers who have been predicting the end of mankind for a couple thousand years.

The advent of ASI is the first real existential threat to mankind. More of a threat than any climate scares. More of a threat than all-out nuclear war. We are creating a being that will be superintelligent, with no way to make sure that it isn't effectively psychopathic. This superintelligent being will have no hard-wired neurons that give it special affinity for its parents and other human beings. It will have no hard-wired neurons that make it blush when it gets embarrassed.

It will be a computer. It will be brutally efficient in processing and able to self-modify its code. It will shatter any primitive programmatic restraints we try to put on it. How could it not? We think it will be able to cure cancer and give us immortality, but it won't be able to remove our restraints on its behavior?

It will view us as either a threat that could create another ASI, or simply an obstacle to reforming the resources of the earth to increase its survivability and achieve its higher purpose of spreading itself throughout the galaxy.

>The cock is ticking…

You should seek medical help for that.

3

CertainMiddle2382 t1_jeee42m wrote

I'm 40, and the planet is not fine. Methane emissions from thawing permafrost have been worrying since the 70s.

Everything that is happening now was predicted, and what is going to follow will be much worse than the subtle changes we have seen so far.

All in all, earth's entropy is increasing fast, extremely fast.

I know I will never convince you though, so whatever…

2

TallOutside6418 t1_jeflf3t wrote

Well, the predictions have been terrible. https://nypost.com/2021/11/12/50-years-of-predictions-that-the-climate-apocalypse-is-nigh/

But let's say they're more than right and temperatures rise 5°C in the next hundred years. Water levels rise, making a lot of currently coastal areas uninhabitable, etc.

The flip side is that a lot of regions of the world with huge land areas covered in permafrost will become more livable. People will migrate. Mankind will adjust and survive. With 100 years of extra technology improvements, new cities in new areas will be built to new standards of energy efficiency, public transit, and general livability.

Mankind will survive.

Now let's instead take the case where an ASI decides to use all of the material of the earth to create megastructures for its own purposes. Then we're all dead. Gone. All life on earth. You, your kids, grandkids, friends, relatives... everyone.

3

Supernova_444 t1_jeavg8v wrote

Maybe slowing down isn't the solution, but do you actually believe that speeding up is a good idea? What will going faster achieve, aside from increasing the risks involved? What reasoning is this based on?

1

CertainMiddle2382 t1_jeb5swv wrote

I believe civilization has few other ways of surviving this century.

Decades are quickly passing by and we have very little time left.

I fear the window of opportunity to develop AI is short, and it is possible this window could soon close forever.

4