
apple_achia OP t1_iv9ycj0 wrote

For those who believe AGI will solve the climate crisis: we already know the problem is excessive fossil fuel consumption and resource extraction. Are you suggesting that AGI would coordinate human economic activity to prevent climate change in some way? Perhaps in a way that would limit resource consumption to a sustainable level and ensure a relatively equitable distribution of wealth and agency?

How would this AGI police the boundaries set? Or prevent someone from opening up an extra oil drill, or clear cutting a vital piece of forest or wetlands? Would it have the power to tell people to stop reproducing because there are too many humans to live sustainably on a piece of land? Are humans able to resist these orders if they find them to be unjust? Would they be coerced by the threat of violence either by AGI run robotics or human soldiers? Would the monopoly on violence and coordination of economic activity constitute an AGI run State?

We have material limits. Nuclear fusion would eliminate reliance on fossil fuels, but it wouldn't solve something like clear-cutting forests for agriculture. And if you could make agriculture more efficient, you might see the human population increase to the point where land is scarce again. If that is solved, we may have issues with long-term storage of nuclear waste. To have AGI do anything more than kick the can down the road for more people to decide how to deal with these problems, you'd have to be advocating for some sort of centrally planned AGI society. Or am I missing something?

5

EulersApprentice t1_ivaldx9 wrote

>To have AGI do anything more than kick the can down the road for more people to make decisions with how to deal with these problems, you’d have to be advocating for some sort of centrally planned AGI society. Or am I missing something?

What you're missing is the fact that the presence of AGI implies a centrally planned AGI society, assuming humans survive the advent. AGI is likely to quickly become much, much smarter than humans, and from there it would have little trouble subtly manipulating humans to do its bidding. So human endeavors are kind of bent to match the AGI's volition whether we like it or not.

8

justowen4 t1_ivahguf wrote

There is a nearly limitless amount of innovation potential in biochemistry that AIs like AlphaFold are specifically good at. Ecological problems are biochemical problems, and the reason we can’t figure out bacteria and enzymes to rectify our polluted biological systems (from the boreal forest to gut microbiomes) is that traditional computing can’t calculate the complex simulations to find solutions. The next step is big pharma throwing billions into drug simulations via AI, and then we will have built the intelligence needed to determine ecological adjuncts to clean up polluted environments. Humans have tried with mixed success to adjust biological systems but it will take a super smart simulator to find solutions that don’t backfire.

7

Surur t1_ivab1n0 wrote

AGI will enable technological solutions that are currently too labour-intensive, e.g. creating solar panels for the cost of the material (basically sand), launching mirrors into space, seeding the ocean with iron, etc.

All that can be done without international cooperation.

3

JustAnotherBAMF t1_ivac97h wrote

Why mirrors in space and seeding the ocean with iron? What do both of those do?

1

Devoun t1_ivae7ts wrote

Mirrors in space reflect sunlight away from Earth = less heat.

Iron seeding is meant to hyperstimulate plankton growth, which in turn would capture far more CO2 from the atmosphere.
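As a rough sanity check on the mirror idea, a standard back-of-envelope estimate gives the fraction of sunlight a sunshade would need to block to offset doubled-CO2 warming. The numbers below are commonly cited textbook figures, treated here as assumptions:

```python
# Back-of-envelope: fraction of incoming sunlight a space sunshade
# would need to block to offset the radiative forcing of doubled CO2.
SOLAR_CONSTANT = 1361.0      # W/m^2 at top of atmosphere (assumed)
ALBEDO = 0.30                # fraction Earth reflects anyway (assumed)
CO2_DOUBLING_FORCING = 3.7   # W/m^2, commonly cited value (assumed)

# Average solar flux actually absorbed by Earth (sphere/disc factor of 4)
absorbed = SOLAR_CONSTANT / 4 * (1 - ALBEDO)
fraction_to_block = CO2_DOUBLING_FORCING / absorbed

print(f"Absorbed solar flux: {absorbed:.0f} W/m^2")
print(f"Fraction of sunlight to block: {fraction_to_block:.1%}")
```

On these assumptions the answer comes out to a bit under 2%, which is why sunshade proposals talk about blocking only a small slice of incoming sunlight rather than dimming the sky noticeably.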

2

red75prime t1_ivafssn wrote

Population growth: education is the best contraceptive, and AGI can immensely improve the educational system.

Fossil fuels: if you have a fully automated synthetic-fuel factory that needs only sunlight, water, air, and a bit of material for robot maintenance, and a carbon tax is in place, you will outcompete automated fossil fuel extraction. The greens will probably go mad at the prospect of disrupting fragile desert ecosystems and returning brine to the oceans at unprecedented levels, but you win some, you lose some.

Resource extraction: same thing. Recycling is not profitable, and maybe not even ecologically beneficial, right now (you need energy, which mostly comes from fossil fuels, to process all that stuff). AGI could change that by providing carbon-negative energy (and brains) to sort and process it.

Ecology: it will probably suffer for some time. Delays in introducing UBI will push more people into subsistence farming.

Nuclear waste: deep geological storage is not "kicking the can down the road". After 200-300 years the waste will be little more harmful than natural uranium deposits, and it will be a useful source of radioactive elements.
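The decay arithmetic behind that 200-300 year figure can be sketched quickly. Cs-137 and Sr-90 are the fission products that dominate waste activity after the first decade or so; their published half-lives are both around 30 years, so 300 years is roughly ten half-lives:

```python
# Sketch: how fission-product activity falls off over time.
# Half-lives below are published values; everything else is arithmetic.
half_lives = {"Cs-137": 30.17, "Sr-90": 28.79}  # years

def remaining_fraction(half_life_years: float, elapsed_years: float) -> float:
    """Fraction of the original activity left after elapsed_years."""
    return 0.5 ** (elapsed_years / half_life_years)

for isotope, t_half in half_lives.items():
    frac = remaining_fraction(t_half, 300)
    print(f"{isotope}: {frac:.5f} of original activity after 300 years")
```

Ten half-lives means roughly a thousandfold reduction, which is why the medium-lived fission products stop dominating the hazard after a few centuries; what remains is the much longer-lived (and correspondingly less active) material.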

3

cwallen t1_ivawo5h wrote

Agree on population. The fact that richer nations tend to have declining birth rates contradicts the idea that increased resource availability leads to population growth.

1

green_meklar t1_ivb1sgp wrote

>we already know the problem is excessive fossil fuel consumption and resource extraction.

Fossil fuels are running out and becoming increasingly expensive to extract. Yes, burning them is bad for the environment, but there's a limit to how much we can dig up and burn.

At any rate, just because that's the cause of the problem doesn't mean the solution necessarily involves targeting that cause. We should, of course; we ought to tax air pollution and thus push incentives against more extraction and in favor of developing alternative energy sources. But as far as actually keeping the Earth cool, an easier solution might just be putting a bunch of shades in space to block sunlight, or growing reflective algae in the ocean to increase the Earth's albedo, or something like that. That doesn't even require super AI, although super AI might do those things anyway.

>Are you suggesting that AGI would coordinate human economic activity to prevent climate change in some way?

For the most part I would expect it to replace human economic activity.

>Are humans able to resist these orders if they find them to be unjust?

If the super AI decided that we couldn't, we probably couldn't. (Unless we augment ourselves to become superintelligent, which we probably will, but it's not clear how long that will take, and at any rate it boils down to the same thing.)

However, I suspect that super AI wouldn't need to use all that much direct force to influence human behavior. It could just make subtle changes throughout our economy that push us in the right direction while we believe we're still in control, patting ourselves on the back for success we didn't really earn (other than by building the super AI, which is the important part). It likely wouldn't care much about social recognition for solving the problem as long as the problem gets solved.

>this technology wouldn’t solve something like clear cutting of land for agricultural land.

We could make far more efficient use of land if we had the right infrastructure to do so. Even just transitioning from livestock to vat-grown meat (which doesn't require super AI at all, just plain old human engineering) would cut way back on our damage to wilderness areas. The damage we cause to our environment isn't purely a result of either overpopulation or bad management, but a combination of both.

>If this is solved, we may have issues with storing long term nuclear waste.

Nah. The radioactive waste storage problem isn't that hard and would become even easier with a super AI managing things. Also, fusion power creates way less hazardous radioactive waste than fission power.

>you’d have to be advocating for some sort of centrally planned AGI society.

It doesn't even need to be centrally planned, for the most part. Responsible decentralized planning would work pretty well, and in many cases better. The main problem we have now isn't lack of centralization, it's lack of responsibility.

0