Comments

acutelychronicpanic t1_jd7ga3k wrote

This is an unpopular opinion with all of the environmental concerns we have at the moment (which are both legitimate and serious), but with advances in technology, Earth can hold a ludicrous number of people comfortably.

If AGI were here, genetic engineering of crops would be supercharged, fusion would be fast-tracked, and truly intelligent systems would be ubiquitous.

A Thanos-inspired solution would do far more harm than the overpopulation it is supposedly addressing.

11

Gubekochi t1_jd7kl6x wrote

Plus, there's no telling how enticing an arcology built with superintelligence could be. You could pack an entire city's worth of people into one skyscraper that somehow looks good, has parks and resources inside, and has roomier quarters than our current apartments and houses.

2

[deleted] OP t1_jd7j9oh wrote

[removed]

0

Mercurionio t1_jd7jlj8 wrote

The planet doesn't care, as long as we keep the minerals on it.

Whether we die off or flood it, the planet will restore itself to its original form anyway.

2

acutelychronicpanic t1_jd7o0et wrote

I agree entirely with what you are saying. I just think that most people talking about this greatly underestimate our available resources as technology improves.

Say we get fusion.

What do carrying capacity and farmland acreage even mean when you can create tons of starch and protein in bioreactors for pennies a pound, with inputs like air, water, energy, and abundant minerals?

1

Trout_Shark t1_jd7gt7g wrote

I'm no rocket surgeon, but asking an AI to remove 75% of humanity seems like a bad idea. What if it thinks that means removing 75% of each of us? We might just end up as a bunch of heads in jars, Futurama style. I guess that would sort of be a version of the singularity.

5

Gubekochi t1_jd7kqwj wrote

"What if the super intelligence is actually stupider than ChatGPT currently is?"

​

For real? That's your concern?

2

just-a-dreamer- t1_jd7j7om wrote

You could make the majority of humans sterile. That works itself out within a few years.

1

Mercurionio t1_jd7jvfc wrote

And what's the point? We are basically developing the tech to kill ourselves. And it's happening right now.

PS: it was a rhetorical question

1

Trout_Shark t1_jd7k08m wrote

No one would survive that scenario. This thought experiment has gone really dark quickly.

1

Marshall_Lawson t1_jd7qpae wrote

What OP posted was pretty fucking dark from the beginning.

2

Trout_Shark t1_jd7rpnq wrote

Agreed. Just casually killing or sterilizing 6 billion people didn't seem like a bad idea to OP. That's pretty fucked up.

2

Marshall_Lawson t1_jd7savs wrote

reminds me of Britta from Community lol

"I can excuse murdering 3/4 of the population of humanity, but I draw the line at forced sterilization!"

2

Trout_Shark t1_jd7tf5d wrote

LOL. I'm guessing it was a young kid. They deleted the post as soon as it went south on them. AI in the hands of morons is not something I am looking forward to.

1

Cmyers1980 t1_jd7gcan wrote

This is a false dichotomy and blatantly wrong. We can help humanity thrive without mass murder or depopulation by changing the dominant system to one that serves humanity rather than a wealthy few and makes efficient use of resources. Given the massive waste and unequal distribution under our capitalist status quo we have more than enough resources to give every person a basically good life while still protecting the environment. Read Less is More by Jason Hickel for a comprehensive summary.

4

jellicenthero t1_jd7gp6z wrote

The problem isn't numbers. It's location. We need to move humans away from fertile places like rainforests.

3

just-a-dreamer- t1_jd7jlwf wrote

Why would they do that? They want a vacation home over there. And a resort. A trail for Instagram photos. A safari park or whatever.

Human desires are endless. Making more people rich just creates more desires. Scale that up to 8 billion and everybody has a vacation home in the rainforest.

2

jellicenthero t1_jd7ktcb wrote

I mean.... killing off 6 billion people would also remove those things as you wouldn't have the resource allocation to build or maintain them.

2

Marshall_Lawson t1_jd7hcov wrote

Keep this fascist bs in the burning dumpster where it belongs. The problem is not overpopulation; it's efficiently using and allocating the resources available, instead of allowing a privileged few to hoard and waste them and manipulate essentials like food, water, and housing as commodities.

3

just-a-dreamer- t1_jd7it3b wrote

That's like giving a drug addict more money. There is never gonna be "enough" because his desires can never be fulfilled.

Letting the human population breed in ever-growing numbers works at cross purposes with any reasonable long-term planning to provide a stable environment.

−2

Marshall_Lawson t1_jd7jdy4 wrote

So instead you just want to let a computer, programmed by humans, decide which billions of humans to kill, to get down to an arbitrary number that you pulled out of your ass. Okay, fucking fascist.

3

Trout_Shark t1_jd7krda wrote

I guess the AI genocidal maniac decided to delete and run.

1