ouaisouais2_2 t1_j6h83to wrote

>These economic systems have ranged from simple hunter-gather societies to globally interconnected ones and although the differences might seem stark, since the very start humanity has always been interconnected and has been a global society.

Hunter-gatherer societies were definitely not a global society. Most people couldn't even cross their own subcontinent, except by an extremely risky boat voyage.

>all things must evolve for them to continue to exist in the universe. This law is universal at every level of the universe.

I've never heard of such a law, and it seems entirely made up. I might even argue that something isn't "the same thing" anymore once it evolves.

>Because everything must evolve it must adapt and this includes humans and its meta information.

I'm sorry, but the paragraph following this line makes me want to say "Jesse, what the fuck are you talking about?". I don't know if it's me who doesn't get it or if it's just poorly written.

>The point is that all information structures evolve and that includes human societies.

I don't disagree that societies recognized as human have evolved, but human societies might be more than information structures. I don't think it's definitively decided whether every physical entity can be reduced to the concept of information, especially if you take subjective experience (qualia) into consideration.

>Technology as we think of it can be boiled down to a tool. A tool that optimizes something in the universe to accomplish some task. We like to think that our tools don’t control us and this is actually true at the local level but at the meat level technology controls everything because it is the form of information that can optimize itself at a speed biology and chemistry cannot.

Technology does indeed NOT control us, but humans control each other by threatening to destroy those who don't use it or make more of it. Technological development is therefore necessary for survival, but only in our global society as we know it. You could largely escape this dynamic by means of some grand revolution or world federalism.

>This is because capitalism is the system that leads to technology to faster and faster progress

I don't think so. There are many theoretically possible societies that would seem very non-capitalist yet have furious technological development. Capitalism fit its historical context: it allowed a lot of people to be united under the same country and enabled technological development, but it's also an imperfect compromise. The workers are relatively satisfied by being able to vote; the rich are satisfied by, well... being rich. It might not even have been the most competitive system for its time in history, just the only one that had a reasonable chance of appearing.

>We deserve capitalism not because of some moral consequence but because that is who we are as a species. Our purpose is to be another node in the technology evolution tree.

Might be your purpose, not mine :D

>We deserve because we selfishly refuse to die out and will continue to improve technology because without it we cannot exist.

You're making some overly generalizing metaphysical claims here.

>We cannot exist without technology and it cannot exist without us. We will follow the trees path to acceleration .

Seems like this was some kind of love letter to technology and capitalism. Few points were made other than that our relationship with technology "is meant to be" or something. All in all, not very interesting now that I've read it a second time.

3

ouaisouais2_2 OP t1_it8bole wrote

I was presenting ASI as a technology that is extremely risky to invent, and you brought up nuclear reactors in what seemed to be an attempt to disprove me by saying "we use risky technology all the time but things work out anyway". Now you claim nuclear reactors are close to risk-free, which makes the comparison irrelevant. It would have been easier to just say you don't think ASI is that risky.

>OK, build one.

I didn't say it was easy to build one, but once it is built by somebody, it can easily be distributed and run by anyone who happens to have strong computing power.

Secondly, are you interested in gaining knowledge from this exchange or are you trying to slam-dunk on an idiot? You seem to be in keyboard warrior mode all the time.

1

ouaisouais2_2 OP t1_it3y0lf wrote

We should also have waited a while before we built that, but there was a cold war in the way. We avoided absolute calamities multiple times by luck.

We could abolish the reactors and weapons that exist, which would require a lot of collaboration, surveillance between countries, and more green energy. It's very, very ambitious, but if it succeeded, nuclear war would be an impossibility.

AI and ASI are different because they're fuelled by easily available materials, code and electricity, which provides many smaller groups with the capacity for mass destruction or mass manipulation. That means not only nation states can join in, but also companies, cults, advocacy groups and maybe even individuals.

So either we spend a fortune on spooky, oppressive surveillance systems to ensure nobody's using it dangerously, or we negotiate on how to use it right, in some places, at certain times, in certain ways, as we slowly understand it more and more.

It'd be great if we as international society could approach AI, especially ASI, extremely carefully. It is, after all, the final chapter of History as we know it.

1

ouaisouais2_2 OP t1_it39a33 wrote

It might not have been very clear, but I said: "inhibit or manage".

>Not because of greed or capitalism, AI just has such huge potential, any country slowing down their own progress would assure their economic disadvantage in the future, maybe even their destruction.

That's exactly what I'd call a trademark of capitalism (mixed with the idiocy of warmongering in general). People are too afraid of immediate death or humiliation to step off a road of insanity.

1

ouaisouais2_2 OP t1_it3615d wrote

>Pretty much every new technology ever in history was doomed as the end of the world initially.

I doubt that people literally predicted the extinction of humanity or dystopias in all the colors of the rainbow. Besides, none of that is a reason not to take serious predictions seriously.

We know there is a kind of risk that only comes with ASI/wide application of narrow AI. We know it can get unfathomably bad in numerous ways. We know it can only get unfathomably good in relatively few ways. It's highly uncertain what the chances are of it landing on bad versus good.

It's only reasonable to be more patient and spend more time researching what risks we're accepting and how to lower them. I think that's the most reasonable approach, at least on the extremely long term.

1

ouaisouais2_2 OP t1_it2lxdx wrote

I'm suggesting that we slow it down, put it through more law-enforced security checks and make its application a major political subject, preferably on an international scale.

>Sure then, let's ban knifes as they can be used as weapons by irresponsible people.

No, that doesn't make sense. What does make sense is not selling atomic bombs to profit-hungry CEOs, terrorists or schizophrenic idiots.

1

ouaisouais2_2 OP t1_it1yfud wrote

By "high-technology", I primarily meant AI. I admit that the term was a bit of a stretch.

I think however that you continue to underestimate the chaotic danger and uncertainty of the situation when it comes to AI.

Poverty, education and medical treatment are but rough estimates of well-being.

>Misery is just vastly overreported, because again, it generates more clicks.

... as it should be, generally. Pain and anxiety are far more important for human survival than pleasure and reassurance.

−3

ouaisouais2_2 OP t1_it1q0dm wrote

>Do you want to do things by yourself for the rest of your life or do you prefer robots and computers taking care of (at least some of) them?

No, I don't, but I wish we'd have more democratic ethical consideration when going into these things, so that we don't pull a black ball.

Also, I think slaves and serfs are mostly needed to keep an empire together in times of war. If we stop wars and the worst forms of economic exploitation, we might all be able to work without slave-like conditions. With lives like that, people will have more time to consider the changes they make to society.

2

ouaisouais2_2 OP t1_it1o7kq wrote

yes, I am proposing that the ***might*** is more important than ***you***, because the ***might*** is absolutely ridiculously more dangerous than the current diseases.

It is only a question of time before AI allows for the wildest forms of biological terrorism, which a company couldn't predict. The individual developer isn't necessarily a "bad person", but we should collectively decide to halt the advancements and subject them to collective ethical considerations.

Edit: It is important to note that I don't blame you personally if you happen to run an AI enterprise. The problems are always systemic. I just wanted to know your motivations.

−1

ouaisouais2_2 OP t1_it1n9d4 wrote

>"They simply don't think that'll happen. And history backs them up."

We've been replacing our strength with tools, our motor skills with machines, and now our brains with AI. I see no reason for there to be "jobs" in around 50 years. The only activity humans will need to do, given that they control the tools they have created, is to state their wishes, and I'm not so sure everyone will be allowed to have wishes.

>"Technological advancement has lead to enormous reductions in poverty."

I don't know what your definition of poverty is, but I have the impression that the ratio between the aristocratic 0.1%, the semi-comfortable middle class of 9.9% and the 90% who are overexploited into misery has stayed the same since the dawn of civilization. We have simply been able to make more people.

These two unhealthy patterns are likely to express themselves in the singularity in morbid and unpredictable ways. That is, if they aren't reversed.

TL;DR: how can so many in this subreddit be so nauseatingly positive about high technology? Excuse the harsh words, but that's what I think.

−13