VanceIX t1_iy6eeeq wrote

Governments are always slow to react to technological advancements; it’s just how things are. They tend to be reactive rather than proactive. Just look at the state of the internet today: governments around the world only started seriously regulating it in the last decade or so, and with mixed success.

Even then, there are already signs that the US government, at least, is leveraging AI more and more; it’s just that those applications are obscured for national-security purposes. The NSA, CIA, and FBI doubtlessly have tracking and profiling applications using AI, just like China does. The military has incorporated many AI technologies into weapon systems and fighter aircraft.

I do agree that 2030 likely won’t be that far off from 2022; I’m just saying the government’s lack of action isn’t really predictive of anything.

9

VanceIX t1_iw4wiyw wrote

The entire reason we have today’s cutting-edge computing growth and research is the competitive economy fostered by capitalism. Like it or not, it’s true. If it weren’t, and government mandates were the better system, then the USSR would be the leading bloc around the world, not the USA. The federal government can’t wave its hands, print trillions, and solve AGI.

You do you though.

1

VanceIX t1_iw4v10n wrote

You’d throw the world into economic turmoil, potentially causing millions to billions of deaths, to replace a system that will become obsolete shortly anyway?

Right now, like it or not, human labor still has value. If you abolish capitalism, you also abolish any incentive for advanced AI research. Good luck getting to AGI while paying PhD and ME researchers a pittance compared to what they earn now at Google, OpenAI, Meta, etc.

1

VanceIX t1_iw4osnv wrote

Our whole society is motivated by capitalism. What do you think is responsible for the people who work in those labs choosing to work there, getting paid, the competition to produce results, the decision to get the degrees they needed to work in the field, the taxpayer money or corporate money that funds them, etc? Do you really think this feat would have been accomplished in the USSR or pre-capitalism China?

Capitalism is not the final economic model for humanity; it’s simply the best one for achieving AGI, which will be the final economic model.

0

VanceIX t1_iw3r7v0 wrote

The world is unfair because the value of human labor is inherently unfair. Someone with a degree working as a neurosurgeon provides more economic benefit to society and is thus paid more than someone doing manual labor. It is what it is. To say this will change at any given point is bonkers; the ONLY thing that could possibly change it is the value of all human labor dropping to zero (i.e., what we will hopefully see with AI this century).

1

VanceIX t1_iw3mu8k wrote

You’ve got a terribly twisted and jaded view of reality.

I’m a hydrogeologist working with many other scientists and engineers, and if you got rid of managers tomorrow you’d tank our whole organization. They’re the people who can worry about long-term economic conditions, finances, and agency-wide collaboration while the scientists and engineers focus on their own projects.

Don’t just parrot the jaded and out-of-touch voices on Reddit. In the real world the economy is a very complex organism, and saying “capitalism bad” doesn’t accomplish anything. Capitalism is responsible for pulling more human beings out of poverty than any other economic system in history, and continues to do so in countries like India.

Capitalism isn’t perfect, not even close, but we can keep improving it until an AI-run economy is possible, hopefully not too long from now.

1

VanceIX t1_iw3ikqt wrote

Lmao, calling CEOs, recruiters, and “HR wankers” non-jobs is out of touch with reality. Those are all jobs that require serious confidence, people skills, and long-term reasoning, which is why they haven’t been automated yet.

When LLMs become capable of long-term reasoning and perfect emulation of human sociability, I have no doubt those positions can be automated as well, but it’s not as easy as shaking your fists at capitalism and demanding that those jobs cease to exist.

0

VanceIX t1_irvjecd wrote

I actually believe that empathy is the root of all the good that humans stand for. Almost all of our positive impacts on the world and to each other stem from empathy, which is a very humanistic trait. If we can instill any one human concept in AI going forward, empathy would be one hell of a start.

I truly believe that if we create a general agent with the concept of empathy at its core we’ve gone most of the way towards solving alignment.

22

VanceIX t1_iqyb9wo wrote

My thought process is that AI makes decentralization more and more possible. If we can get to the point where people can ask AI to create a movie or show or book exactly to their liking, why would people pay entertainment behemoths for anything? I could see entertainment companies pivoting to leasing rights to their owned universes (like Star Wars, Marvel, etc) but I really believe AI will decimate their revenue sources.

2

VanceIX t1_iqxlsuv wrote

We're getting to these crazy fast speeds on modern hardware with some extreme optimization over just a few months. Can you imagine what the state of the industry will be like with another five years of optimization and hardware/node improvements?

Major entertainment companies have to be sweating right now. Buckle up, we're seeing exponential progress in front of our very eyes.

44