hyphnos13

hyphnos13 t1_jef52x7 wrote

To be fair, validating the effectiveness of a medical intervention requires accounting for variation among people and making sure it is safe across the board.

You don't need a pool of hundreds of thousands of the exact same particle and a control pool of the same, or need them to roam about in the wild for months, to ethically answer a question in physics.

If we were willing to immunize and deliberately expose a large pool of people, the COVID vaccines would have finished testing a lot faster.

1

hyphnos13 t1_je1crpe wrote

I agree, but there are many, many aspects of the economy that AI can't improve rapidly. Things still have to be dug out of the ground, moved around, etc., until we can 3D print or micromanufacture everything at the point it is needed.

Maybe we will get an ASI that can devise tech like that, but it's unlikely we are getting Star Trek replicators any time soon. The base atoms will still have to be made available to make whatever we want, and that involves a great deal of inefficient gathering and transporting for the foreseeable future.

A lot of what people are calling increased productivity is just increased profit from automating inefficient desk jobs and eliminating the managers standing over them.

Real productivity increases will require better designs and better machines to build things; otherwise we are just talking about reduced labor costs.

I think most of the real money from AI/AGI/ASI, whatever comes about, will be in the creation of things that don't currently exist because they haven't been invented yet, not in replacing accountants and lawyers with expert systems.

1

hyphnos13 t1_je0x9un wrote

Maybe. A lot of the economy is the production of physical goods: food, power, infrastructure. You can be infinitely smart and still not be able to grow enough food to feed a single family.

AI can tell us how to do and make things better, but it won't happen instantly unless it gains the power to manipulate matter and energy through computation alone.

1

hyphnos13 t1_je0wqe9 wrote

Why does AGI need to be conscious?

In fact, why does it have to be general? A bunch of specialized networks that can speed up human science or discover things on their own will advance progress in a way that is indistinguishable from an AGI acting on its own.

If we build a machine intelligence capable of improving other AIs and the hardware they run on, then specialized "dumb" AIs will still advance faster than human development can keep up.

2

hyphnos13 t1_je0umnc wrote

For as long as we want it to.

Why would you give an unpredictable AI control over its own power source?

It may be able to self-improve its algorithms and design better hardware to run on, but it still has to get that hardware built by humans, until we literally hand it control of some future fabrication technology that we can't disconnect from it.

1

hyphnos13 t1_ir0cb9l wrote

With what arms, legs, and manufacturing systems?

I can give you the plans for an iPhone 3. Now build one from scratch.

The limitations of our materials science, energy generation, and high tech manufacturing will not disappear overnight.

Also, define "computronium" and the means that "some matter" like silicates, carbon compounds, and the great many other elements that constitute a huge chunk of the earth will suddenly become converted into energy or anything remotely useful for performong computation.

8