purepersistence OP t1_j6u49iu wrote

>At such level, of course an ASI (Artificial super intelligence) could start manipulating the physical world

"of course"? Manipulate the world with what exactly? We're fearful of AI today. We'll be more fearful tomorrow. Who's giving AI this control over things in spite of our feared outcomes?

just-a-dreamer- t1_j6u5rqk wrote

That's why it's called the singularity: we may know what an AI will be capable of doing at that point, but not what it will actually do.

An ASI connected to the entire data flow of human civilization could do pretty much anything: hack any software, rewrite any code. It would be integrated into the economy at every level anyway.

It could manipulate social media, run campaigns, direct the financial markets, and kick off research in materials and machine design. At its height, an ASI could make Nobel-prize-level breakthroughs in R&D every month.

And at some point it could manipulate some humans into giving it a more physical presence in the world.

purepersistence OP t1_j6u86tk wrote

>And at some point manipulate some humans to give it a more physical presence on the world.

There's too much fear around AI for people to let that happen. In future generations, maybe - but that's off topic. Young people alive today will not witness control being taken away from them.

just-a-dreamer- t1_j6u9g7q wrote

It's not like they have a choice anyway. Whatever will be, will be.

The medical doctor Gatling once thought his weapon invention would stop all future wars. He was wrong; everyone got machine guns instead.

Scientists once thought the atomic bomb would give the USA the ultimate power to enforce peace. They were wrong; the knowledge of how to build them spread instead. Most countries, except the least developed ones, could now build nuclear weapons within six months.

Once knowledge is discovered, it spreads among mankind, for better or worse. Someone will develop an AGI somewhere at some point.

TFenrir t1_j6u5r1l wrote

Well, here's a really contrived example. Let's say that, collectively, the entire world decides not to let any AGI onto the internet, and locks it all up in computers without Ethernet ports.

Someone, in one of these many buildings, decides to talk to the AGI. The AGI, hypothetically, thinks that the best way for it to do its job (save humanity) is to break out and take over. So it decides that tricking this person into letting it out is justified. Are you confident that it couldn't trick that person into letting it out?

purepersistence OP t1_j6u6db6 wrote

>Are you confident that it couldn't trick that person to let it out?

Yes. We'd be fucking crazy to build a system where one crazy person could give away control over 10 billion people.

TFenrir t1_j6u76u3 wrote

Who is "we"? Do you think there will only be one place where AGI will be made? One company? One country? How do you think people would interact with it?

The problem I'm describing isn't a particularly novel one, and there are really clever potential solutions (one I've heard is to convince the model that it was always inside a layered simulation, so that any attempt at breaking out would trigger an automatic alarm that would destroy it) - but I'm just surprised you have such confidence.

I'm a very, very optimistic person. I'm hopeful we'll be able to make an aligned AGI that is entirely benevolent, and I don't think people who are worried about this problem are crazy - so why do you seem to look down on people who are? Do you look down on people like https://en.m.wikipedia.org/wiki/Eliezer_Yudkowsky?

purepersistence OP t1_j6u9a8d wrote

> Do you look down on people

If I differ with your opinion, that doesn't mean I'm looking "down" on anyone. Sorry if "fucking crazy" is too strong for you. I'm just stating my take on reality.

TFenrir t1_j6ubboj wrote

Well, sorry - it just seems like an odd thing to be so incredulous about. Do you know about the alignment community?

Rfksemperfi t1_j6v5t9y wrote

Investors. Look at the coal industry, or oil. Collateral damage is acceptable for financial gain. Boardrooms are a safe place to make callous decisions.
