turnip_burrito t1_iuv0eym wrote

Humans all have different ideas about how life should be lived, and an ASI would recognize this. Assuming it is human-aligned, I think the proper route for an ASI would be to let every individual choose which society of like-minded people they want to live in:

Want to live in a non-automated society with X politics? This land or planet will be where you live, free from automation.

Late 20th century culture and technology? Over there. You'll die around age 70 without the new anti-aging treatments, but it's your choice.

Want to live in a VR world? Here you go. Let the ASI know whenever you want out.

Want to become luxury gay space communists whose material prosperity increases every year, powered by an ASI-managed Dyson sphere? This way.

Want to live without technology and with no government, off the grid? Here's a place you can live. Send a signal when you get tired of living like a caveman, or don't; it's your call.

Want to move to a different society because the one you're in right now doesn't fit or is abusive? Ask the AI and it will help you migrate to a different society.

Each society should be easy to migrate to and from, but protected from the others. Want to nuke a different society, or release a supervirus? The ASI will quietly prevent it as best it can, minimizing violence and other interference in the process. There have to be some rules like this, and the ASI can work them out by weighing human preferences.

The amount of interference should be minimal to allow a lot of human freedom and liberty (likely even more than anyone alive has now) while still ensuring protection (also more than anyone has now).

It would do this without forcing everyone to live the same way.

Then the multitude of human preferences can be accommodated. Humanity can continue to explore and live out the future of its choosing, with minimal infringement on its freedoms.

12

h20ohno t1_iuv3a3x wrote

An idea I had is some sort of contract system you sign with an ASI, in which you agree to rules and limits before moving to a different region. For instance, you could specify that you aren't allowed to exit a VR sim until two years have passed inside the world (or until some condition is triggered), or something more abstract such as "If I end up in a hedonistic cycle where I stop doing productive things, please intervene."

And in these contracts you would also have to sign off on a number of laws that the governing ASI brings to the table: "No killing or torturing conscious beings" or "If you create a conscious being, it is immediately subject to all human rights and can leave the simulation whenever it wishes."
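
To make it concrete, here's a rough sketch of what such a contract could look like as a data structure. Everything here (`Contract`, `ExitCondition`, the field names) is invented for illustration; it's just one way the pieces could fit together:

```python
# Rough sketch of a contract format; every name here is invented
# for illustration, nothing is a real API.
from dataclasses import dataclass, field


@dataclass
class ExitCondition:
    """A condition under which the signer may leave the region early."""
    description: str

    def is_met(self, world_state: dict) -> bool:
        raise NotImplementedError


@dataclass
class ElapsedTimeCondition(ExitCondition):
    """Met once enough in-world time has passed."""
    min_in_world_years: float = 2.0

    def is_met(self, world_state: dict) -> bool:
        return world_state.get("in_world_years", 0.0) >= self.min_in_world_years


@dataclass
class Contract:
    signer: str
    destination: str                                      # region or sim to enter
    exit_conditions: list = field(default_factory=list)   # signer may leave when any is met
    personal_clauses: list = field(default_factory=list)  # free-form, interpreted by the ASI
    mandatory_laws: list = field(default_factory=list)    # the ASI's non-negotiable terms

    def may_exit(self, world_state: dict) -> bool:
        return any(c.is_met(world_state) for c in self.exit_conditions)


contract = Contract(
    signer="h20ohno",
    destination="vr_sim_042",
    exit_conditions=[ElapsedTimeCondition("two in-world years", min_in_world_years=2.0)],
    personal_clauses=["If I end up in a hedonistic cycle where I stop doing "
                      "productive things, please intervene."],
    mandatory_laws=["No killing or torturing conscious beings.",
                    "Created conscious beings hold full human rights and may "
                    "leave the simulation whenever they wish."],
)
print(contract.may_exit({"in_world_years": 2.5}))  # True
```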

Any thoughts on a system like this?

3

turnip_burrito t1_iuv45tj wrote

I agree with this contract idea. It is a good proposal to protect yourself and others from your own actions. Very sensible.

If we ever reach a point where we know how to artificially create conscious beings, then we should (as you've pointed out) have a set of rules to prevent abuse. To add something new to the discussion: there is also the possibility of material or energy shortages (and thus a lower quality of life for you, others, or the new beings) if too many conscious beings are allowed to exist at one time, so the creation of new beings would need to be regulated somehow.
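
To make the regulation point concrete: the crudest version is just admission control against a resource budget. A toy sketch, with every quantity and name invented:

```python
# Toy admission control for creating new conscious beings against a
# shared resource budget. All quantities and names are invented.

def may_create_being(current_population: int,
                     per_being_cost: float,
                     total_budget: float,
                     reserve_fraction: float = 0.2) -> bool:
    """Allow a new being only if it fits within the budget, holding back
    a reserve so existing beings' quality of life isn't squeezed."""
    usable = total_budget * (1.0 - reserve_fraction)
    projected = (current_population + 1) * per_being_cost
    return projected <= usable

# e.g. a Dyson-sphere-scale energy budget with 20% held in reserve:
print(may_create_being(current_population=10**9,
                       per_being_cost=1.0,
                       total_budget=1.5e9))   # True: ~1e9 <= 1.2e9
```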

3

swazhr t1_iuvh89c wrote

If it could manage all that, why would humans stay in the first place? It seems like some manipulation would have to be going on to convince people to live in a land with mostly X politics.

1

turnip_burrito t1_iuvnh3f wrote

No one should force people to live in a place with X politics, so yes, it's entirely possible most of those places would be almost empty or wouldn't exist at all. No manipulation would be performed to make people stay. The balance of how many resources these societies receive can be determined by the ASI as it observes and talks with people, though a radical abundance of resources would likely make this a non-issue for sustaining less technological societies.

People with a very niche ideal society would have to live with the fact that no one else wants to live there with them. If there aren't enough residents to make that niche society function as they'd prefer, the oddball would need to either integrate into whatever is available or go live in VR land with virtual residents of their favorite society. The larger the total population, though, the better the odds that even a niche society reaches critical mass.

Eventually, people would sort themselves independently, spending most of their time in whichever population clusters suit them best, without being forced into anything.
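
As a toy model of that sorting (all names and numbers invented): give each person a base affinity for each society plus a small bonus for company, and let everyone repeatedly move to their best option until nobody wants to move. Niche societies only fill up when the bonus from critical mass outweighs weak affinities elsewhere:

```python
# Toy self-sorting model: utility = base affinity + a small bonus per
# co-resident, so niche societies need critical mass. Everyone moves to
# their best option until no one wants to move. Purely illustrative.

affinity = {
    "alice": {"solarpunk_commune": 0.9, "vr_world": 0.4},
    "bob":   {"solarpunk_commune": 0.8, "vr_world": 0.5},
    "carol": {"solarpunk_commune": 0.1, "vr_world": 0.6},
}
COMPANY_BONUS = 0.05  # extra utility per co-resident in the same society

def utility(person, society, placement):
    company = sum(1 for p, s in placement.items() if s == society and p != person)
    return affinity[person][society] + COMPANY_BONUS * company

def sort_into_societies(affinity):
    # Start everyone at their highest base-affinity society.
    placement = {p: max(scores, key=scores.get) for p, scores in affinity.items()}
    changed = True
    while changed:  # repeat until no one wants to move
        changed = False
        for person in affinity:
            best = max(affinity[person],
                       key=lambda s: utility(person, s, placement))
            if best != placement[person]:
                placement[person] = best
                changed = True
    return placement

print(sort_into_societies(affinity))
# {'alice': 'solarpunk_commune', 'bob': 'solarpunk_commune', 'carol': 'vr_world'}
```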

1