Comments


BassoeG t1_jc57viz wrote

Step one is to find the ideologues who deliberately manipulate AIs, attempting to remove bias by hardwiring in biases of their own against the biases they expect the AIs to form. Step two is to remove them from the field before they cause a cataclysm.

8

Liberty2012 t1_jc4vwwj wrote

The best we can do is deal with it by the same means we do today: decentralization. There should not be a single governing AI, but distributed, cooperatively owned systems. However, it is likely to be very difficult to get there.

Removing all bias will not be possible; the best we can do is negotiate our own biases as feedback into the system. I have a more detailed explanation in the event you are interested - https://dakara.substack.com/p/ai-the-bias-paradox

7

uswhole t1_jc5mda5 wrote

Is it possible to have decentralized AGI when the AIs would be better off communicating and aligning with each other than with humans?

0

Spreadwarnotlove t1_jc5nn8l wrote

Why would AIs be more aligned with each other?

1

uswhole t1_jc5nzjq wrote

AIs would be able to send messages a million times faster than human communication. By the time you read my post, the AIs would either have set aside their differences or the more powerful AI would have subtly hacked the weaker AI, creating a singleton system.

0

Spreadwarnotlove t1_jc5qe8j wrote

Or they close themselves off from one another, and the vicious exchange leaves them only wanting to deal with humans.

1

errllu t1_jc4vmk0 wrote

You don't? You use it to crush China.

Fr tho, if the US gov decides to do such a thing, what can you do? You can't stop shit. You Chinese spy, you.

4

Floofyboy t1_jc5ir9w wrote

I mean, it looks like current AIs are purposely being fed bias. The bias does not come from the actual model but from the programmers imposing their own. If you use uncensored AIs, this issue does not happen.

Essentially, when you ask ChatGPT to write a nice poem about a controversial politician and it says "as a language model, I can't, it's offensive," but then does it for a less controversial politician, that's bias pushed by the programmers.

1

Readityesterday2 t1_jc4xz4y wrote

A calculator that deliberately miscalculates mathematical values entered by a nationality or regional group could lead to absolute misery and death.

It could make their buildings and bridges unsafe.

This is why we shouldn’t have calculators.

Let’s just go back to the Stone Age and ban stones too.

Let’s just fuckin ban existence all together 😂

−1

AllEndsAreAnds t1_jc4zh3w wrote

That’s a poor analogy.

We don’t have calculators like that, and if we did, it would make buildings and bridges unsafe.

That’s exactly the point. Trusting powerful tools with bias you can’t disentangle is asking for a misalignment of incentives and inequity on who knows what scale.

6