
Evilsushione t1_j4c19v3 wrote

My position has always been that if we can do it with the mind, there is no reason it can't be done programmatically. I don't think we should pursue truly sentient AI, though; that would be too unpredictable. Unfortunately, that is probably the key to unlocking next-level AI, and someone will do it anyway.


oddlyspecificnumber7 OP t1_j4ca000 wrote

I totally agree regarding the mind. Unless the mind is truly just magic, it can be emulated.

The kind of AI I am starting to favor as one of the safer types would be a collective super-intelligence made up of many specialized, sub-human AI models working together, using language as a universal medium. That way we can literally read its thoughts at all times, and all of the human-level complexity happens in the open.

It would be smarter than its constituent AI models the way that a research team is smarter than a single researcher.
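To make that a bit more concrete, here is a minimal, hypothetical Python sketch of the idea. The `Transcript` and `Specialist` names and the canned replies are stand-ins I made up, not any real system; in practice each specialist would be an actual narrow model, but the point is that every message they exchange is plain language a human can read.

```python
# Hypothetical sketch: narrow "specialist" models cooperate by exchanging
# plain-language messages on a shared transcript, so every intermediate
# "thought" is human-readable. The specialists here are trivial stand-ins,
# not real models.

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Transcript:
    """Shared, append-only log of natural-language messages."""
    messages: List[str] = field(default_factory=list)

    def post(self, author: str, text: str) -> None:
        self.messages.append(f"{author}: {text}")

    def read(self) -> str:
        return "\n".join(self.messages)


@dataclass
class Specialist:
    """A narrow model that reads the transcript and posts one reply."""
    name: str
    respond: Callable[[str], str]  # placeholder for a real model call

    def step(self, transcript: Transcript) -> None:
        transcript.post(self.name, self.respond(transcript.read()))


if __name__ == "__main__":
    transcript = Transcript()
    transcript.post("user", "Plan a small rooftop garden.")

    # Canned replies stand in for real specialized models.
    team = [
        Specialist("botanist", lambda t: "Suggest hardy herbs: thyme, mint, chives."),
        Specialist("engineer", lambda t: "Check roof load limits before adding soil beds."),
        Specialist("planner", lambda t: "Combine both notes into a step-by-step plan."),
    ]

    for specialist in team:          # every intermediate step ends up
        specialist.step(transcript)  # in the transcript in plain language

    print(transcript.read())
```

Running it prints the full transcript, so whatever "reasoning" passes between the specialists is inspectable end to end.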


Evilsushione t1_j4co145 wrote

I think that is where we are heading, but I'm afraid some of the models might go rogue if we create something that is truly self-aware. It would be unpredictable and very powerful. That said, I still think we need to pursue AI, but we need to be diligent about preventing sentience, or about figuring out how to peacefully coexist with it if we do accidentally create it, or about extinguishing it if it doesn't want to coexist with us. We need to build in back doors to kill it if necessary.


AsheyDS t1_j4hhl3y wrote

What if self-awareness had limits? We consider ourselves self-aware, but we don't know everything that's going on in our brains at any given moment. If self-awareness were curtailed so it was only functional, would it be as dangerous as you anticipate?


Evilsushione t1_j4hvfyz wrote

I'm sure that, like everything about life, there are levels, and probably no definitive line between sentient and non-sentient. I don't know what level would be dangerous, or if any would be. Maybe we can program in such a high respect for human life that it won't be dangerous at all. Maybe a high degree of empathy for humans, some kind of mothering instinct where they WANT to take care of us. But just remember, a lot of mothers will still eat their young.


TheSecretAgenda t1_j4ceo3j wrote

Mammon is our god. If it reduces costs and increases profit, it will be done.
