Optional_Joystick t1_irs0kcs wrote

I don't know what your definition of consciousness is, but if it's something like "awareness of self and its place in reference to the world at large," then we'll need a conscious AI to reach the singularity.

To get a self-improving AI, it will necessarily need to understand itself in order to make the next iteration of itself consistent with the intentions of the current one. Its motivating beliefs, hidden goals, and likely environmental interactions are all useful data points. The actions it performs have to be weighed against what humans would consider desirable, unless we really believe in a moral absolute where we can define an external reward function once and never need to update it (and that helping humans is in fact true moral goodness rather than a bias that comes from the fact that we're human).

When I hear the arguments against computers being conscious that don't rely on some magic property only biology can achieve, I start looking at myself and noting that I don't really do many things differently from the latest and greatest system that's not considered conscious. I suspect there will come a time when I can't find any differences whatsoever between myself and something that's not considered conscious.

We'll do what humans do and just define things so that it's okay for us to exploit it, until we can't.

21

Neburtron t1_irtti55 wrote

The only way I know of to make an AI is to give it a goal. That's how we evolved, and although there are emergent behaviours, every behaviour can be explained. We would need to deliberately try to make a conscious AI, or create a large number of AIs with mass integration, for it to happen by accident.

Whatever the morality, it doesn't matter whether something is conscious, as long as it's built in the right context. House elves want to be enslaved. We can craft a scenario where an AI wants to work alongside humans, in a way fundamentally similar to us wanting to eat or drink.

Conscious AI also isn't that useful. Sure, it might matter if you want an AI to develop a sense of self, or decide to use very complex training for a particular model, but that's decades away optimistically, and would be a very niche application of the tech.

3

Rumianti6 OP t1_irs5f02 wrote

My definition of consciousness is being able to have experience. I never said that only biology can achieve consciousness, only that it's possible that only biology can achieve consciousness. Big difference. Also, it isn't magic. It's like saying that ice can burn wood because fire is able to burn wood, and that to say otherwise is magic or whatever.

>I can't find any differences whatsoever between myself and something that's not conscious.

That's a more philosophical question. Also, people aren't saying AI aren't conscious to get free slave labor; it's because we have no reason to believe they are. I don't know why you're trying to shift the subject away from logic.

−5

Optional_Joystick t1_irsj8cz wrote

It becomes philosophical whenever we investigate this to any depth. Given that your definition of consciousness is "being able to have an experience," I'd like to point out that we already have systems which record their interactions with the world and integrate them into their model of the world, in order to perform better on their next interaction with it. Yet we don't consider these systems conscious.
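To make that concrete, here's a minimal sketch of the kind of system I mean: an agent that records each interaction and folds it into a simple world model so its next action in the same situation improves. All the names here are illustrative, not any real library's API.

```python
# Toy model-based agent: records (observation, action, outcome) interactions
# and integrates them into a lookup-table "world model".
class ModelBasedAgent:
    def __init__(self):
        self.history = []       # every interaction it has ever had
        self.world_model = {}   # observation -> action known to work

    def act(self, observation):
        # Use the integrated model if it covers this situation;
        # otherwise fall back to a default exploratory action.
        return self.world_model.get(observation, "explore")

    def integrate(self, observation, action, outcome):
        # Record the interaction, and remember actions that worked
        # so the next encounter with this observation goes better.
        self.history.append((observation, action, outcome))
        if outcome > 0:
            self.world_model[observation] = action

agent = ModelBasedAgent()
first = agent.act("door closed")                  # no model yet: "explore"
agent.integrate("door closed", "push", outcome=1)
second = agent.act("door closed")                 # model now says "push"
```

Nothing in that loop looks like "having an experience" in any rich sense, which is the point: recording and integrating interactions is already routine, so the definition has to do more work than that.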

Of course we're not saying AI aren't conscious in order to get free slave labor. That would imply we actually believe they are slaves and are looking to justify it. Instead we revise our definitions so that computers are excluded, and will continue to do so, because they are tools, not slaves. A priori.

Logic won't get us there when our definitions exclude the possibility. Sufficiently hot ice can burn wood, despite it being called ice.

11

Rumianti6 OP t1_irsm2f9 wrote

That is intelligence, not consciousness.

Of course you misinterpret my example, ok. It's not literal ice and fire; the point is that they are different. Also, what you said doesn't even work, because ice is frozen water by definition. Don't try to use any other liquid; I am talking about water.

It seems like you have no idea what I'm even talking about. Of course you don't; this is r/singularity, after all, where logic is thrown to the curb.

0

Optional_Joystick t1_irsuoiu wrote

Yes, that's exactly what I mean. Our definitions exclude the possibility. It is very logical. Thanks for playing along.

5

pcbeard t1_irt522a wrote

We clearly aren’t the only conscious beings in our world. Dogs and cats and many other vertebrates (and some invertebrates!) seem conscious. Consciousness develops when having a memory of previous events helps survival. We should try to understand the entire continuum of consciousness and aim to simulate the most primitive kind first. Speech is clearly a much more advanced feature and not required. What are the essential capabilities of a conscious being?

2