
KishCom t1_j6n6aqt wrote

Expecting consciousness to arise from a language model is like expecting a submarine to win awards for swimming.

It shows a fundamental misunderstanding of the underlying tech.

40

muzukashidesuyo t1_j6nmt40 wrote

Or perhaps we overestimate what exactly consciousness is? Are we more than the electric and chemical signals in our brains?

10

dmarchall491 t1_j6p00sx wrote

> Or perhaps we overestimate what exactly consciousness is?

Certainly, however that's not the issue here. The problem with a language model is simply that it completely lacks many fundamental aspects of consciousness, like being aware of its environment, having memory, and so on.

The language model is a static bit of code that gets some text as input and produces some output. That's all it does. It can't remember past conversations. It can't learn. It will produce the same output for the same input all the time.
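Roughly, in code: a toy sketch using the Hugging Face transformers library, where the "gpt2" model, the prompt, and the token count are just placeholders. With greedy decoding the model is a pure function of its input:

```python
# A minimal sketch: a language model as a stateless, deterministic function.
# Assumes the Hugging Face `transformers` library; "gpt2" is a placeholder model.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def reply(prompt: str) -> str:
    # No hidden state survives between calls: everything the model "knows"
    # about the conversation has to be packed into `prompt` itself.
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy decoding (do_sample=False): same input -> same output, every time.
    output = model.generate(**inputs, max_new_tokens=20, do_sample=False)
    return tokenizer.decode(output[0], skip_special_tokens=True)

print(reply("Hello, who are you?"))
print(reply("Hello, who are you?"))  # identical to the first call
```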

That doesn't mean that it couldn't be extended to have something we might call consciousness, but as is, there are just way too many important bits missing.

13

AUFunmacy OP t1_j6nqi4t wrote

As I'm studying neuroscience in medical school, I feel I'm semi-qualified to answer this.

I don't think we are any more than the electric and chemical signals in our brains, simply because there isn't anything else that we can point at yet. The fundamental fact is that all human processes (what you could call the entirety of human physiology) act via communication between neurons in the nervous system, which is pretty well understood.

You would be dead the very moment (one Planck time) after your neurons stopped conducting, because at that point everything stops, literally everything.

10

littlebitsofspider t1_j6nukkl wrote

The roboticist Pentti Haikonen has put forth the idea that natural (and, by extension, artificial) consciousness hinges on qualia, and that we won't develop artificial consciousness until we can implement qualia-centric hardware of sufficient complexity. Considering that human wetware functions on a similar premise, i.e. that our conscious existence depends on inter-neural communication that is independent of objectivity, do you think this theory holds water?

3

JustAPerspective t1_j6paw7w wrote

>I don't think we are any more than the electric and chemical signals in our brains, simply because there isn't anything else that we can point at yet.

Pragmatic.

The limitation of the practice is that it presumes anything humans haven't discovered yet isn't relevant... while simultaneously refusing to allow for what people haven't learned.

Yet science is merely observation of what is - any incomplete observation will be suspect in its conclusions due to the variables not yet grasped.

That the atoms comprising your system shift by 98% annually indicates that - at some level - what makes up "you" is not physical.

Which leaves a lot of room for learning.

1

AUFunmacy OP t1_j6peiqq wrote

I’m so confused. Do you know what “pragmatic” means? Because it just seems like you compliment my way of thinking and then say that I am ignorant, as are the rest of the people who study neuroscience and, god forbid, choose to believe it.

No idea what you mean by atoms shifting 98%; that's just complete nonsense you wrote to make yourself seem more credible. At least give context to the things you say, or provide some evidence? Either would be great.

1

SkipX t1_j6npg5t wrote

It's an interesting misunderstanding, isn't it, but a natural one in a way. To know, or rather experience, that one is conscious oneself, and then to infer that creatures similar to oneself must have that same property, feels just right, even logical. But the fact that there is no scientific way to quantify that observation makes consciousness quite naturally seem a rather mythical property.

4

tkuiper t1_j6o3tsj wrote

It's why I think panpsychism is right. There's no clear delineation for when subjective experience emerged, and I definitely am conscious; therefore so is everything else. I think the part everyone gets hung up on is human-like consciousness: the scope of experience for inanimate objects is smaller to the point of being nigh unrecognizable to a conscious human. But you do know what it's like to be inanimate: the timeless, thoughtless void of undreaming sleep or death. We experience a wide variety of forms of consciousness with drugs, sleep deprivation, etc., and that's likely a small sliver of the possible forms.

6

Schopenschluter t1_j6oqfwf wrote

> timeless, thoughtless void

I would argue that time is absolutely essential to anything we call experience and consciousness—these only take place in time. Dreamless sleep is neither experience nor consciousness, but really the absence thereof. We don’t really know what it’s like to be in this “inanimate” state because we always reconstruct it after the fact through metaphors and negations (timeless, thoughtless, dreamless).

In other words, I don’t think this is evidence for panpsychism, but rather demonstrates that human consciousness shuts down completely at times. So saying that it is akin to the consciousness of, say, a stone would be to say that a stone doesn’t have consciousness at all.

2

tkuiper t1_j6otjpd wrote

But I would also say we experience middling states between dreamless sleep and full consciousness. Dreams, partial lucidity, and heavy inebriation all involve fragmented, shortened, or discontinuous senses of time. In those states my consciousness is definitely less complete, but still present. Unconsciousness represents the lower limit of the scale, but is not conceptually separate from the scale.

What I derive from this is that anything can be considered conscious, so the magnitude is what we really need to consider. AI is already conscious, but so are ants. We don't give much weight to the consciousness of ants because it's at a very dim level. A consciousness like a computer's, for example, has no sense of displeasure at all. It's conscious, but not in a way that invites moral concern, which I think is what we're getting at: when do we need to extend moral consideration to AI? If we keep AI emotionally inert, we don't need to, regardless of how intelligent it becomes. We will also have a hard time grasping its values, which is an entirely different type of hazard.

2

Schopenschluter t1_j6ozacy wrote

I totally agree about middling and “dim” states of consciousness but I don’t agree that experience or consciousness takes place at the lowest limit of the scale, where there would be zero temporality or awareness thereof.

In this sense, I think of the “scale” of consciousness more like a dimmable light switch: you can bring it very very close to the bottom and still have some light, but when you finally push it all the way down, the light goes out.

Are computers aware (however dimly) of their processing happening in time, or does it just happen? That, to me, is the fundamental question.

1

thegooddoctorben t1_j6pd651 wrote

>Are we more than the electric and chemical signals in our brains?

Yes: speaking loosely, we have organic bodies with highly sensitive nerves and hormonal pathways. Those are the basis of emotion and sensation. That's the foundation of consciousness or awareness.

An AI without our organic pathways is categorically different. That's what makes it artificial.

At some point, if we combine an AI with organic sensitivity, we will be creating intelligence itself, not artificial intelligence. So we can't ever create AI with consciousness, but we could artificially create consciousness.

3

SkipX t1_j6nodub wrote

I think your answer fundamentally misunderstands consciousness, though that's an understandable mistake to make.

I do not believe that there is any real evidence of what consciousness actually is, or whether anything even has it (outside of yourself, but that is a different problem). To then claim you know what can NOT have consciousness is pretty naïve.

0

TheRealBeaker420 t1_j6nq6to wrote

I don't think we lack evidence for what it is, so much as we simply use the term to encapsulate a great many concepts. The lack of an agreed-upon definition is more a failing of language than a result of the mind being "mythical", as you said.

2

SkipX t1_j6nzc9c wrote

Well then, what evidence is there that cannot be adequately explained without consciousness?

Also, just to make it clearer: I do not claim it to BE mythical, just that it can easily seem mythical.

2

TheRealBeaker420 t1_j6o1rhq wrote

Sorry, I'm not sure I understand the question. I agree with the second part, though. It has a lot of attributes that make it difficult to describe, and it's something we give great importance to.

Edit: to try to address the question, I think human behavior is the best evidence. We demonstrate awareness through our actions. There are other terms we can use to describe these traits, though.

2

AUFunmacy OP t1_j6nss96 wrote

I understand the response, as I have experience in programming neural networks. You mean that the AI we have runs in software and may only superficially resemble a model of neuronal activity, while physically, at the hardware and compiler level, it is very different. In essence, though, it still represents steps of thought that navigate toward a likely solution, which is exactly what our brains do in that sense.
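To make that "steps toward a likely solution" picture concrete, here's a toy sketch of a tiny feed-forward network turning an input into a probability distribution over candidate answers. The weights are random and purely illustrative, not a model of any real brain or system:

```python
# Toy sketch: successive transformations that "navigate" an input
# toward the most likely of three candidate answers.
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 3))  # random, illustrative weights

def forward(x: np.ndarray) -> np.ndarray:
    h = np.tanh(x @ W1)                 # an intermediate "step of thought"
    logits = h @ W2                     # scores for each candidate answer
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()              # softmax: a distribution over answers

probs = forward(np.array([1.0, 0.0, -1.0, 0.5]))
print(probs, "-> most likely answer:", probs.argmax())
```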

I don't mean to say that AI will gain consciousness and suddenly be able to deviate from its programming, but somehow, just maybe, the complex neuronal activity conjures a subjective experience. It can only be explained by understanding that, when looking at a single-celled organism with no organs or possible mechanism of consciousness 3.8 billion years ago, it is easy to say that thing can't develop consciousness; and as you evolve this single cell into multicellular organisms it still seems impossible, until you see a sign of intelligent thought and think to yourself, "when the hell did that begin?" No one knows the nature of consciousness; we have to stop pretending we do.

Let it be known I think a submarine would win the olympics for swimming, and I also think you are naive to consider your consciousness anything more than a language model with some inbuilt sensory features.

−1

KishCom t1_j6nuddm wrote

> I have experience in programming neural networks

Me too!

> just maybe, the complex brain activity conjures a subjective experience

That would be lovely. Conway's Game of Life, "simple rules give rise to complexity" and all that. I don't think there's enough flexibility in the current hardware that executes GNNs to allow this, though. The kind of deviation required would be seen as various kinds of errors or problems with the model.
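For what it's worth, the rules in question really are tiny. A minimal sketch with NumPy, seeded with a "glider" (just one illustrative starting state) that travels across a wrapping grid:

```python
# Minimal sketch of Conway's Game of Life: a handful of simple rules,
# famously capable of producing open-ended complexity.
import numpy as np

def step(grid: np.ndarray) -> np.ndarray:
    # Count each cell's eight neighbours by summing shifted copies of the grid.
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    # A live cell survives with 2-3 neighbours; a dead cell is born with exactly 3.
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

grid = np.zeros((8, 8), dtype=int)
grid[1, 2] = grid[2, 3] = grid[3, 1] = grid[3, 2] = grid[3, 3] = 1  # glider
for _ in range(4):
    grid = step(grid)
print(grid)
```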

> I think a submarine would win the olympics for swimming

This is something a language model would come up with, as it makes about as much sense as inquiring about the marital status of the number 5.

> I also think you are naive to consider your consciousness anything more than a language model with some inbuilt sensory features.

I think you should meditate more, perhaps try some LSD. What Is It Like to Be a Bat anyway?

edit BTW: I hope I don't come off as arguing. I'd love to have a cup of coffee and a chat with you.

3