
myusernamehere1 t1_j34yb7o wrote

Oh, I'm not arguing that ChatGPT is conscious, I just don't think you have arrived at any meaningful reasons as to why it couldn't be conscious. Who's to say that an "input of tokenized vectors of numbers that represent tokenized text" is unable to result in consciousness? Again, I do not think ChatGPT is necessarily advanced enough to be considered sapient/sentient/conscious.

8
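The "tokenized vectors of numbers" mentioned above can be sketched with a toy word-level tokenizer. This is purely illustrative (the names `vocab` and `tokenize` are made up for this sketch): real GPT models use byte-pair encoding over subword units, which is considerably more involved.

```python
# Toy tokenizer: map each word to an integer ID, which is the kind of
# numeric input a language model actually consumes. Unknown words fall
# back to a catch-all <unk> ID. (Illustrative only; not GPT's BPE.)
vocab = {"who": 0, "is": 1, "to": 2, "say": 3, "<unk>": 4}

def tokenize(text):
    # Lowercase, split on whitespace, look up each word's ID.
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

print(tokenize("Who is to say"))   # [0, 1, 2, 3]
print(tokenize("Who is ChatGPT"))  # [0, 1, 4] -- "chatgpt" is unknown
```

The philosophical point being debated is whether a system whose only input channel is such integer sequences could, in principle, support consciousness.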

ChronoPsyche t1_j34yu96 wrote

> I just don't think you have arrived at any meaningful reasons that it couldn't be conscious.

I don't need to arrive at meaningful reasons why it couldn't be conscious. The burden of proof is on the person making the extraordinary claim. OP's proof for it being conscious is "because it says it is".

Also, I'm not saying it can't be conscious as I can't prove a negative. I'm saying there's no reason to believe it is.

0

myusernamehere1 t1_j34zt78 wrote

True, and I agree for the most part. Yet you opened with, and provided, other arguments for why you think it is not conscious, none of which hold up to scrutiny. I am just arguing against those claims.

2

ChronoPsyche t1_j350ri7 wrote

I mean, they do hold up to scrutiny. We have no reason to think that a probability model that merely emulates human language, and doesn't have any sensory modalities, could be sentient.

That's not an airtight argument because, again, I can't prove a negative. But the definition of sentience is "the capacity to experience feelings and sensations," and ChatGPT absolutely does not have that capacity, so there's no reason to think it is sentient.

0
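The "probability model that merely emulates human language" can be illustrated with a toy bigram model (a drastic simplification of ChatGPT's transformer; the names `counts` and `next_token_probs` are made up for this sketch). It only tracks which token tends to follow which in its training text, with no sensory input of any kind, which is the crux of the argument above.

```python
from collections import Counter, defaultdict

# Tiny "training corpus" of tokens.
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each token follows each other token (bigram counts).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token_probs(prev):
    # Normalize the follow-counts for `prev` into a probability distribution.
    total = sum(counts[prev].values())
    return {tok: n / total for tok, n in counts[prev].items()}

print(next_token_probs("cat"))  # {'sat': 0.5, 'ran': 0.5}
```

Sampling repeatedly from such distributions produces plausible-looking text without the model "experiencing" anything; whether that observation settles the sentience question is exactly what the two commenters dispute.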

myusernamehere1 t1_j352wrq wrote

Sentience is the ability to have "feelings." These do not have to be similar to the feelings we humans understand; they could be entirely alien to our experiential capabilities. The ability to interpret text prompts could be a sort of sensory modality. And I'd argue that the way the human brain operates can be abstracted to a complex "probability model." It is very possible that consciousness itself is "simply" an emergent property of complex information processing.

Have you seen the paper where a researcher hooked up a rat brain organoid to (in simple terms) a brain chip and taught it to fly a plane in a 3D simulated environment? Or, more recently, where a human brain organoid was taught to play Pong? These organoids had no ability to sense their environment either, and both may very well have some limited level of sentience/consciousness.

2

ChronoPsyche t1_j353a24 wrote

Nothing you're saying is relevant. Anything could be possible, but that isn't an argument against my claims. My keyboard could have strange alien sensory modalities that we don't understand. That doesn't make it likely.

1

myusernamehere1 t1_j354tls wrote

Well, I disagree with everything you just said and find the keyboard analogy humorously off-target. My argument is not "anything is possible."

2

ChronoPsyche t1_j355cjq wrote

What is your argument then? You haven't actually stated an argument, you've just told me mine is wrong.

1

myusernamehere1 t1_j355jto wrote

My argument is that your arguments aren't valid, lol.

2

ChronoPsyche t1_j355vbk wrote

"I agree with your conclusion but I just thought id point out that your arguments are bad". Lol that's rather pedantic but okay. You do you.

1

myusernamehere1 t1_j356iy7 wrote

Well, I saw a bad argument (or a few), and I pointed it out and explained my reasoning. Not sure why that's a bad thing; I think it promotes educated discourse.

2