
Xavion251 t1_jabq7pa wrote

Well, I also share most of my DNA with other humans. They look roughly like me, act roughly like me, and work biologically the same way I do.

So it's a far more reasonable, simple explanation that they are conscious just like I am. To a somewhat lesser degree, this can extend to higher animals as well.

But an AI that acts conscious still differs from me in clear ways, both in how it works and in how it came to be. So I would place the odds significantly lower that it is really conscious and isn't just acting that way.

That said, I would still treat it as conscious, to be on the safe side.

1

TKAAZ t1_jac5w7z wrote

You are literally a bunch of signals. So is an "AI" existing in a bunch of silicon. Nothing (so far) other than our assumptions precludes consciousness from existing in other types of signals.

As for your argument, you seem to claim that "since other humans look like you, they must be conscious", and then conclude that this implies "entities that do not look human are not conscious".

I may agree with the first, but it does not entail the opposite direction, and hence it cannot be used here. It's like saying "if it rains, the street is wet" and then concluding "if the street is wet, it must have rained".

2

Xavion251 t1_jac7mgs wrote

>You are literally a bunch of signals. So is an "AI" existing in a bunch of silicon.

Putting that assumption aside (problematic IMO, as I'm a dualist) and simply granting that it is true: human brains use different kinds of signals, generated in different ways. Does that difference matter? Neither you nor I can prove it either way.

>As for your argument, you seem to claim that "since other humans look like you, they must be conscious", and then conclude that this implies "entities that do not look human are not conscious".

This is reductive. I'm not talking about superficial appearance; I wouldn't conclude that a picture of a human is conscious, for example.

But I would conclude that something that, by all measures, works, behaves, and looks like me (both inside and out, at every scale) is probably also conscious like me.

It would be rather contrived to suggest that, in a world of 7 billion creatures like me (and billions more, the animals, that are roughly like me), everyone except me merely looks and acts conscious while I alone am truly conscious.

>I may agree with the first, but it does not entail the opposite direction, and hence it cannot be used here. It's like saying "if it rains, the street is wet" and then concluding "if the street is wet, it must have rained".

No, because we can observe the street being wet for other reasons. We can't observe consciousness at all (aside from our own).

1

TKAAZ t1_jaclivj wrote

>Does that difference matter? Neither you nor I can prove it either way.

I did not say it did or did not; I am saying you cannot preclude that it does, which is what the article OP claims. It seems to me you are inadvertently agreeing with this. My main point was to refute the OP's claim that

>As far as I can tell, we haven’t been able to prove that brain complexity = consciousness. Meaning, there is more to consciousness than the complexity of a neural network.

as their observation of a "lack of proof" does not imply that conclusion. Furthermore, you mention

>No, because we can observe the street being wet for other reasons. We can't observe consciousness at all (aside from our own).

Again, I think you misunderstand my point; my example was just an analogy for why the conclusion you arrive at is incorrect at a logical level. You claim that 1) you are conscious, and 2) "because others look like you (subject to some likeness definition you decided upon), they are likely to be conscious". Fine. However, this does not imply the conclusion you try to show, i.e. that 3) "someone who is (likely to be) conscious must look like me (subject to the likeness definition you decided upon)". This sort of reasoning is a fallacy at its core; 3) is a non sequitur from premise 1) and assumption 2) at a logical level. You are basically claiming that it must rain because the street is wet. It's extremely common for people to make these mistakes, however, and unfortunately it makes discussing things quite difficult in general.
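To make the structure explicit, here is a minimal sketch in propositional logic, with P and Q as illustrative labels of my own choosing (P = "X looks and works like me", Q = "X is conscious"):

```latex
% A minimal sketch, assuming P = "X looks and works like me" and
% Q = "X is conscious" (labels introduced here for illustration only).
\begin{align*}
  \text{Assumption (2):}                  \quad & P \rightarrow Q \\
  \text{What follows (contrapositive):}   \quad & \lnot Q \rightarrow \lnot P \\
  \text{What does not follow (converse):} \quad & Q \rightarrow P
\end{align*}
% Inferring Q -> P from P -> Q is the fallacy of affirming the consequent:
% "if it rains, the street is wet" does not yield "if the street is wet, it rained".
```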

2