wow_button t1_j6ogc1e wrote

I like your point about the need for preservation, reaction to stimuli, and the rest, but I'll posit that we can already do all of that with computers. 'Need for preservation' is an interesting phrase, because I can create an evolutionary algorithm that rewards preservation. But 'need' implies desire, and we have no idea how to make a computer program desire anything. 'React to outside stimuli' - this can be emulated on a computer, but there is nothing with any sense of 'outside' and 'inside'. 'Others as necessary' - see the previous point for the problem with 'others'. 'Necessary' is also problematic, because it implies desire or need.
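For instance, here is a minimal sketch (hypothetical, not from any real system) of an evolutionary algorithm that rewards preservation - the selection step literally is the "reward", and nothing in it desires anything:

```python
import random

def evolve(pop_size=20, generations=50, seed=0):
    """Evolve a population of agents toward higher 'durability'.

    Each agent is just a single durability gene in [0, 1]; selection
    keeps the most durable half each generation. This is pure
    arithmetic -- no agent wants to survive.
    """
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: fitness == durability. This is the whole "reward
        # for preservation" -- a sort followed by a slice.
        survivors = sorted(population, reverse=True)[:pop_size // 2]
        # Reproduction with small Gaussian mutation, clamped to [0, 1].
        children = [min(1.0, max(0.0, g + rng.gauss(0, 0.05)))
                    for g in survivors]
        population = survivors + children
    return population

pop = evolve()
print(sum(pop) / len(pop))  # mean durability climbs toward 1.0
```

The population reliably "learns" to preserve itself, yet every step is a mechanical transformation of a list of numbers.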

If you can teach me how to make a computer program feel pain and pleasure, then I agree you can create AI that is sentient. If you can't, then no matter how interesting, complex, or seemingly intelligent the code's behavior, I don't see how you can consider it conscious.

0

wow_button t1_it9joo8 wrote

Well said - my reasoning above is why I'm so drawn to Analytic Idealism. I can't get past my own experience with programming to make the leap that there is some magic number of logic gates, memory, and complex processing that emerges into consciousness. Materialism kind of dictates that that must be the case. Panpsychism also appealed to me (consciousness is fundamental to the material world), but Analytic Idealism scratches that itch in a much more satisfying way. Ultimately I guess I'm skeptical that a pure materialist perspective will grant us the insights into consciousness necessary to create a compelling AI. Thanks for the article and the convo!

1

wow_button t1_it98z33 wrote

Yeah, it's analogous to the black box problem - that's a good point. But what I'm saying is that computers are demonstrably a mechanistic black box. I get that maybe that's controversial? But that is literally what computers do. I've read arguments like Tononi's IIT, but the whole 'when it's complex and integrated, consciousness happens' does not convince me (though my understanding is admittedly shallow).

I can create a computer program that capitalizes all of the letters or words you type with a few lines of code. Does any part of the computer understand what it's doing? No - the same way a see-saw does not understand what it's doing when you push on the high end and it comes down and the other side goes up. The computer is a mechanistic, deterministic machine that happens to be able to do some really cool and complicated stuff.
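The program I mean really is this trivial (a Python sketch, just to make the point concrete):

```python
def shout(text: str) -> str:
    """Capitalize everything you type -- a pure, mechanical string
    transformation. No part of this understands anything."""
    return text.upper()

print(shout("does this program understand me?"))
# DOES THIS PROGRAM UNDERSTAND ME?
```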

All other computer programs, including the most sophisticated current AI, are just more complicated versions of my simple program.

1

wow_button t1_it8sogx wrote

Right - but you're missing my point. That super-fast computer would be doing exactly the same thing that the XKCD comic does with rocks, just faster. It's Turing complete, so it can do everything that is possible to do with any conventional computer. But it's obvious that there is no consciousness or feeling in the pattern of rocks.

What I'm saying is that if we build AI, it will be because we created a certain configuration of matter that registers feelings, not because we've written code. Code could pretend to feel, but not feel.

1

wow_button t1_it8bnrn wrote

Agree - but that's my point: now you are not creating artificial intelligence in the form of a program that can run on any computer, you are building artificial life - the 'network of physical processors' would be where feelings reside.

To grok my point about what a computer is, watch this: a computer made of people. Or this: https://xkcd.com/505/

How would you write a program, running on that computer, that had feelings? And before you object that it's too simple - all computers are just faster, more complicated versions of this.
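To make that substrate-independence point concrete, here's a sketch (mine, not from the comic): any computation reduces to compositions of a gate like NAND, which is nothing but a lookup table - implementable in silicon, in people holding flags, or in rows of rocks.

```python
# A NAND gate as a four-entry lookup table. NAND is functionally
# complete: every digital computation can be composed from it.
NAND = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def xor(a: int, b: int) -> int:
    """XOR built purely from NAND lookups -- no understanding anywhere,
    just table entries pointing at other table entries."""
    n1 = NAND[(a, b)]
    return NAND[(NAND[(a, n1)], NAND[(b, n1)])]

print(xor(1, 0))  # 1
print(xor(1, 1))  # 0
```

Whatever physical medium holds that table, the computation is identical - which is exactly why the medium doing the lookups can't be where feeling comes from just by virtue of doing them.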

1

wow_button t1_it6aaag wrote

To me this illustrates the hard problem of consciousness. It's not smart systems that we can't create, but as a coder, how could I write something that feels? It's not possible. I could write something that mimics feeling, but there is no innate feeling in the code we write. Maybe panpsychists would disagree, but even then we're creating artificial life - making the substrate conscious of its feelings. It would not be part of the code, but part of the physical world. This is part of what makes Analytic Idealism an appealing explanatory metaphysics for me.

You can create a computer out of the dumbest possible material - see the novel The Three-Body Problem, or the XKCD comic where a computer is built out of rocks. How you get from there to feeling seems too great a leap.

2