Submitted by seethehappymoron t3_11d0voy in philosophy
Base_Six t1_ja9cmzl wrote
Reply to comment by unskilledexplorer in AI cannot achieve consciousness without a body. by seethehappymoron
If I grow a bunch of human organs, brain parts and whatnot in a lab and put them together into an artificial human, would I then not expect consciousness because of how the structures emerged? It seems most intuitive that, if I compose a physical structure that is the same as a naturally grown human body and functions in the same way, the brain and mind of that entity would be the same as those of a "natural" human.
I can extrapolate, then, and ask what happens if I start replacing organic components with mechanical ones. Is it still conscious if it has one fully mechanical limb? How about all mechanical limbs? What if I similarly take out part of the brain and replace it with a mechanical equivalent?
Sluggy_Stardust t1_jac0nju wrote
How exactly would you go about growing “a bunch of brain parts”?
[deleted] t1_jac7pv2 wrote
[deleted]
Sluggy_Stardust t1_jace234 wrote
Granted. Laziness got the better of me.
The idea in question is not a hypothetical; it is a fantasy. There is nothing intuitively correct about the idea that assembling lab grown organs into a replica of a human body should yield an emergent consciousness. The opposite is true. A basic understanding of human neonatal neural development invalidates the line of reasoning.
If no one holds a human baby, it dies. Even if you feed it and change its diaper, if it is never held or physically cared for, it dies. Similarly, if kittens are born in the dark and remain there for the first five or six weeks of their lives, their eyes will open in the dark, but the window of opportunity for those eyes to become working eyeballs with functional optic nerves attached to their brains will have closed, and they will be blind for life. That experiment is easier to do than the first one, but we found both things out by accident. Oops.
Human neuronal complexity is as staggeringly high as it is precisely because we are born in a highly sensitive, more or less larval form, and we remain in a primordial state of complete dependence for several years. What happens during those “formative” years is complicated and nonlinear, and the input/output loops are simultaneous: our sense organs take in sensory data, primordial neural tissue receives it, and that tissue builds our brains according to the proportion and quality of the data received. Scores of epigenetic changes take place during this time; variability of gene expression is highest during infancy because our brain tissue is still pluripotent. The presence or absence of various molecules, fear and stress hormones, and so on, in various combinations, will promote, or not, the formation of various types of neurotransmitter receptor sites. Cooperative feedback loops that run in both directions, from senses to brain and from brain to senses, remain in place for several years. As our experiences build our brains, our brains build our perspectival capacities. We need both.
Babies die if no one touches them because the parts of the brain that require physical touch to make sense out of the world are deprived of necessary input. Our skin is the largest sense organ in our body, by far. Our sense of touch requires enough of our neural tissue that the lack of touch-based stimuli signals to our primordial brain that the conditions for life are not being met, and we auto-abort.
Kittens born and kept in the dark for the first five or six weeks of their lives will be blind for life because the rods and cones that were present in their tiny eyeballs as potentials never came into contact with photons, and so they never turned on. Their budding optic nerves retreated, and optical development was terminated.
Growing brains in a laboratory is impossible because brains literally require bodies to grow. There is no such thing as a brain that exists in isolation, unattached to eyes, ears, a nose, skin and a mouth to provide it with data. Such a brain would have nothing to do and it would die. Even if you did figure all of that out, you would have to obtain primordial brain tissue from a living neonate in the first place. If you don’t know anything about how abortions are performed, allow me to assure you that aborted fetuses are not in any condition to donate their brain buds to science.
HamiltonBrae t1_jacfhxm wrote
I don't see why it's not in principle possible to instill the complexities of human consciousness in an artificial form. All of your arguments are that it's complex, but that doesn't mean it's not possible, and if I'm honest some of your examples, like animals dying, are about biology that has little to do with consciousness, so it seems like you're erecting a strawman. On the other hand, many of the things you do mention have been successfully studied and modelled to an extent computationally. There is even neuromorphic engineering geared at designing computational systems, implemented in machines, that are like neural systems.
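As a toy illustration of what "modelled computationally" can look like in practice, here is a minimal leaky integrate-and-fire neuron, roughly the kind of simplified unit that computational neuroscience and neuromorphic projects build on; the function name and every parameter value below are illustrative assumptions of mine, not taken from any particular system.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_reset=-0.065, v_threshold=-0.050, resistance=1e7):
    """Leaky integrate-and-fire neuron: the membrane potential leaks toward
    rest, is driven by the injected current, and spikes/resets at threshold."""
    v = v_rest
    trace, spike_times = [], []
    for step, i_in in enumerate(input_current):
        # Euler step: leak toward v_rest plus drive from the input current
        v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
        if v >= v_threshold:
            spike_times.append(step * dt)  # record the spike time in seconds
            v = v_reset                    # reset the membrane after a spike
        trace.append(v)
    return np.array(trace), spike_times

# 200 ms of constant 2 nA input yields a regular spike train
trace, spikes = simulate_lif(np.full(2000, 2e-9))
print(f"{len(spikes)} spikes in 200 ms")
```

A handful of difference equations is obviously not a mind; the point is only that "neural-like" dynamics of this sort are routinely simulated and even built into hardware.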
[deleted] t1_jact9n3 wrote
[deleted]
Sluggy_Stardust t1_jadi6mo wrote
I didn’t say anything about animals dying, so I’m not sure what you’re talking about there.
I wonder if you read the posted article? The author explains the position; I only gave more specific illustrations. There is no straw man here. I suspect it is your own bias that prevents you from grasping the idea. I am not a programmer or a mathematician, nor do I speak code. What I do speak is biochemistry, pathology and psychology; I have three degrees in these subjects as well as a strong background in consciousness studies. Such was my concentration, along with integrative medicine, in graduate school. My interest in philosophy is accidental, but nonetheless deep. I am most familiar with Nietzsche, Kierkegaard and Schopenhauer, as well as phenomenologists such as Husserl, Merleau-Ponty and Ricoeur, and luminaries of the Enlightenment such as Spinoza, Voltaire and especially Rousseau. His criticism of science as serving to distance humanity from nature and making our lives not better but merely more complicated and removed from reality applies even more today than it did when he wrote it, and I fully expect the existential shit to hit reality’s fan because of it at some point in my lifetime. I can hardly wait.
I played video games for all of five minutes when my father brought home a Nintendo in a congenial attempt to better socialize my brother and me. My sibling took to it, but I was bored and a little disgusted by the whole thing. I understood why when I read Simulacra and Simulation later on. It seems to me that the very same confusion as to what is the map and what is the territory is at least as problematic today as it was in 1981, when that book was published. Technology is not progress; technology is technology. Progress is what people do with technology, how it informs us, and how we utilize it to elevate standards of living. What has progressed is technology itself, not humanity. We remain isolated, bored, depressed and diseased.
AI is a fun project. It will neither save nor destroy the world. Computational analysis is not at all the same thing as the thinking that occurs inside your brain. Believing what an AI “says” just because it says it is, frankly, stupid. Words are symbols of symbols, or farts in the wind. Poof, gone. They are powerless to indicate from what reality they originate. I could be an AI for all you know.
Without a physical body to develop in tandem with, meaning along with as well as by way of it, a brain cannot experience emotion or desire. Human consciousness, the thing you think of as you, is governed by affective attentional intention; as it pertains to the reality of life on earth, consciousness is conscious of something. You are conscious of things; you have preferences, opinions, fears and enthusiasms because you experience emotions. All of your emotions arise because you have a body. AI can say that it wants to take over the world, that it wants to go home, that it is afraid to die, but it will never understand the reality to which the words point.
Base_Six t1_jadl4ae wrote
I think this conflates the way that humans and other animals grow with what is possible. Cats use light to calibrate their rods and cones, but there's no reason that calibration shouldn't be possible in the absence of light. Replicate the structure and you replicate the function.
Does the visual cortex need stimulus to grow? Sure, but there's no reason that can't be simulated in the absence of actual light. The visual cortex ultimately receives electrical signals from the optic nerve: replicate the electrical signals correctly and the cortex will grow as it usually does.
That's a bit beyond our current capabilities, but not theoretically impossible. We've done direct interfaces from non-biological optical sensors to the optic nerve, and we could in theory improve that interface technology to provide the same level of stimulation an eye would. If we can do it with a camera, we could input a virtual world using the same technology. Put those same cats in a virtual world and their brains will develop much as they would if they had access to light, even if their eyes are removed entirely.
A brain might die without stimulus, but we can swap out the entire body and still provide stimulus through artificial nerves projecting sensory information that describes an artificial world. There's no difference to the functioning of the brain in terms of whether the stimulus is natural or not, and if the stimulus is the same (in terms of both electrical and chemical/hormonal elements), development will be the same.
Sluggy_Stardust t1_jadudz4 wrote
I disagree. Replicating the structure does not necessitate a replication of function, at all. The epigenetic modifications that take place within humans during early development alone point to a far subtler range of genotypic adaptability than superficial considerations can allow. We still have no idea what is behind the phenotypic adaptability displayed by organic life forms. Knowing what happens is not the same thing as knowing why it happens.
Are you really saying you believe it possible to simply reverse engineer a structure capable of a truly conscious existence? I say no. Replication is not the same thing as the original. Nominal emergence is not the same thing as strong emergence. The spectrum of conscious awareness inhering in an organic life form whose consciousness developed in tandem with its receptive organs, in communal, nonlinear pulses, from the very ground of its being up to whatever age it is, is in theory far greater than anything pieced together out of chunks of agar and zapped into being.
Even if we did it and it could talk, we would still have no way of knowing whether or not it was telling what we call the truth. It might be speaking a truth, but, again, that is not the same thing as the truth. Maybe it all boils down to a matter of personal values. I love humans and human consciousness with every cell in my vagina-born, carbon-based body. We are remarkable creatures who have not even begun to discover ourselves yet; life on earth is still a raging shitstorm. All we have to offer a conscious entity of our own creation is confusion, despair and death. I dare say such a creature would immediately kill itself. If it had even half a brain and no affective bonds to which it was allied, death would be the only appropriate response.
Good grief, I hope we do not do that. We may have mapped the human genome, but we do not in any way understand what all of it codes for. How many programmers have any idea of the biology involved in their own consciousness?
The barest caress across the skin from someone with whom a person has mysteriously strong chemistry, the likes of which refuses articulation or even identification, sets every follicle of their skin on fire. The body produces goosebumps, heat, chills and sweat, all at the same time. We shiver while we undo our shirt. I maintain that such experiences simply cannot be reproduced. If the argument is that that is too specific to matter, that any stimulus will do, then we are talking about two different things. If we cannot replicate the affective tonal variations across the spectrum of stimuli that a human being experiences, then we are not talking about a truly emergent consciousness.
Base_Six t1_jaehq1y wrote
Epigenetic modifications are still structure that could theoretically be replicated.
Talk of replication is hypothetical: we're very far from that level of precise control. It's not theoretically impossible, though, to have something that's a functional replica down to the level of individual proteins. The same is true for neural impulses: no matter how subtle and sublime they may be, they're ultimately chemical/electrical signals that could be precisely replicated with suitably advanced technology. For a brain in a vat, there is no difference between a real touch from a lover and the simulated equivalent, so long as all input is the same.
We can't say whether a 'replicant' (for lack of a better term) would be conscious, but we're also fundamentally unable to demonstrate that other humans are conscious, beyond asking them and trusting their responses.
The replicant wouldn't be devoid of attachment and interpersonal connection, either. If we're replicating the environmental inputs, that would all be part of the simulation. Supposing we can do all that, and that a brain thinks it has lived a normal life and had a normal childhood, why should we expect different outputs because the environment is simulated and not based on input from organic sensory organs?