Submitted by AUFunmacy t3_10pwt44 in philosophy
AUFunmacy OP t1_j6njsgh wrote
Reply to comment by PsiVolt in The Conscious AI Conundrum: Exploring the Possibility of Artificial Self-Awareness by AUFunmacy
As a neuroscience major who is currently in medical school and someone with machine learning experience (albeit not as much as you) - I respectfully disagree.
Let's assume we have 2 hidden layers in a neural network structured like this: input layer n=400, first hidden layer n=120, second hidden layer n=30, output layer n=10. The number of neural connections in this network is 400*120 + 120*30 + 30*10 = 51,900. A network like this could already do some impressive things if trained properly. I read somewhere that GPT-3 (the recent, very similar predecessor to ChatGPT, which is only slightly optimised for "chat") uses around 175 billion neuronal connections, and GPT-4 will reportedly use 100 trillion.
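As a quick sanity check, the connection count for fully connected layers of those sizes can be computed directly (this sketch ignores bias terms, which would add one extra weight per neuron):

```python
# Layer widths from the example: input, first hidden, second hidden, output
layer_sizes = [400, 120, 30, 10]

# Each weight matrix between adjacent layers has (fan_in * fan_out) connections
connections = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
print(connections)  # 400*120 + 120*30 + 30*10 = 51900
```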
Now, the human brain also has around 100 trillion neuronal connections, and not even close to all of them are used for thought, perception or experience - "conscious experience". I know that counting neuronal connections is a poor way to measure a neural network's performance, but I wanted some way to compare where we are with AI relative to the brain. So we are not yet at the stage where you would even theorise that AI could pass a Turing test - but what about when we increase the number of connections these neurons can communicate through by roughly 500 times? You approach, and I think surpass, human intelligence. At that point, an AI will probably do better at any intellectual task.
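The "500 times" figure follows from the numbers quoted above (the 175 billion and 100 trillion figures are the comment's own claims, not verified specs):

```python
gpt3_connections = 175e9   # reported GPT-3 figure cited above
gpt4_rumored = 100e12      # rumoured GPT-4 figure cited above

# Scale factor between the two, roughly the "500 times" in the text
scale = gpt4_rumored / gpt3_connections
print(round(scale))  # ~571
```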
I simply think you are naive if you believe AI won't replace humans in a number of industries, in a number of different ways, and to a large extent. Whether Artificial Intelligence will gain consciousness is a question you should ask yourself as an observer of the Earth, watching single-celled organisms evolve into complex and intelligent life. At what point did humans - or, if we weren't the first, our ancestor species - gain their consciousness? The leading biological theory is that consciousness is a phenomenon that emerges from highly complex brain activity and is merely a perception. So who is to say that AI will not evolve the same consciousness that we did? That they are bound by their programming doesn't rule it out, just as we are always bound by physics - maybe they will have a subjectively conscious experience.
Edit: I will note that I have left out a lot of important neuroanatomy that would be essential to explaining the difference between a neural network in an AI and one in a brain. But the take-home message is that the machine learning comparison is not a far-fetched take whatsoever. It is important to drive home, though, that software cannot come close to the physical anatomy that neuroscience describes.
RanyaAnusih t1_j6nlgk7 wrote
Only an understanding of quantum theory has any hope of explaining consciousness. Complexity in networks most likely will not solve the issue.
Life is taking advantage of quantum processes at a fundamental level.
The current model of neuroscience is also misleading. Some kind of enactivism must be considered
TheRealBeaker420 t1_j6o8tkl wrote
People have said that for decades, but I think it's been pretty thoroughly refuted. Here are some good links: