Submitted by sugaarnspiceee t3_11d6zdv in Futurology
Lirdon t1_ja71rfg wrote
You’re assuming that the AI would possess a kind of consciousness that is recognizable to you. That will almost certainly not be the case, unless it is specifically designed to mimic the human psyche, or by some improbable miracle it just spontaneously develops one, maybe through the process of deep learning.
But excluding those possibilities, it is very likely that the AI’s intelligence will be nothing recognizable to us. It might not interact with anything visual at all; it could be purely process-based, with no understanding of the physical world, where everything it can interact with is a software module.
I personally don’t think that an AI gaining consciousness will be an automatic threat to our survival; it depends on its role, authority, connectivity and function. It may never develop a self-preservation imperative, where it tries to identify threats to itself, or an imperative to optimize its surroundings, where we might be a nuisance to it.
In any case, it won’t likely be like us, able to love or care for us.
superjudgebunny t1_ja7cnlb wrote
My issue is: how would one replicate the endocrine system, the area where emotions come from? IMO that would be the hardest thing to do. “Feeling” is hard to conceptualize.
Lirdon t1_ja7epok wrote
Yeah, I don’t think there can be an AI with emotions like ours, which undermines the whole assumption that it might like us and care for us. There are whole pathways in the brain that get stimulated electrochemically by the endocrine system, and those just don’t exist in an electronic system.
Again, I don’t think AI consciousness would even be recognizable to us. We just don’t know how it would look and behave.
It might never develop some more organic tendencies. Why would it ever decide it needs to perpetuate itself, keep itself alive?
superjudgebunny t1_ja7g32n wrote
I’m curious as well. I could see, sometime down the road, the Star Trek idea: a positronic brain, though built with the technology we have. I would think more of a quantum brain, but that’s so far away it’s laughable. So with what we can do, I’m extremely curious as to what it would become.
We assume it will have a motive, why? Our drive is organic, the need to further the species. What does a mind without ANY emotion need or want?
I’m not sure we can even comprehend what the singularity will be like. I feel like we are very close. Often wonder if we will even know when it happens. It’s a confusing idea personally.
UniversalMomentum t1_ja7t8m1 wrote
If we program human emotions into a big dataset and keep crunching the algorithms, the result should be something that mimics human emotions so well you can’t tell the difference.
We can argue about whether it really FEELS or not, but from our perspective it should be able to easily mimic all human behavior convincingly. Humans are not THAT complex; we tend to act very similarly, so we won’t be that hard to mimic.
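A toy sketch of that idea, where everything is a made-up miniature stand-in (the six-example dataset, the labels, the crude word-overlap similarity) for a real large corpus and model: given enough labeled examples of human emotional expression, a learner can reproduce the mapping well enough to look emotional from the outside.

```python
from collections import Counter

# A miniature "dataset" of human utterances tagged with the emotion
# a human annotator assigned to them (all invented for illustration).
DATA = [
    ("i miss you so much", "sadness"),
    ("this is the best day ever", "joy"),
    ("how dare you do that", "anger"),
    ("i can't wait to see you", "joy"),
    ("everything feels hopeless", "sadness"),
    ("you broke your promise again", "anger"),
]

def bag_of_words(text: str) -> Counter:
    """Represent text as word counts, the crudest possible features."""
    return Counter(text.lower().split())

def overlap(a: Counter, b: Counter) -> int:
    """Similarity = number of shared word occurrences."""
    return sum((a & b).values())

def mimic_emotion(text: str) -> str:
    """Label new text with the emotion of the most similar known example."""
    features = bag_of_words(text)
    _, label = max(DATA, key=lambda pair: overlap(bag_of_words(pair[0]), features))
    return label

print(mimic_emotion("i really miss my old friends"))  # sadness
```

Scale the dataset up by a few billion examples and swap the word-overlap for a learned model, and you get the "can't tell the difference" behavior described above, whether or not anything is felt underneath.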
UniversalMomentum t1_ja7t1ov wrote
The same way you do everything with machine learning: you provide it with a ridiculously large dataset to build a suitable algorithm from. You don’t have to understand every aspect of something, because you’re using evolution, not hand-crafting every piece of code. It’s just machine-learning-driven digital evolution instead of good old biological evolution.
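A minimal sketch of that kind of digital evolution, with an invented objective: candidates are just bit strings, fitness is a stand-in for "behavior we want", and nobody hand-crafts the winning solution; it emerges from variation and selection.

```python
import random

random.seed(0)

TARGET = [1] * 20           # stand-in for the behavior we are selecting for
POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 60, 0.05

def fitness(genome):
    """How close this candidate is to the desired behavior."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    """Random variation: occasionally flip a bit."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    """Recombine two parents at a random cut point."""
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

# Start from pure noise; no candidate is designed by hand.
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Selection: the fitter half become parents of the next generation.
    parents = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
    population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best))  # typically at or near 20 after enough generations
```

The point of the sketch is the one made above: the final genome was never specified by anyone, which is also why evolved systems can carry features their creators never designed or fully understand.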
superjudgebunny t1_ja7up3e wrote
:/ it’s not that simple. We still use the olfactory sense. This is hard to represent digitally and doesn’t translate well logically. How do you program drive? Also called will, the will to live. Pain? These things are all deeply interconnected; bio signals aren’t simple.
Where do you start? How would you give AI the idea of empathy? You still have to provide an input. WHAT would that be? This is the hard question.
Musk has hinted at this with the brain implant. We would need an interface that can translate these things. But then you’re just imprinting the human response; might as well build an organic computer…
The reason it’s so hard is the same reason describing human emotions is difficult. What is love, without using love as a description? What’s love vs. infatuation? Philosophically speaking, we cannot easily do this.
UniversalMomentum t1_ja7suyc wrote
AI will be evolved through machine learning cycles too, not just hand-made; it will have components and features that were never designed at all. I don’t think we will have much certainty about what we are creating at first, and we will still lack short- and long-term control over the outcome of this artificial evolution.
Rather than hand-crafting digital life, we are evolving it, which means a lot of it is still somewhat out of our direct control and understanding.
Lirdon t1_ja7t4hc wrote
Exactly my point: we will likely not be able to recognize that consciousness; it would be different from anything we understand.