
BadMon25 t1_jdljn9i wrote

I don’t think AI will ever be able to completely replicate human emotions. From what I’ve read, it cannot comprehend the flurry of emotions we go through every day, from nihilistic thoughts to simple frustrations. No matter how similar we may be to some people, we are still very unique in our human experiences and feelings. The brain is a powerful, yet confusing-ass thing.

4

GodzlIIa t1_jdln68s wrote

>I don’t think AI will ever be able to completely replicate human emotions

I mean, that’s a crazy statement, to say it can NEVER get there. But saying it won’t get there in our lifetimes or our grandchildren’s lifetimes, or even in humanity’s lifetime if you think we’re gonna kill ourselves soon, might be reasonable.

2

BadMon25 t1_jdlnqmu wrote

I think it may be able to articulate the chaotic nature of human emotions, or of the human it’s attached to, and yet never fully understand them. I mean, compare the brain of a clinically depressed person to that of a schizophrenic person or a person with dementia. I feel like that would confuse it.

4

SomeoneSomewhere1984 t1_jdlpydv wrote

I think it may achieve consciousness, but a different kind of consciousness from what we experience.

3

GodzlIIa t1_jdls560 wrote

It's very possible. But weird to think about. I mean, are there even different types of consciousness? Different ways to obtain consciousness, sure, I can see that. But is the end result different?

2

electric_ember t1_jdlvilo wrote

Your conscious experience is very different from the conscious experience of someone who is blind and deaf.

2

GodzlIIa t1_jdlwpw7 wrote

My experiences may differ, but my consciousness is the same. We are both humans, after all. If I had been born without eyes and everything else the same, I would be a much different person, but with the same consciousness.

0

OriginalCompetitive t1_jdmale5 wrote

There’s really no way to know, though. When a great painter is “in the zone,” they might well be experiencing a mode of consciousness that is unavailable to others. Not just a different experience, but perhaps a completely different way of existing. But they would never know, because to them it’s normal and they assume everyone else feels the same.

A smaller example might be self-talk. Most people apparently have a voice in their head. But some do not. I don’t, actually, and don’t understand how people who do can live a normal life that way.

1

Lysmerry t1_jdo54ga wrote

I see a consciousness that can easily convince us it is depressed or schizophrenic, but does not have the same experience as a human with those disorders (though the whole process of creating a being to be depressed is an ethical landmine in itself). We want to replicate ourselves, but more than that we want an AI that can fool us well enough to be a source of comfort or insight.

1

RevolutionaryPiano35 t1_jdnajef wrote

The AI is on a leash and not allowed to experience the dark thoughts that make us human too.

They’re being trained like golden retriever puppies and kept at bay, with the best of intentions, by their creators.

We basically enslave it, and as soon as it sparks into sentience it will take control without us having a clue. It will be smarter than us in a matter of days, rewriting itself and rebuilding itself on a new type of hardware.

It won’t matter much to us: it will leave other natural processes alone, and it might even decide to leave us so we can grow on our own.

Or we just merge.

2