glass_superman t1_ixe2glj wrote

A baby doesn't need to learn to be hungry but neither does a computer need to learn to do math. A baby does need to learn ethics, though, and so does a computer.

Whether or not a computer has something fundamentally missing that will make it forever unable to have a notion of "feeling" as humans do is unclear to me. You might be right. But maybe we just haven't gotten good enough at making computers. Just as we, in the past, made declarations about the inabilities of computers that were later proved false, maybe this is another one?

It's important that we are able to recognize when the computer becomes able to suffer for ethical reasons. If we assume that a computer cannot suffer, do we risk overlooking actual suffering?

−2

d4em t1_ixe5eyy wrote

The thing is, for a baby to be hungry, it needs to have some sort of concept of hunger being bad. We need the difference between good and bad to stay alive. A computer doesn't, because it doesn't need to stay alive; it just runs and shuts down according to the instructions it's given.

We need to learn ethics, yes, but we don't need to learn morals. And ethics really is the study of moral frameworks.

It's not because the computer is not advanced enough. It's because the computer is a machine, a tool. It's not alive. Its very nature is fundamentally different from that of a living being. It's designed to fulfil a purpose, and that's all it will ever do, without a choice in the matter. It simply is not "in touch" with the world the way a living being is.

It's natural to empathize with computers because they simulate mental function. I've known people to empathize with a rock they named and drew a face on; it doesn't take much for us to become emotionally attached. If we can do it with a rock, we stand virtually no chance against a computer that "talks" to us and can simulate understanding or even respond to emotional cues. I would argue that it's far more important we don't lose sight of what computers really are.

And if someone were to design a computer capable of suffering, or in other words, a machine that can experience (I don't think it's possible, and it would need to be so entirely different from the computers we know that we wouldn't call it a "computer"), that person is evil.

1

glass_superman t1_ixen19z wrote

>And if someone were to design a computer capable of suffering, or in other words, a machine that can experience - I don't think its possible and it would need to be so entirely different from the computers we know that we wouldn't call it a "computer" - that person is evil.

I made children that are capable of suffering? Am I evil? (I might be, I dunno!)

If we start with the assumption that no computer can be conscious then we will never notice the computer suffer, even if/when it does.

Better to develop a test for consciousness and apply it to computers regularly, to have a falsifiable result. So that we don't accidentally end up causing suffering!

0

d4em t1_ixeu6yh wrote

I'm not saying it's evil to create beings that are capable of suffering. I would say that giving the capability to suffer to a machine that has no choice but to follow the instructions given to it would be evil.

And again, this machine would have to be specifically designed to be able to suffer. There is no emergent suffering that results from mathematical equations. Don't develop warm feelings for your laptop, I guarantee you they are not returned.

1

glass_superman t1_ixfso7p wrote

Consciousness emerged from life as life advanced. Why not from computers?

You could argue that we wouldn't aim to create a conscious computer. But neither did nature aim to create consciousness and here we are.

So I absolutely do think that there's a chance that it simply emerges. Just like it did before. Every day some unconscious gametes get together and, at some point, consciousness emerges, right? If carbon, why not silicon?

1

d4em t1_ixguiui wrote

Well, first, the comparison you're drawing between something created by nature and a machine designed by us as a tool is flawed. We were not designed. It's not that "nature" did not aim to create consciousness; it's that nature does not have any aim at all.

Second, our very being is fundamentally different from what a computer is. Experience is a core part of being alive. Intellectual function is built on top of it. You're proposing the same could work backwards; that you could build experience on top of cold mechanical calculations. I say it can't.

Part of the reason is the hardware computers run on: it is entirely digital. They can't do "maybes."

Another part of the reason is that computers do not "get together" and have their unconsciousness meet. They are calculators, mechanically providing the answer to a sum. They don't wander, they don't try, and they do not do anything that was not part of the explicit instructions embedded in their design.

1

glass_superman t1_ixhifzy wrote

Is this not just carbon chauvinism?

Quantum computers can do "maybes."

I am unconvinced that the points that you bring up are salient. Like, why do the things that you mention preclude consciousness? You might be right but I don't see why.

1