
h3yw00d1 t1_j1o41r3 wrote

Can it be made 3-laws safe? (The bot tells me this comment is too short, so here is an outline of Isaac Asimov's Three Laws of Robotics.) The first law is that a robot shall not harm a human, or by inaction allow a human to come to harm. The second law is that a robot shall obey any instruction given to it by a human, and the third law is that a robot shall avoid actions or situations that could cause harm to itself.

1

joyloveroot OP t1_j1oacz8 wrote

Clearly the 2nd law could be at odds with laws 1 and 3, so while I understand the good sentiment behind the 3 laws, perhaps they need a priority matrix; maybe law 1 takes ultimate precedence? But of course ethics get messier in some cases, like the trolley problem, etc…

2

L4ZYSMURF t1_j1op6g8 wrote

I think that's how the 3 laws were originally written: each one is subordinate to the ones above.

2

OtterlyAwesome t1_j1opgyu wrote

They are already in priority order. Always follow law 1 first, then law 2, then finally law 3. There's also a Zeroth Law of Robotics: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

2
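The strict ordering described above amounts to a lexicographic check: evaluate the laws in priority order and let the first violation veto the action, so an order that would harm a human is rejected under law 1 before law 2 is even consulted. A minimal Python sketch of that idea, where the `Action` flags and the violation predicates are hypothetical placeholders rather than anything from Asimov's stories:

```python
# Sketch of the strict priority ordering described above:
# laws are checked in order, and the first violated law vetoes the action.

from dataclasses import dataclass

@dataclass
class Action:
    # Hypothetical flags describing an action's consequences.
    harms_humanity: bool = False
    harms_human: bool = False
    disobeys_order: bool = False
    harms_self: bool = False

# (priority, name, predicate that returns True when the law is violated)
LAWS = [
    (0, "Zeroth Law", lambda a: a.harms_humanity),
    (1, "First Law",  lambda a: a.harms_human),
    (2, "Second Law", lambda a: a.disobeys_order),
    (3, "Third Law",  lambda a: a.harms_self),
]

def check(action: Action) -> str:
    # Lexicographic check: the highest-priority violated law decides.
    for priority, name, violated in LAWS:
        if violated(action):
            return f"Rejected: violates {name} (priority {priority})"
    return "Permitted"

print(check(Action(disobeys_order=True)))  # Rejected: violates Second Law (priority 2)
print(check(Action(harms_self=True)))      # Rejected: violates Third Law (priority 3)
print(check(Action()))                     # Permitted
```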

ChronoFish t1_j1rb8u3 wrote

The 3 laws are a great storyline vehicle... but you won't see them in practice... because they assume choice.

2