
d4em t1_ixdqech wrote

>One last point. You'd be amazed how useful "innocent" incidental data is. Just the expressions on faces or even clothing style and gait may correlate to other data in unexpected ways.

Looking angry on your way home because you got a cancer diagnosis and you're convinced life hates you? The police will now do you the honor of frisking you because you were identified as a possible suspect!

Are you a person of color who recently immigrated? Were you aware immigrants and persons of color are disproportionately responsible for crimes in your area? The police algorithms sure are!

This is an ethical nightmare. People shouldn't be held suspect based on innocent information. Even holding them suspect for a future crime because of one they committed in the past is iffy. There's a line between vigilance and paranoia that's being crossed here.

And neither should we monitor everything out of the neurotic obsession someone might do something that's not allowed. Again, crossing the line between vigilance and paranoia. Like, crossing the line so far that the line is now a distant memory that we're not really sure ever existed. Complete safety is not an argument. Life isn't safe and it doesn't have to be. We all suffer, we all die. There is a need to strike a balance, so we can do other things besides suffering and dying. Neither safety nor danger should control our every second.


bildramer t1_ixi99wd wrote

On the one hand, sure: I want to be free to murder people if I really want, and free of creepy 24/7 observation, and people shouldn't assume things about me even if they're 100% accurate, and I would never trust anyone who wants to put cameras on me while claiming it comes from a desire to reduce murders - let alone lesser crimes.

On the other hand, if we really had a magical technology that allowed us to predict and stop murders with perfect accuracy and without the usual surveillance indignities and risks, it would be criminal not to use it. That hypothetical wouldn't be just another way for the powerful to assert themselves. And the problem with using it for other crimes is mostly that certain actions shouldn't be criminal, i.e. that the law is not lenient enough or not specific enough (perhaps for good reasons). In an ideal world with better institutions, we would resolve such a problem by changing the law.


eliyah23rd t1_ixdzjjc wrote

That might happen, and it's a danger, but it's not the mainline scenario.

Data being collected on facial expressions in the billions is more likely. Then you correlate that with other stuff. Bottom line, it's as if the cameras are installed in the privacy of your home, because mountains of data gathered in public provide the missing data about your private life.

Then you correlate the inferred private stuff with more stuff. That's how you build "Minority Report."


d4em t1_ixe1anb wrote

>Data being collected on facial expressions in the billions is more likely. Then you correlate that with other stuff. Bottom line, it's as if the cameras are installed in the privacy of your home, because mountains of data in public provides the missing data in private.

I would say this constitutes "monitoring everything out of the neurotic obsession someone might do something that's not allowed," wouldn't you?
