MonkeeSage

MonkeeSage t1_j5wzy9p wrote

What prevents a human security guard from misidentifying a woman or person of color when they're using their own eyes, looking at a CCTV screen and comparing against a list of people who aren't allowed in? The AI's potential bias toward better identifying white males comes from training data that was produced and labeled by humans in the first place.

−8

MonkeeSage t1_j5wiku6 wrote

The stuff about it possibly being illegal to ban legal opponents from entry makes sense. This last bit is silly.

> Lastly, research suggests that the Company’s use of facial recognition software may be plagued with biases and false positives against people of color and women.

1.) Nobody is being banned based on facial recognition alone. Human security guards contacted the woman and confirmed her identity. 2.) People in general are biased, including the police in your state, so by that logic they can't have security guards or police either?

−15

MonkeeSage t1_iz482p9 wrote

2

MonkeeSage t1_iyhiqdx wrote

That's kind of how it works, though: unanimous consent requires...unanimous consent. And it's only needed to expedite the process. The default is a full vote and confirmation, where everyone gets to interview the candidate, present arguments for or against them, etc.: the democratic process. Yes, withholding consent can be used as a "stall" tactic, but nothing less than unanimous consent should let the Senate act outside the normal democratic process, even if it means shitheads sometimes get to be shitheads.

36