Submitted by persianphilosopher t3_10qxglr in nottheonion
StinkierPete t1_j6sj4oj wrote
Reply to comment by ZhugeSimp in An AI robot lawyer was set to argue in court. Real lawyers shut it down. : NPR by persianphilosopher
Considering that bias from input data is known with AI, I doubt this.
Melodic-Lecture565 t1_j6spquz wrote
Exactly, there's already a lot of AI used in the "justice" system that is provably extremely biased and destroys people's lives. The companies providing it are protected from having to disclose the programming/algorithms due to patents, and it's seriously fucked up.
If anything, this makes it worse, but that's the plan, I guess: more slaves for American prisons.
JimJalinsky t1_j6tpgcc wrote
Bias mainly exists because everyone ignored the potential for bias in the training data. That's been changing very rapidly lately.
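A minimal sketch of what "checking the training data" can look like in practice (the DataFrame and column names here are hypothetical, not taken from any real system): before fitting anything, audit how the labels are distributed across a sensitive attribute.

```python
# Hypothetical audit: compare positive-label rates across groups in the
# training data before any model is trained on it.
import pandas as pd

df = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B"],
    "label": [1,   0,   1,   0,   0,   1],   # e.g. historical outcome flags
})

# A large gap here will be learned by any model trained on these labels,
# no matter how "objective" the model itself is.
rates = df.groupby("group")["label"].mean()
print(rates)
print("disparity (A - B):", rates["A"] - rates["B"])
```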
ph16053 t1_j6sjoc4 wrote
Is it bias if the data proves it to be true….
StinkierPete t1_j6sksv3 wrote
Then we wouldn't call it a bias, genius
pjnick300 t1_j6tklfo wrote
Okay, let's try this one more time:
If biased humans produce biased data, and an AI trains off of that data, then the AI will be...?
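A minimal illustration of that point, using synthetic data and scikit-learn (nothing here comes from an actual court system): a model fit to labels produced by a biased process picks the bias up as just another pattern.

```python
# Synthetic example: labels are generated by a "biased labeler" that penalizes
# group 1; a classifier trained on those labels reproduces the penalty.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)          # stand-in for a demographic attribute
merit = rng.normal(size=n)             # the thing we actually want to predict

# Biased historical labels: identical merit, but group 1 gets marked down.
label = (merit - 0.8 * group + rng.normal(scale=0.5, size=n) > 0).astype(int)

X = np.column_stack([merit, group])
model = LogisticRegression().fit(X, label)

# At identical merit (0.0), the fitted model scores group 1 lower:
same_merit = np.array([[0.0, 0], [0.0, 1]])
print(model.predict_proba(same_merit)[:, 1])
```

At equal merit the model gives the penalized group a lower score, because that is exactly what the labels taught it.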
----___--___---- t1_j6uaslv wrote
Well... not biased? It's a machine, duh ┐( ∵ )┌
/s
dnaH_notnA t1_j6tlmy2 wrote
If the data is hand selected by fallible human beings with a bias? Yes.