Hagisman t1_j5l5lmm wrote

This is where a biased data set doesn’t help. In seminars on AI bias, the usual advice is to use a diverse data set in order to draw plausible conclusions.

If you are poisoning the data set by overusing one type of data, such as Instagram models, you’ll want to offset it. Unless, of course, your app has a “turn my picture into an Instagram model picture” feature, in which case you filter your dataset down to just that specific tag.
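That tag-filtering step can be sketched in a few lines. This is a minimal illustration with made-up records and a hypothetical `tags` field, not any particular app’s pipeline:

```python
# Minimal sketch (hypothetical data): keep only the records that
# carry a specific tag, e.g. when training a single-style feature.

def filter_by_tag(dataset, tag):
    """Return only the records whose tag list contains `tag`."""
    return [record for record in dataset if tag in record["tags"]]

dataset = [
    {"file": "a.jpg", "tags": ["instagram_model", "portrait"]},
    {"file": "b.jpg", "tags": ["vacation", "beach"]},
    {"file": "c.jpg", "tags": ["instagram_model"]},
]

subset = filter_by_tag(dataset, "instagram_model")
# subset now holds a.jpg and c.jpg only
```

The flip side of the same idea is the offsetting mentioned above: instead of narrowing to one tag, you would check the tag counts and downsample the overrepresented one.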

−3

Hagisman t1_j5jwj59 wrote

It’s less that you need ugly photos, and more that you need normal ones. If you only use pictures of people on runways, you aren’t getting the people who can’t afford designer clothes.

Similarly, pictures taken only on vacation don’t show the gloom of a funeral.

Diversity helps keep the results diverse.

If you only flooded it with pictures of people wearing sweaters, the system would never know what swimsuits and t-shirts are. 😅
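One cheap way to spot that kind of gap before training is to count label coverage. A minimal sketch, assuming a flat list of hypothetical clothing labels:

```python
from collections import Counter

# Minimal sketch (hypothetical labels): tally how often each
# clothing label appears so missing categories stand out.

labels = ["sweater"] * 5 + ["t-shirt"] * 2  # no "swimsuit" at all

counts = Counter(labels)
expected = {"sweater", "t-shirt", "swimsuit"}
missing = expected - set(counts)  # categories the model would never see
```

Here `missing` comes back as `{"swimsuit"}`, flagging exactly the sweater-only problem: a class with zero examples is one the model can never learn.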

1

Hagisman t1_irpok4j wrote

Reminds me of “Life Sucks,” a graphic novel about a guy who gets turned into a vampire to work the night shift at a 7-11-type store. He proceeds to meet a goth woman who fantasizes about vampires, not realizing the true mundanity of it all. Great read if you can find a copy.

2