Submitted by cos t3_10jccen in technology
Comments
kerkula t1_j5jir1c wrote
Just a guess, but maybe it was a bunch of males who developed this AI. As my wife always says: you hire a turkey farmer, you get a bunch of turkeys. In this case, you hire a lot of young men with computer skills, raging hormones, and poor social skills, you get porn.
Kafke t1_j5jiw8e wrote
Never had this issue with stable diffusion. Perhaps stop using apps by corporations? I just finetuned a model on my own pics, then was able to generate stylized pics just fine.
It's not AI's fault. It's the company's fault.
kerkula t1_j5jj4gq wrote
really? you are going to downvote the one honest reply? Proves my point
Hagisman t1_j5jkqjb wrote
Seems like the data being fed to it is biased. Got to put in more realistic photos and artwork, not just stuff that’s “supermodels”.
I wonder where they got their data set from.
aquarain t1_j5jkrid wrote
Do they charge for this service?
Erriis t1_j5jlqd6 wrote
me when only antisocial adolescent male programmers are horny
dumb_password_loser t1_j5jmrld wrote
The model was trained on images scraped from the internet.
Go on any art site and type in "woman" or "man" and you will see a difference.
And I'd argue that it isn't 100% males who are the cause of this.
I stopped using facebook and similar social media a while ago, but from what I remember, except for one or two gym bros it was mostly women who posted pictures of themselves scantily clad.
They won't upload pictures of themselves that they don't like for their friends and family to see.
People post what they like or find interesting on the internet and that's what AI's learn.
LilShaver t1_j5jmthb wrote
I notice she didn't supply the breakdown on the generated avatars NOT used by her colleagues.
So this is not exactly an objective comparison.
Having said that, let me just add the "The internet is for porn" song
zdakat t1_j5jo896 wrote
stimulate your senses
Archany_101 t1_j5jobbb wrote
"honest"
Source: your ass
Sin3Killer t1_j5jofvk wrote
The article states that the Lensa app uses Stable Diffusion which uses the LAION-5B open source data set.
kerkula t1_j5jop2r wrote
struck a nerve didn't I?
Tidus1117 t1_j5jpmqk wrote
I'm a male and I got at least 20 shirtless photos in the results... so I guess I was objectified too?
kiwibutter088 t1_j5jqqms wrote
I used this app to make AI photos of myself and of my husband. It was an obvious difference.
EmmaNoir12 t1_j5jry02 wrote
A lot of free time to think up all of this bs, right?
Carioca1970 t1_j5js2v3 wrote
Is this a tutorial? Because try as I might, the AI refuses to sexually objectify me.
Cakeking7878 t1_j5jstgd wrote
I think to make the AI model less biased, you really need photos of everyone from every angle. People aren't making hyper-realistic artwork or photos of people who aren't conventionally attractive. In a way, you need bad/ugly photos of people, which most people aren't taking and posting to the internet
cWayland t1_j5jt6ee wrote
Yes you were
kevin379721 t1_j5jtzqf wrote
Yea I mean idk how this isn’t the top response. Where else do people think it’s getting the data? Lol
Hagisman t1_j5jwj59 wrote
It's less that you need ugly photos and more that you need normal ones. If you only use pictures of people on runways, you aren't getting the people who can't afford designer clothes.
Similarly pictures taken only during vacation don’t show the gloom of a funeral.
Diversity helps keep the results diverse.
If you only flooded it with pictures of people wearing sweaters the system would never know what swimsuits and t-shirts are. 😅
hectorgarabit t1_j5jxswl wrote
I think the goal of this model is to create an avatar. Basically, a drawing that makes you look better. If you add "average" or not "conventionally attractive" people, the AI will create an avatar based on you + some average or ugly-looking person. Does anyone really want an avatar that makes them look uglier? Or just as average? Would you rather have an avatar of you as a cosmonaut or as an accountant?
EDIT: just thought about this:
Also, if you look at instagram or facebook, many if not most of the most popular profiles/photos are very sexualized. Monkey see, monkey do.... AI see, AI do!
Prudent_Possession64 t1_j5k5i0s wrote
ABS = Anti-lock Brakes (likely not working). Car symbol with lines = traction control out. Owl eye symbol = means an owl is watching you
FluxChiller t1_j5k9fis wrote
THEY ARE TAKING OUR JOBS!! /s
Jawnze5 t1_j5kcoav wrote
Can you really be objectified if it's pulling data from a preselected collection of data? It's basing it off of what already exists out in the world. So maybe the problem is what is common in society, not the AI. AI isn't biased; it just uses what is available to it.
scotchdouble t1_j5kjcup wrote
AI is almost always biased because the data sets are made by humans. I have an old colleague who has been working on AI for screening new hires, and it is a classic example of "feed in the CVs/resumes of hires you want", but the issue is that those traits, their education, etc. are predominantly white, so they have been working backwards to adjust the data set and remove anything that could be biased... and this is virtually impossible.
km89 t1_j5kjd17 wrote
>AI isn't biased, it just uses what is available to it.
I mean, that's a little convoluted.
The bias in the dataset creates a bias in the AI. Remember that AIs aren't necessarily looking at the data set every time they need to create something; they're trained on the data set, but the model itself isn't referencing the dataset when it's not being trained.
So yeah--the initial bias definitely comes from the people who select the training data. But the bias persists in the absence of that data, too.
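[Editorial illustration: the point above can be shown with a toy "model" that is nothing like Lensa's actual pipeline; the subjects and tags are made up. "Training" just counts (subject, clothing) pairs in a skewed caption set, the dataset is then deleted, and the skew still drives what gets generated.]

```python
from collections import Counter

def train(captions):
    # "Training": count (subject, clothing) pairs.
    # The counts are the model's only parameters.
    model = Counter()
    for caption in captions:
        subject, clothing = caption.split()
        model[(subject, clothing)] += 1
    return model

# A skewed training set: "woman" co-occurs with "swimsuit" 4x more often.
dataset = (["woman swimsuit"] * 8 + ["woman suit"] * 2 +
           ["man suit"] * 8 + ["man swimsuit"] * 2)
model = train(dataset)
del dataset  # the model never touches the data again after training

def generate(model, subject):
    # "Inference": pick the most frequent clothing for the subject.
    pairs = {c: n for (s, c), n in model.items() if s == subject}
    return max(pairs, key=pairs.get)

print(generate(model, "woman"))  # swimsuit - the dataset skew persists
print(generate(model, "man"))    # suit
```

The dataset is gone by the time `generate` runs; the bias lives entirely in the trained parameters, which is the commenter's point.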
LilShaver t1_j5kromy wrote
Would you be willing to provide numeric values, as were provided by the woman in the article? i.e. 16/100 images generated involved nudity.
Words_Are_Hrad t1_j5ks18d wrote
Breaking news AI trained on human data objectifies humans in the same way humans do! Clearly this is a problem with the AI!
SvenTropics t1_j5kxaor wrote
This is just clickbait. No reasonable person who chose to send a bunch of photos, and pay money, to an AI that generates images of them and privately returns them to post, delete, or whatever, genuinely has a problem with a couple of provocative photos coming back. Yes, maybe a few unreasonable people do, but we don't need to bend reality for unreasonable people.
Hagisman t1_j5l5lmm wrote
This is where a biased data set doesn't help. In seminars on AI biases you usually want a diverse data set in order to draw plausible conclusions.
If one type of data, such as Instagram models, is overrepresented and poisoning the data set, you'll want to offset it. Unless of course your app has a "turn my picture into an Instagram model picture" feature, in which case you filter the dataset for just that specific tag.
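[Editorial illustration: one common way to offset an overrepresented category, as described above, is to downsample it to a per-tag cap before training. This is a minimal sketch with made-up tags and image IDs, not any app's real pipeline.]

```python
import random
from collections import defaultdict

def rebalance(samples, cap, seed=0):
    """Downsample so that no tag contributes more than `cap` examples.
    `samples` is a list of (image_id, tag) pairs; tags are hypothetical."""
    by_tag = defaultdict(list)
    for image_id, tag in samples:
        by_tag[tag].append(image_id)
    rng = random.Random(seed)  # fixed seed for reproducibility
    balanced = []
    for tag, ids in by_tag.items():
        keep = ids if len(ids) <= cap else rng.sample(ids, cap)
        balanced.extend((image_id, tag) for image_id in keep)
    return balanced

# 900 "instagram_model" shots swamp 100 "candid" ones.
data = ([(f"img{i}", "instagram_model") for i in range(900)] +
        [(f"img{i}", "candid") for i in range(900, 1000)])
balanced = rebalance(data, cap=100)

counts = defaultdict(int)
for _, tag in balanced:
    counts[tag] += 1
print(dict(counts))  # {'instagram_model': 100, 'candid': 100}
```

Capping is the bluntest fix; in practice people also reweight the loss per category or oversample the rare tags instead of throwing data away.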
_tsi_ t1_j5l5s1j wrote
Hell yeah, sign me up.
kiwibutter088 t1_j5l9wwf wrote
Sure, but what criteria do I count with? Pictures that appear as though there is no shirt? Cleavage? Form-fitting clothing? There's one where it looks like the body is painted silver; do I count that? It's pretty subjective and I want to make sure I give you the answer you are looking for.
kiwibutter088 t1_j5lb0yb wrote
18 pictures appear to have no shirt
25 additional pictures with cleavage
1 photo with no shirt and cleavage (hair covering nipples)
1 silver "painted body"
That's out of 100.
For my husband:
1 deep-V robe
2 pairs of tight pants - one is armor with a questionable codpiece, the other is skinny jeans
Out of 50.
maxstep t1_j5ljvc2 wrote
No, you're delusional and dangerous
sis-n-pups t1_j5lo8kt wrote
"bad/ugly " ... i think the word you're looking for there is realistic
AstroBoy26_ t1_j5mkayx wrote
Oh my god just live your life who gives a shit
Sardonislamir t1_j5mo2do wrote
Not the top result for sure, more like the topless results...
littleMAS t1_j5n6c0d wrote
What happens when you sexually objectify AI? Does AI call DoNotPay to sue you?
whatsupdude0211 t1_j5nvazc wrote
What’s wrong with only wanting to date Asian women?
typesett t1_j5pyibv wrote
rofl if AI put dad bods on people, i wonder what those articles would be like
or would people just never use it lol
typesett t1_j5q11ma wrote
i think you described a new job that will exist later this year
AI art jockey that makes you look good for ego
___
as i typed this i realized it already exists, but i suppose the infrastructure of the industry is being built as we speak
SvenTropics t1_j5qavz0 wrote
I mean, if I ever end up on a dating app again, I'll probably use one or two of my Lensa photos.
IG_Rapahango t1_j5jicin wrote
well, it's not a surprise; AI is fed data provided by humanity