DevAnalyzeOperate t1_ja11gdi wrote

Seems so unnecessary. It seems to me any model that's good at producing porn is probably going to be one trained on a bunch of pornographic images, so Midjourney shouldn't even be one's first choice. This is just an exposed-ankle moral panic that people are *gasp* jerking off to AI!

74

lucimon97 t1_ja12kd1 wrote

Yeah, I'm struggling to understand who is being hurt here. Like, what's the big deal?

41

DevAnalyzeOperate t1_ja145o5 wrote

One day we start using AI and BANG! The sex industry might not have as much demand for porn actors and nude models! Pimps will go out of business! Won't somebody PLEASE think of the pimps?? How will Andrew Tate be able to make his millions?

20

lucimon97 t1_ja164qj wrote

Andrew Tate was doing human trafficking too though, so as long as the AI doesn't offer sex slaves, he'll be fine.

3

deaddonkey t1_ja1jv13 wrote

I could be wrong, but I believe it was human trafficking in the sense that he was withholding his camgirls' passports and manipulating them. But they were ultimately camgirls and not prostitutes - this is an industry AI can and probably should replace

Anyway fuck I never want to hear about that irrelevant moron again

5

sprkng t1_ja2o2rg wrote

I would guess that they primarily don't want their service to be used for generating fake celebrity nudes and child porn, which might be illegal in some countries despite not being real photos.

5

Monte924 t1_ja1q78m wrote

Actually, thinking about it, it might not be a moral panic thing. It could be that if people use Midjourney for pornography, that will play a role in its training and could screw up results for others. It's like using Google with SafeSearch turned off; even a perfectly innocent search can still return NSFW results. So they exclude the porn prompts while the AI learns, so that it doesn't pick up bad habits in the early stages of its training

Also, the company might not want its AI to get associated with porn. That's just not good for PR

6

keylimedragon t1_ja4p2l5 wrote

Models aren't trained based on what people search, though; training is usually done beforehand on a large set of training data (ChatGPT might also use the thumbs up/down for training). It's more likely they're trying to avoid liability and controversy.

2