DevAnalyzeOperate t1_ja11gdi wrote
Seems so unnecessary. It seems to me any model that's good at producing porn is probably going to be one trained on a bunch of pornographic images, so Midjourney shouldn't even be one's first choice. This is just an exposed-ankle moral panic that people are *gasp* jerking off to AI!
lucimon97 t1_ja12kd1 wrote
Yeah, I'm struggling to understand who is being hurt here. Like, what's the big deal?
DevAnalyzeOperate t1_ja145o5 wrote
One day we start using AI and BANG! The sex industry might not have as much demand for porn actors and nude models! Pimps will go out of business! Won't somebody PLEASE think of the pimps?? How will Andrew Tate be able to make his millions?
lucimon97 t1_ja164qj wrote
Andrew Tate was doing human trafficking too though, so as long as the AI doesn't offer sex slaves, he will be fine.
deaddonkey t1_ja1jv13 wrote
I could be wrong, but I believe it was human trafficking in the sense that he was withholding his camgirls' passports and manipulating them. They were ultimately camgirls, not prostitutes, and that's an industry AI can and probably should replace.
Anyway, fuck, I never want to hear about that irrelevant moron again.
Miguel-odon t1_ja16bid wrote
Here's an educational video
sprkng t1_ja2o2rg wrote
I would guess that they primarily don't want their service to be used for generating fake celebrity nudes and child porn, which might be illegal in some countries despite not being real photos.
Monte924 t1_ja1q78m wrote
Actually, thinking about it, it might not be a moral panic thing. It could be that if people use Midjourney for pornography, that will play a role in its training and could screw up results for everyone else. It's like using Google with SafeSearch turned off: even a perfectly innocent search can still return NSFW results. So they exclude porn while the AI learns, so it doesn't pick up bad habits in the early stages of its training.
Also, the company might not want their AI associated with porn. That's just bad PR.
keylimedragon t1_ja4p2l5 wrote
Models aren't trained on what people search for, though; training usually happens beforehand on a large, fixed dataset (ChatGPT may also use the thumbs up/down feedback for further training). It's more likely they're trying to avoid liability and controversy.
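The distinction here, that training is an offline pass over a fixed dataset while serving a user's query is read-only, can be sketched with a toy example (purely illustrative; `TinyModel` is a made-up stand-in for a real generative model, not how Midjourney or ChatGPT actually work):

```python
class TinyModel:
    """A toy 'model' that just memorizes label frequencies from a fixed dataset."""

    def __init__(self):
        self.counts = {}  # stand-in for learned weights

    def train(self, dataset):
        # Offline training: a single pass over a curated dataset,
        # done before the model is ever exposed to users.
        for label in dataset:
            self.counts[label] = self.counts.get(label, 0) + 1

    def query(self, prompt):
        # Inference is read-only: user prompts never modify the
        # 'weights', so searches can't teach the model bad habits.
        if not self.counts:
            return None
        return max(self.counts, key=self.counts.get)


model = TinyModel()
model.train(["cat", "dog", "cat"])          # happens once, before deployment
weights_before = dict(model.counts)
answer = model.query("anything users type")  # serving a request
weights_after = dict(model.counts)
print(answer)                                # "cat"
print(weights_before == weights_after)       # True: queries change nothing
```

Feedback signals like thumbs up/down can feed a *later* training round (as with RLHF-style fine-tuning), but that is still a separate offline step, not the model updating itself per search.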