Comments

desperate_coder t1_j8ni535 wrote

The solution is simple, though terrible. Start making and distributing deepfake porn of lawmakers.

324

frakkintoaster t1_j8nzcdf wrote

If you deepfaked some conservative into a video of them banging a porn star, they would just jerk off to it

96

desperate_coder t1_j8o9an0 wrote

That's where the terrible part comes in: you don't deepfake them having pornstar sex, you deepfake lawmakers having sex with each other, or having sex with people they hate and alienate. Stuff that is publicly embarrassing for them.

102

Sensual_Pudding t1_j8ogujk wrote

Wouldn’t be hard to put some in a lemon party video.

21

rammo123 t1_j8q7s7z wrote

Are we 100% sure none of them were in the original?

2

PKCertified t1_j8p35n6 wrote

I seem to recall something about cocaine orgies. It would be a shame if someone found video evidence.

11

skolioban t1_j8pf14m wrote

This. Deepfake them doing the things they are preaching against. It doesn't even need to be porn.

11

conventionalWisdumb t1_j8qfjkq wrote

All the GOP lining up and sucking Obama off would do the trick. With Trump getting the money shot, because you know, he’s into money.

5

Oliver_DeNom t1_j8q9on1 wrote

Deepfake a congressman correctly using a House page's preferred pronouns. That will get the legislation moving.

4

ExtantPlant t1_j8q7wlw wrote

Hmmm. Not sure if I want to see McConnell getting railed by a smoking hot trans lady or not.

1

Admin-12 t1_j8oz3xy wrote

Okay, so get Trump deepfaked into having a gang bang with the whole GQP, and someone include anything to do with LGBT. They'll be talking about it on OANN before lunch.

10

Heklyr t1_j8qer7f wrote

At least I won't be the only one

1

Starkrall t1_j8qjei6 wrote

It should be distasteful enough to threaten their career even with the knowledge that it's fake.

1

TheConboy22 t1_j8qye4v wrote

Nah, I want all GOP members deepfaked into being fucked by Trump

1

askaboutmy____ t1_j8p2ib6 wrote

Ted Cruz would finally get to fulfill his fantasy about Cory Chase!

26

kenghoong t1_j8qo3je wrote

Not if he’s the receiving end from mandingo

7

JCGolf t1_j8p3urb wrote

they’ll just ban deepfakes of politicians but not regular folk

2

Culverin t1_j8q38wp wrote

That sounds like some British parliament sorta shit

1

probono105 t1_j8plgwx wrote

They can't stop it, because if they say that deepfakes can't be trained from publicly available photos, then they're saying things like ChatGPT are a no-go as well.

2

FPSPoonzy t1_j8odnz9 wrote

Same with politicians, both local and national. It'd be the only way, as you said.

1

nobody_smith723 t1_j8p2gv5 wrote

Nancy Pelosi's giant GILF tig ole bitties have already got to exist somewhere

1

Elegant_Tale_3929 t1_j8pblcs wrote

Timing is important, do it during an election year.

1

Unhappy_Gas_4376 t1_j8pi9pp wrote

No, then you can't tell the real sex tapes from the fake ones. It gives them plausible deniability.

2

SvenTropics t1_j8q7opj wrote

Or just get over it. People have been photoshopping celebrity porn for over 20 years, and that's been fine. Have their lives been damaged in any measurable way? Absolutely not. The slippery slope we'd have to get on to create legislation around this is not one I ever want them to go down. Is your life really affected in any substantial way if someone made a digital representation of you getting Eiffel towered? Nope. Everyone knows it's fake. For that matter, you're not going to be the target of this unless you're a celebrity or a politician (basically some kind of public figure). If they start making Barack Obama porn, a lot of women will probably be happy about that.

1

Metallic_Hedgehog t1_j8qpkim wrote

You cannot ban this technology, because you can't enforce a ban. It's essentially like Napster: ban it, and 30 more pop up. It wasn't until streaming made $15 plus convenience a better value than free media plus viruses that piracy fell out of the mainstream.

I think the best solution currently is to require such videos to be labeled, much as sponsored content must be labeled. Granted, this only works for now, while deepfake technology is still detectable.

1

Lemonic_Tutor t1_j8r0be5 wrote

Then we’ll threaten to post the porn online unless the politician fucks a pig on national television

1

Western-Image7125 t1_j8r8hat wrote

The solution is simple, and not terrible. We should start doing this right away

1

hedgetank t1_j8ntlrw wrote

Can't the women being targeted, or anyone for that matter, simply file a copyright on their likeness, and then DMCA/sue any company that hosts infringing content?

43

Movie_Monster t1_j8o3kd4 wrote

I’ll give you the long-winded “not a lawyer” answer, if you don’t like it, take it up with the fake bar association.

If you are making a living by your appearance, like a celebrity, then you can sue for unauthorized use of your likeness. If you are a normal Joe Schmoe, that's not really an argument you can make.

I found this out when I was working for a university and we had a lawyer in our video production department. We used to make anyone on camera sign a waiver; the lawyer said it wasn't necessary unless they were a minor, or there was some other circumstance involving mental competence or awareness of the video production.

But the lawyer was clear that if we filmed with a celebrity we had to get a waiver signed.

Now things are totally different if you are filming someone who is not aware of the camera in a space where you would expect privacy like a bathroom.

Anyone else in public is fair game as far as the law is concerned. While it's creepy, you can definitely film anyone in public. I've heard protestors claim you can't film their faces, or even young girls in public; same shit. No one has any authority to stop you from filming; the law is on your side.

36

Petaris t1_j8oxtum wrote

This varies by country, though, so keep in mind that the rules are different depending on where you are.

For example, in Japan you cannot take a picture, or video, of someone without their explicit permission.

10

hawkwings t1_j8pxpym wrote

>in Japan you cannot take a picture, or video, of someone without their explicit permission.

That would interfere with vacation pictures. You would have to make sure that no one was in the way when you took a picture of a temple or beach.

1

Petaris t1_j8q1bju wrote

Privacy is a big deal in Japan.

The laws are not quite as straightforward as what I mentioned, but it's the safe way to conduct yourself when taking pictures there. There are of course exceptions and qualifications for what is and is not allowed, and some of it is based on how the image may be used.

If you don't believe me you can go look it up for yourself. There is a lot of info out there about it.

That being said, I doubt that a random passerby is going to make a fuss if they end up in your vacation photo by accident.

If, however, you are taking vacation photos of them specifically, like geisha on a picturesque street in Kyoto, you very likely will get in trouble and have the police called on you.

3

railgunsix t1_j8r4ngr wrote

I have a Japan-export iPhone; you can't disable the camera shutter sound. I also have a Korea-export Samsung, although that one allows the shutter sound to be muted.

1

LandoChronus t1_j8q765d wrote

Your post didn't touch on the case (or I didn't understand fully) where the person or party recording will use that recording for profit. I'm thinking of "candid camera" type shows.

Even though they don't need permission to film the random public, do they need permission if they're then going to profit off the footage? Is this why everyone around the "mark" they're messing with has their face blurred, or is that just a CYA thing?

This would be different than just filming someone for a project or whatever that won't be used in a commercial setting.

1

GregoPDX t1_j8qcczq wrote

At least for US productions, on all candid camera shows, anyone who isn't just in the background signed a waiver. On the radio shows where they call someone clandestinely, it's just actors; I don't think there's any legal way to do it otherwise.

1

FallenJoe t1_j8o2g95 wrote

No, copywrite doesn't work that way.

Copywrite protects the product of creative works. If you didn't make something, you don't have a copywrite to it. If someone makes a knockoff Pokemon game using Pokemon characters, they get sued for using a creative work under copywrite without permission.

So the person being deepfaked can't sue for copywrite infringement; the only person who could arguably do so would be the photographer who took the video or image used in the deepfake generation. The person getting photographed only owns the copywrite for the image if the copywrite has been explicitly transferred or sold to them by the person who previously owned it (by default, the creator of the creative work).

There are other laws that might be applicable for the situation, but they're not copywrite laws.

3

nicuramar t1_j8p6buc wrote

It’s copyright. As in, the right to (control) copies.

7

FallenJoe t1_j8pbot6 wrote

Again, no, it doesn't work that way. You don't have any sort of general copyright to your personal appearance, and so someone creating a deepfake of you isn't violating copyright unless (and this is a maybe, because it hasn't been litigated) they used material that did have a valid copyright in the generation of the deepfake. And then they would be violating the copyright of the person that holds the rights to the initial material, not necessarily the person being deepfaked. https://www.upcounsel.com/can-i-trademark-my-face

Copyright isn't a magic wand you can wave around and just go "Oh, it's a deepfake of me, so I'll sue them for copyright." You have to meet very specific standards to have a copyright, and others for it to be infringed. https://www.copyright.gov/comp3/chap300/ch300-copyrightable-authorship.pdf

For example: Works created unknowingly can't receive a copyright: https://www.youtube.com/watch?v=dJX_83mswFA

Pictures taken by nonhuman actors can't receive copyright: https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute

AI-generated art currently isn't eligible for copyright (this may change): https://www.intellectualproperty.law/2022/05/copyright-office-denies-registration-of-computer-generated-art/

1

MoominTheFirst t1_j8n6725 wrote

The comments below this so far are why people think having a Reddit account is a red flag. Go outside.

20

EmbarrassedHelp t1_j8nk1mj wrote

The issue with laws attempting to deal with this is that they will likely be written vaguely enough to harm the art community, and they will place the onus on companies to damage their models instead of targeting the people who use the models for harm.

18

Uristqwerty t1_j8og1o5 wrote

The dataset used to train the model needs to be sourced ethically, just like the supply chain used by a physical manufacturer needs to be audited to ensure a supplier isn't using slave labour in a country too remote to attract much attention over the issue. In this case, I'd say the companies need to either dilute their datasets further, using fewer samples from any given person to the point that AI can't replicate the appearance of a specific person or the style of an artist except by improbable coincidence or extreme genericity, or get consent from each person who (or whose work) appears in the training data.

Though this is about deepfakes, which I think involve users applying additional training material specifically of the target, so that the AI overfits to that specific output. If the original AI was ethically/respectfully produced, then the people responsible for the additional rounds of training ought to be the ones at fault, at least as much as the prompt-writer themselves (assuming they're not the same individual!). For that, the only good solution I can think of is legislation.

−1

Bad_Mood_Larry t1_j8qruwy wrote

>The dataset used to train the model needs to be sourced ethically, just like the supply chain used by a physical manufacturer needs to be audited to ensure a supplier isn't using slave labour in a country too remote to attract much attention over the issue.

Using data that is readily and publicly accessible on the internet (much of it uploaded by people who signed their rights away for it to be collected in a dataset) to train a model is nowhere close to using slave labor; this is a horrible analogy.

2

Uristqwerty t1_j8rsa3r wrote

When it comes to consumer behaviour, people flock to the cheaper product and actively say "I don't care about the supply chain! Give me my cheap phone/AI art" while others keep trying to draw attention to unethical practices. It's a very close parallel. Maybe the harm feels less tangible when spread out over orders of magnitude more people, or when you're so accustomed to abusive ToS conditions giving away your rights, but it's still there.

0

BlackandBlue14 t1_j8ov1x3 wrote

I urge everyone to pause and consider how ubiquitous these sorts of deep fake tools will become over the next decade. There is zero - I repeat, zero - chance of preventing bad actors from creating this sort of content and distributing it online. You MAY be able to stop people from profiting from it. Even so, an unregulated marketplace will persist.

18

Agariculture t1_j8s0ii2 wrote

The exact same thing is true of all human activities. Government bans do not work.

1

AvatarJack t1_j8n9wdy wrote

And here's another drawback to letting old people run our societies. As technology advances, laws aren't keeping up because the people writing laws still think computers are a fad. There really needs to be some sort of independent lobbying group that tries to educate lawmakers about technology and effective methods of regulation (as opposed to the nonsense, uninformed half measures they take today).

7

lordmycal t1_j8nhfw3 wrote

Legislation will always lag behind by its very nature. We just don't go around proactively making new things illegal. There almost always has to be an actual problem that lawmakers are trying to solve, and then it takes a while for laws to be written, lobbied for, and put into effect.

13

-_-_-__-_-_-__-_- t1_j8ncwzt wrote

That's quite a blanket statement. It's funny how Reddit is so hypersensitive to anything that could be remotely interpreted as racist, sexist, homophobic, etc., yet ageism is so widely accepted and even encouraged.

8

AvatarJack t1_j8neqvq wrote

Do I think we should discriminate against people based on their age? No.

Do I think lawmakers should have baseline knowledge about the topics they’re legislating on? Yes.

Do I think old people are on average unprepared to handle a topic as complex as internet regulation? Also yes.

If that’s ageism to you, then I’m sorry. But don’t feel too bad for them because current old people control basically all government in the US (can’t speak for elsewhere) so I think they’ll be fine if someone hurts their feelings.

11

-_-_-__-_-_-__-_- t1_j8nfhy5 wrote

You're assuming age is the issue here, when the real issue is these lawmakers are mostly bought by corporations and are lax on issues that don't benefit their portfolio or get them re-elected. This goes for the young and old politicians alike. I know many older people who keep up with technology or are at least curious to learn.

Edit: Also because this deepfake porn issue affects so few people, it's never going to be moved from the backburner. Creating laws around it won't even create a ripple in the election cycle. This is the real issue: the only job of a politician is to get re-elected. The creation of a new law (even computer related) is not determined by the age of the politician, but the likelihood of it getting said politician elected again, or like I mentioned, getting some sort of financial compensation.

2

throwaway_MT_452298 t1_j8nt0cg wrote

>the real issue is these lawmakers are mostly bought by corporations

So much this!!! I would go so far as to remove "mostly"... It is very hard for an honest politician to get elected to do the will of the people who put them in office. They get put in office to do the will of the $$$ that put them there.

3

throwaway_MT_452298 t1_j8nsrsw wrote

We voted for them, correct? WHY? Also, please define "old" for us... is it over 50, 60, 65, or what is the cutoff? Please, you know, for science.

2

reddit-MT t1_j8o4g6o wrote

The problem isn't age so much as that modern technology is complicated and changes quickly. Judges and lawmakers need to rely on technology experts to explain issues they don't understand. Problem is that those experts are often tied to their respective industries.

2

VegaTDM t1_j8pw6bb wrote

Consent does not apply to drawings of yourself, so why would it apply to AI?

2

FLRUlbts t1_j8qjfl5 wrote

"Deceptively manipulated pornography used the likenesses of Twitch stars without their consent, and now they're calling for more to be done."

Umm, phrasing? Do they really want more?

1

browserleet t1_j8r6x8m wrote

What if it’s stated in the video, clearly, that the videos are not the real person?

It’s still morally wrong, but at least the viewer knows the real person isn’t in the video.

1

DanHassler0 t1_j8q7dm1 wrote

Should laws protect them? It's not them, right?

−3

Gideon_Effect t1_j8q1l9p wrote

Maybe Hunter will use this as a defense.

−4

AustinJG t1_j8oi053 wrote

Just deepfake everyone. Make the pics so ridiculous that they're comical, and so numerous that it becomes pointless for anyone to make more. Let's just rip the collective band-aid off.

−6

Sirmalta t1_j8ofspo wrote

I don't get why anyone cares about this.

It isn't you... people have been doing this with Photoshop for like 25 years.

Why do you care? You can't stop people from picturing it, or drawing it, or painting it, or photoshopping it, so why do people care if a computer puts your face on someone in porn?

−13

OlynykDidntFoulLove t1_j8p9321 wrote

It’s the difference between being able to build a bomb with home materials and being able to order one premade on Amazon. One requires effort and knowledge while the other just needs a credit card. It’s about the barriers to entry being lower and the ability to mass produce.

6

djspacepope t1_j8nzsen wrote

Yup, being a public figure means you allow the public to use your image in any parody, as long as it does not copy the original content exactly. So yeah: don't wanna be publicly fucked, don't be a public figure.

−15

BigZaddyZ3 t1_j8o0b6a wrote

Kind of a stupid take on all this tbh. There’s nothing stopping this tech from being used against non-public figures as well.

13

djspacepope t1_j8o1xr6 wrote

Yup, welcome to the future. Things move fast. I didn't say it was right; I'm just explaining why, at this point in time, it's always been an inherent risk of being a public figure. When it comes to private people, well, I guess they have a new wrinkle to iron out.

Rule 34 and revenge porn existed long before this.

−9

mabris t1_j8phtat wrote

And revenge porn is illegal in many jurisdictions…

6

TheRedGoatAR15 t1_j8mwt5q wrote

I am going to need a link to, uhm, research how 'believable' the porn might be.

Also, if someone told me, "Hey, you're in a porno," my first reaction would NOT be, "Wait, am I in this?"

Unless, of course, there was the real possibility that I WAS in a pr0n....

−26

very_bad_programmer t1_j8ntoh4 wrote

This is about what I'd expect from someone who browses Jordan Peterson, Crowder, and conservative subs

11

Sirmalta t1_j8ogee5 wrote

Leftist here. The only reason I don't watch deepfakes is that they're stupid. Show me a video of the actual person fucking, and I'm in. Deepfakes are just dumb fan art.

I don't care about them, and frankly neither should anyone else. It's lame, not sexy, and too dumb to be offensive.

3