Comments


HanaBothWays t1_jac9ad2 wrote

This is an expansion of the existing tool for removing CSAM, which has been around for a long time.

If you are a teenager and someone spread around the photos you shared with them, or if you’re an adult now and someone spread around nude photos of you from when you were a teen (or you’re worried that they will, as a form of revenge porn), you can upload hashes of those photos to this tool. They will then be detected and removed when someone uploads them, just like known CSAM content.
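Roughly, the flow looks something like this (an illustrative sketch only: the real service may use different hash functions, and the submission call here is hypothetical). The point is that the hash is computed on your own device, so only the hash, never the photo itself, gets uploaded.

```python
# Hypothetical sketch of the client-side step: only the hash leaves the device.
# Uses the open-source `imagehash` library as a stand-in for whatever
# fingerprinting function the real service actually uses.
import imagehash
from PIL import Image

def fingerprint(path: str) -> str:
    """Compute a perceptual hash of the image; the image itself is never uploaded."""
    with Image.open(path) as img:
        return str(imagehash.phash(img))  # e.g. 'c3d4a1b2e5f60718'

# submit_hash(fingerprint("photo.jpg"))  # hypothetical submission call to the service
```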

3

freediverx01 t1_jaceg2s wrote

What mechanisms are in place to prevent people from submitting random photos just because they don’t like porn?

3

HanaBothWays t1_jacf8pt wrote

It doesn’t say.

But the tool, like the CSAM takedown system, is coordinated with the National Center for Missing & Exploited Children (NCMEC), and adult sites like Pornhub and OnlyFans use the CSAM tool, so even if the article doesn’t talk about it, they have something in place to prevent that. If it could easily be gamed into taking down legal, consensually posted pornography featuring adults, Pornhub and OnlyFans would not be using it voluntarily.

3

freediverx01 t1_jaclir0 wrote

This system is ripe for abuse without any outside oversight. All it takes is for that organization to be controlled by a group of reactionary right-wingers, and then they can ban anything they want.

1

HanaBothWays t1_jaco6si wrote

Just because the anti-abuse safeguards are not detailed in the article doesn’t mean they don’t exist.

It doesn’t talk about the specific hash functions used for this thing either but those definitely exist and they are definitely using them.

4

freediverx01 t1_jacrudt wrote

My concern is based on the lack of oversight and transparency of an organization with so much potential power. We have a long history of attacks on civil liberties under the banners of child protection and counter-terrorism.

1

HanaBothWays t1_jacuass wrote

When the big porn sites start having issues with NCMEC and how they curate their databases, I’ll worry about that. But they all seem to get along fine right now, so I’m not worried about it.

2

freediverx01 t1_jadi20h wrote

You’ll have to forgive me, but when the Supreme Court has become so completely politicized and illegitimate, it’s hard to have any faith in the resilience of any other institution.

1

BeardedDragon1917 t1_jacvd1j wrote

Couldn't somebody just alter the photo slightly to completely change its hash?

1

HanaBothWays t1_jacxr6a wrote

There’s a whole class of hashes used for media fingerprinting/detection and they don’t work the same way as the ones you’re probably thinking of (the ones used for digital signatures and tamper detection).
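For example, here’s a rough illustration (using the open-source imagehash library as a stand-in; it isn’t necessarily what these services actually run): a one-pixel edit completely changes a cryptographic hash of the file, but barely moves a perceptual hash of the image.

```python
# Illustrative comparison only, not the exact functions these services use:
# a cryptographic hash changes completely after a tiny edit, while a
# perceptual hash of the edited image stays within a small Hamming distance.
import hashlib
import imagehash
from PIL import Image

original = Image.open("photo.jpg")
edited = original.copy()
edited.putpixel((0, 0), (0, 0, 0))  # alter a single pixel

# Cryptographic hashes of the raw pixel bytes: completely different.
print(hashlib.sha256(original.tobytes()).hexdigest())
print(hashlib.sha256(edited.tobytes()).hexdigest())

# Perceptual hashes: nearly identical.
h1, h2 = imagehash.phash(original), imagehash.phash(edited)
print(h1 - h2)  # '-' is overloaded to give the Hamming distance, typically 0 here
```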

2

pickles55 t1_jacidkh wrote

I really hope this tool doesn’t get expanded for use against all sexually explicit images. Not to be that guy, but invasions of individual privacy and freedom of speech are typically used first against people most would have little sympathy for, like child abusers and violent criminals, before being used on everyone else. The first criminals Obama deported under his new rules were pedophiles, but it wasn’t long before they were deporting grandmothers because they had been receptionists for white-collar criminals. This tool in its present form seems great, but it is very powerful and could do a lot of damage in the wrong hands.

0

HanaBothWays t1_jactbkc wrote

This tool is an expansion of the existing tool used to detect and take down CSAM (Child Sexual Abuse Material). Dedicated adult content sites like OnlyFans and Pornhub also use that tool. They may adopt this expansion as well if it works out on the early-adopter platforms, since they don’t want content involving minors, or anything the subjects of the uploaded media did not consent to, on their sites (it’s against their policies).

Expanding this to filter out any adult content whatsoever would be very difficult because it only works on “known” media, that is, media for which there is a hash already uploaded to the database. These tools can’t recognize “hey, that’s a naked child/teenager” or “hey, that’s a boob.” They can only recognize “that media matches a signature in my database.”
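To make that concrete, the matching step is conceptually something like this (a minimal sketch with a made-up threshold and example hash, not the platforms’ actual code): an upload is flagged only if its hash lands close to a hash that is already in the database of known, reported media.

```python
# Minimal sketch, assuming the `imagehash` library and an invented distance threshold.
import imagehash
from PIL import Image

KNOWN_HASHES = {imagehash.hex_to_hash("c3d4a1b2e5f60718")}  # hashes already submitted to the database
MAX_DISTANCE = 8  # assumed tolerance for minor edits (crops, re-encodes)

def is_known(upload_path: str) -> bool:
    """Flag an upload only if it matches a hash already in the database."""
    h = imagehash.phash(Image.open(upload_path))
    return any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES)

# An image that was never hashed and reported can't match anything here,
# which is why this approach can't flag "new" content on its own.
```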

3