Comments

CrucioIsMade4Muggles t1_ja8w47k wrote

Hashes... so someone changes a single pixel and this system doesn't work.

Top notch work Meta. /s

6
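
For what it's worth, the single-pixel objection applies to cryptographic hashes but not to the perceptual hashes these matching systems use (see the PDQ quote further down the thread). A minimal sketch of the difference in Python, assuming Pillow is installed; the file paths are illustrative:

```python
import hashlib

from PIL import Image  # pip install Pillow

def crypto_hash(path):
    """SHA-256 of the raw bytes: a single-pixel edit changes it completely."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def dhash(path, size=8):
    """Difference hash: compares adjacent grayscale pixels after shrinking,
    so a single-pixel edit flips at most a few of the 64 bits."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a, b):
    """Count of differing bits; a small count means 'near duplicate'."""
    return bin(a ^ b).count("1")

# Illustrative paths: an original image and a copy with one pixel changed.
# crypto_hash() would differ entirely, while hamming(dhash(a), dhash(b))
# would typically be 0-2, far below a match threshold such as 10.
```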

Kira9059 t1_ja8zd3x wrote

Nice! This has been needed for a while now.

0

ampjk t1_ja911vh wrote

Is this going to be like the poor people who comb the internet for child porn to remove it, and who commit sudoku at a high rate compared to other moderators at large companies?

1

HanaBothWays t1_ja916ts wrote

I suspected that this would basically work like the tools used to recognize and spike Child Sexual Abuse Material (CSAM) images, and it actually does: it uses the same tools and the same database! This is basically expanding the eligibility criteria for what can go into the database.

Previously if you sent your high school sweetheart a nude selfie and that person did whatever with it, you didn’t have a lot of options, but now you can upload a hash of the picture (not the actual picture) to the database and it will get taken down.

Also, if you are a legal adult now but have nude photos of yourself from when you were a minor floating around, you can upload hashes to the database and have them taken down.

1
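
To make the mechanism concrete: matching against such a database boils down to a nearness test between perceptual hashes, not an exact-bytes comparison. A hedged sketch in Python; the names and the threshold here are illustrative assumptions, not Meta's actual implementation:

```python
# Illustrative sketch; the threshold and function names are assumptions,
# not Meta's actual implementation.

MATCH_THRESHOLD = 10  # max differing bits to still count as the same image

def hamming(a: int, b: int) -> int:
    """Number of bits in which two 64-bit perceptual hashes differ."""
    return bin(a ^ b).count("1")

def is_known_image(upload_hash: int, database: set[int]) -> bool:
    """True if an uploaded image's hash is near any victim-submitted hash."""
    return any(hamming(upload_hash, h) <= MATCH_THRESHOLD for h in database)

# The victim submits only the hash (a number), never the photo itself;
# the platform hashes every new upload and blocks anything that matches.
```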

CrucioIsMade4Muggles t1_ja92h9v wrote

That's good to know. I'm still fairly skeptical. This seems like the literal least-effort approach one could take and still claim to be doing something, to the point where I think they will have spent more money advertising their effort than they spent on developing the system itself.

1

HanaBothWays t1_ja93hzi wrote

This is the same system that’s used to detect and take down Child Sexual Abuse Material (CSAM). It’s been around for years. Meta is just expanding the criteria for what images (or hashes of images) they will use it on.

The CSAM system was not previously used to detect and take down nude photos that teens shared consensually: now, it is, even if the subject of the photo has since become a legal adult.

2

HanaBothWays t1_ja94k72 wrote

We are talking about situations where a minor consented to share an intimate photo with another party on the understanding that the other party would not spread it around in public… and the other party did so anyway.

When this kind of thing happens between adults it’s called “revenge porn” and the person who spread the photo is often subject to civil or criminal liability for doing so.

If you are seriously arguing that someone deserves to have nude photos of themselves as a minor floating around to “teach them a lesson” when having it happen to them as an adult would make them victims of a crime, you probably need to log off for a while.

4

Gerzhus t1_ja95edw wrote

Reposting due to automod.

“Known as PDQ and TMK+PDQF, these technologies are part of a suite of tools we use at Facebook to detect harmful content, and there are other algorithms and implementations available to industry such as pHash, Microsoft’s PhotoDNA, aHash, and dHash. Our photo-matching algorithm, PDQ, owes much inspiration to pHash although was built from the ground up as a distinct algorithm with independent software implementation.”

I don’t know if all of these are open source; some might be proprietary.

Source: Meta/Facebook blog post about how they fight CSAM.

1
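
Of the algorithms named in that quote, aHash is the simplest to illustrate. A minimal sketch in Python, assuming Pillow; this is not PDQ itself, which Meta documents as a distinct 256-bit algorithm:

```python
from PIL import Image  # pip install Pillow

def ahash(path: str, size: int = 8) -> int:
    """Average hash: shrink to 8x8 grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Visually similar images yield hashes differing in only a few bits.
# pHash refines this idea with a DCT over the shrunken image, and PDQ
# builds on pHash with a larger (256-bit) output and a quality metric.
```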

apextek t1_ja9789e wrote

This is so disconnected. Teens don’t use FB or Insta; they’re on TikTok.

0

HanaBothWays t1_ja97j9u wrote

Most people did not read the article at all and don’t realize this is an expansion of the existing CSAM takedown tool that Facebook has had in place for many years. (Most other social media sites have very similar tools.)

2

Arrowtica t1_ja99ou7 wrote

You know what helps prevent that spread more? Not existing.

1

bokbie t1_ja9a3hf wrote

Why can’t the phones just not save an intimate photo of a teen?

−1

HanaBothWays t1_ja9t4ll wrote

Lots of young people use Instagram.

And if you read the article (what a concept LOL), this can be used for photos taken and spread on Facebook a long time ago. If your cad of a high school boyfriend posted the pictures you gave him 15-20 years ago on Facebook, you can send a hash to this thing to have them removed.

5

t0slink t1_jaa7hsp wrote

> peddles videos of animal cruelty, sexual assault, torture and killing?

You realize Reddit has all of that content openly, yeah?

This isn't "skirting" anything, it's a legitimately useful tool for teens.

2