this post was submitted on 25 Jul 2023
120 points (84.1% liked)


Not the best news in this report. We need to find ways to do more.

[–] [email protected] 13 points 1 year ago (2 children)

Because it's another "WON'T SOMEONE THINK OF THE CHILDREN" hysteria bait post.

They found 112 images of cp in the whole Fediverse. That's a very small number. We're doing pretty good.

[–] [email protected] 5 points 1 year ago (1 children)

It is not "in the whole fediverse", it is out of approximately 325,000 posts analyzed over a two day period.
And that is just for known images that matched the hash.

Quoting the entire paragraph:

Out of approximately 325,000 posts analyzed over a two day period, we detected
112 instances of known CSAM, as well as 554 instances of content identified as
sexually explicit with highest confidence by Google SafeSearch in posts that also
matched hashtags or keywords commonly used by child exploitation communities.
We also found 713 uses of the top 20 CSAM-related hashtags on the Fediverse
on posts containing media, as well as 1,217 posts containing no media (the text
content of which primarily related to off-site CSAM trading or grooming of minors).
From post metadata, we observed the presence of emerging content categories
including Computer-Generated CSAM (CG-CSAM) as well as Self-Generated CSAM
(SG-CSAM).
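As context for the "known images that matched the hash" point above: detection of known material works by computing a digest of each uploaded image and checking it against a database of hashes of previously identified content. Real systems use robust perceptual hashes (e.g. PhotoDNA) rather than cryptographic ones, so this is only a minimal sketch of the lookup step; the hash set, its contents, and the function names here are all made up for illustration, with SHA-256 standing in for the perceptual hash.

```python
import hashlib

# Hypothetical set of digests of previously identified prohibited
# images, as distributed by a clearinghouse (contents made up:
# this is the SHA-256 of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# An image whose digest is not in the set is not flagged.
print(matches_known_hash(b"harmless example"))  # False
```

Note the limitation this illustrates: an exact cryptographic hash only catches byte-identical copies, which is one reason production scanners use perceptual hashing that tolerates resizing and re-encoding, and why the report's 112 figure covers only *known* material.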

[–] [email protected] 6 points 1 year ago

How are the authors distinguishing between posts made by actual pedophiles and posts by law enforcement agencies known to be operating honeypots?

[–] [email protected] 1 points 1 year ago (1 children)

Still, that number should be zero.

[–] [email protected] 3 points 1 year ago

In an ideal world sense, I agree with you - nobody should abuse children, so media of people abusing children should not exist.

In a practical sense, whether talking about moderation or law enforcement, a rate of zero requires very intrusive measures such as moderators checking every post before others are allowed to see it. There are contexts in which that is appropriate, but I doubt many people would like it for the Fediverse at large.