this post was submitted on 25 Jul 2023
28 points (64.0% liked)

Technology

top 24 comments
[–] [email protected] 66 points 1 year ago (3 children)

https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf

I'd suggest that anyone who cares about the issue take the time to read the actual report, not just drama-oriented news articles about it.

[–] [email protected] 12 points 1 year ago (1 children)

Given new commercial entrants into the Fediverse such as WordPress, Tumblr and Threads, we suggest collaboration among these parties to help bring the trust and safety benefits currently enjoyed by centralized platforms to the wider Fediverse ecosystem

In such a system, the server on which a post originates would submit imagery to PhotoDNA for analysis.

This same technique could also be applied to other hosted media analysis mechanisms (e.g. Google’s SafeSearch or Microsoft’s Analyze Image API).

While large social media providers utilize signals such as browser User-Agent, TLS fingerprint, IP and many other mechanisms to determine whether a previously suspended bad actor is attempting to re-create an account, Mastodon admins have little to work with apart from a user’s IP and e-mail address, both of which are easily fungible.

So basically: people may have joined the Fediverse in large part for privacy reasons, but if the Fediverse is to be "ethical" it should share your images with big tech and track you more thoroughly.

He also laments Tor and E2E messaging.
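For context, the multi-signal re-registration detection the report describes might look roughly like the sketch below. The fingerprinting scheme and every field value here are illustrative only, not anything Mastodon or any large platform actually implements:

```python
import hashlib

def signup_fingerprint(ip: str, user_agent: str, tls_fingerprint: str) -> str:
    """Combine several request signals into one stable fingerprint.

    Real platforms weigh many more signals and score them probabilistically;
    this only shows the idea of matching a new signup against signals seen
    from previously suspended accounts.
    """
    # Keep only the network prefix so a trivial IP change doesn't evade the match.
    ip_prefix = ".".join(ip.split(".")[:3])
    material = "|".join([ip_prefix, user_agent, tls_fingerprint])
    return hashlib.sha256(material.encode()).hexdigest()

# Hypothetical fingerprint recorded when an account was suspended.
banned_fingerprints = {signup_fingerprint("203.0.113.7", "Mozilla/5.0", "ja3:abc123")}

def looks_like_ban_evasion(ip: str, user_agent: str, tls_fingerprint: str) -> bool:
    return signup_fingerprint(ip, user_agent, tls_fingerprint) in banned_fingerprints
```

A signup from 203.0.113.99 with the same browser and TLS signals would still match, since only the /24 prefix is compared; this is exactly the kind of correlation a plain IP-plus-email check can't do.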

[–] [email protected] -1 points 1 year ago* (last edited 1 year ago) (1 children)

Anyone who's on Lemmy for "privacy reasons" is probably not looking very closely at the technology. Everything you do here, including votes and DMs, is effectively public. All of it can be scraped, ingested, processed, etc. by absolutely anyone.

[–] [email protected] 1 points 1 year ago

Votes are federated. They are tied to account names. Only your instance can tie them to your IP.

DMs are insecure in that admin instances can read them. Most instances tell you not to use them.

Scraping is more resource-intensive than having data submitted to you through an API. And since you are then offering a service, you can set terms on what can legally be done with that data, whereas scraping can lead to legal issues, and PR issues as well.

In general, using corporate social media will allow companies to track you (or buy the tracking data from the social media company) far more thoroughly than scraping Lemmy would.

[–] [email protected] 50 points 1 year ago (1 children)

I feel like these are just establishment hit pieces. They do it every time to up-and-coming platforms.

[–] [email protected] 35 points 1 year ago* (last edited 1 year ago) (1 children)

I know enough about internet porn to know that the online-porn communities will love something like the Fediverse, and furthermore, the child-exploitation groups would also love something like this.

But what's surprising to me in this study is that they focused on the top 25 Mastodon servers. They've included the specific keywords they were looking for (y'all know what keywords I mean), and describe a practical methodology involving just hashing files and matching them against known CSAM databases, rather than forcing a human to go through this crap and pick out what they think is or isn't CSAM.

It seems like a good study from Stanford. I think you should at least read the paper discussed before discounting it. We all know that even here on the Lemmy side of the Fediverse, we need to be careful about who to federate with (or defederate from). It's no surprise to me that there are creepos out there on the Internet.


112 hits is pretty small, in the grand scheme of things. But it's also an automated approach that likely didn't catch all the CSAM out there. The automated hits seem to have uncovered specific communities and keywords that could help search for and moderate this stuff, and the report includes some interesting methodologies (e.g. hashed files compared against a known database) that could very well automate the process for a server like Lemmy.world.

I see this as a net-positive study. There's actually a lot of good, important work that was done here.
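The hash-and-match idea mentioned above can be sketched in a few lines. This is a deliberate simplification: PhotoDNA uses perceptual hashes that survive resizing and re-encoding, while the plain cryptographic hash below only catches byte-identical files, and the hash database here is purely hypothetical:

```python
import hashlib

def flag_upload(data: bytes, known_bad_hashes: set[str]) -> bool:
    """Return True if the uploaded bytes hash-match an entry in a known
    database, meaning the post should be held for moderator review
    instead of being federated."""
    return hashlib.sha256(data).hexdigest() in known_bad_hashes
```

A real deployment would call out to a hosted service like PhotoDNA (which supplies and guards the hash database) rather than keeping hashes locally, but the control flow on the instance side is essentially this check at upload time.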

[–] [email protected] 0 points 1 year ago

112 out of 325,000 posts is incredibly small; it's about 0.03% of posts.

[–] [email protected] 38 points 1 year ago

The "report" is issued by something called the Stanford Internet Observatory, which is not in fact a telescope on a hill, but rather an operation by the guy who, from 2015-2018, was the "Chief Security Officer" of Facebook - an ironic title, considering that this was the period of the Cambridge Analytica machination, the Rohingya genocide, and the Russian influence operation that exposed 128 million Facebook users to pro-Trump disinformation.

https://kolektiva.social/@ophiocephalic/110772380949893619

[–] [email protected] 24 points 1 year ago* (last edited 1 year ago) (1 children)

Sounds like they are becoming worried over the growth of these networks and want to convince the general public to stay away.

It's a pretty standard tactic to paint a false picture of something, and they get away with it too. I bet people will now say, "Mastodon, isn't that where there's child porn? No thanks."

[–] [email protected] 11 points 1 year ago (1 children)

It's the eternal strawman against freedom: shut down everything in the name of protecting children.

[–] [email protected] 1 points 1 year ago

Yup, it's easy to see the pattern everywhere now.

[–] [email protected] 20 points 1 year ago

I'm not fully sure about the logic and hinted conclusions here. The internet itself is a network with major CSAM problems (so maybe we shouldn't use it?).

[–] [email protected] 13 points 1 year ago (3 children)

Why would anyone use Mastodon for this stuff? It would be private Telegram groups or something like that. This kind of "research" is barely a step above trolling or low-effort clickbait.

[–] [email protected] 16 points 1 year ago* (last edited 1 year ago)

The researchers are looking at actual posts on actual servers. The research itself is not made up. The speculation and handwaving that the tech press feels the need to introduce into it? That's made up.

As for why would anyone use Mastodon for it — your typical Internet pedophile isn't any smarter than your typical Internet user, and half of those are below average.

[–] [email protected] 2 points 1 year ago (1 children)

While I don't agree with the author's framing of a "major CSAM problem", if you browse an instance's federation list you might notice a few Mastodon instances that suggest they might be MAP communities (domain names like pedo-school, mapsupport, etc., usually with cute dolls in their banner image). They are closed communities, so you can't see what's happening inside.

[–] [email protected] 1 points 1 year ago (1 children)

What does MAP stand for? I'm a bit wary of using a web search to look it up.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

It stands for pedo. "Minor-attracted person."

[–] [email protected] 4 points 1 year ago

The Apache foundation has got a huge child sex problem. They must be policed by Microsoft. /s

[–] [email protected] 1 points 1 year ago

They don't seem to list the instances they trawled (just the top 25 on a random day, with a link to the site they got the ranking from but no list of the actual instances, that I can see).

We performed a two day time-boxed ingest of the local public timelines of the top 25 accessible Mastodon instances as determined by total user count reported by the Fediverse Observer...

That said, most of this seems to come from the Japanese instances which most instances defederate from precisely because of CSAM? From the report:

Since the release of Stable Diffusion 1.5, there has been a steady increase in the prevalence of Computer-Generated CSAM (CG-CSAM) in online forums, with increasing levels of realism. This content is highly prevalent on the Fediverse, primarily on servers within Japanese jurisdiction. While CSAM is illegal in Japan, its laws exclude computer-generated content as well as manga and anime. The difference in laws and server policies between Japan and much of the rest of the world means that communities dedicated to CG-CSAM—along with other illustrations of child sexual abuse—flourish on some Japanese servers, fostering an environment that also brings with it other forms of harm to children. These same primarily Japanese servers were the source of most detected known instances of non-computer-generated CSAM. We found that on one of the largest Mastodon instances in the Fediverse (based in Japan), 11 of the top 20 most commonly used hashtags were related to pedophilia (both in English and Japanese).

Some history for those who don't already know: Mastodon is big in Japan. The reason why is… uncomfortable

I haven't read the report in full yet but it seems to be a perfectly reasonable set of recommendations to improve the ability of moderators to prevent this stuff being posted (beyond defederating from dodgy instances, which most if not all non-dodgy instances already do).

It doesn't seem to address the issue of some instances existing largely so that this sort of stuff can be posted.
