
[–] [email protected] 77 points 9 months ago (4 children)

So what have they been doing to nuke the CSAM images, editing the database directly?

[–] [email protected] 65 points 9 months ago (1 children)

Often just nuking all image uploads made during a certain time period, which is why old image threads on Lemmy have stretches littered with broken images.
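
In practice that tends to be a small admin script against the database and pict-rs, roughly like this (a sketch only; the image_upload table/columns and the pict-rs delete route are assumptions and vary by version):

```python
# Rough sketch: wipe every pict-rs upload made during an attack window.
# Assumptions (verify against your Lemmy/pict-rs versions): Lemmy's Postgres
# has an image_upload table with pictrs_alias / pictrs_delete_token /
# published columns, and pict-rs accepts deletes at
# /image/delete/{delete_token}/{alias}.
import psycopg2
import requests

PICTRS_URL = "http://127.0.0.1:8080"   # wherever pict-rs listens
WINDOW = ("2024-03-04 10:00+00", "2024-03-04 14:00+00")

conn = psycopg2.connect("dbname=lemmy user=lemmy host=localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT pictrs_alias, pictrs_delete_token
        FROM image_upload
        WHERE published BETWEEN %s AND %s
        """,
        WINDOW,
    )
    rows = cur.fetchall()

for alias, token in rows:
    # Every upload gets its own delete token; using it removes the original
    # file and its cached variants from pict-rs.
    resp = requests.delete(f"{PICTRS_URL}/image/delete/{token}/{alias}", timeout=30)
    print(alias, resp.status_code)
```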

[–] [email protected] 21 points 9 months ago (4 children)

I don’t understand why Lemmy needs to have a built-in image server at all. Reddit didn’t have one for the longest time and it was fine. Sure, I don’t think anyone would be particularly happy with going back to Imgur etc., but it doesn’t seem worth the trouble.

[–] [email protected] 8 points 9 months ago

It's a trade-off for us.

You risk CSAM, and have to shoulder the storage costs.

But you also help reduce link rot, since the images are kept on the site rather than on an external image host that might explode or go VC one day.

[–] [email protected] 6 points 9 months ago

Some instances do just disable the image server part (I think lemm.ee used to, and it still only allows small images?)

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

I mean, I don't know why we need images at all; this stuff worked fine when it was just a BBS.

[–] [email protected] 1 points 9 months ago

Uphill both ways.

[–] [email protected] 1 points 9 months ago

They definitely should remove it, at least until moderation tools are available.

[–] candyman337 28 points 9 months ago
[–] [email protected] 19 points 9 months ago

Often they delete all images uploaded during the time frame of a CSAM attack, as that has been the only really feasible way to ensure no images are left behind. Though I think a few instances have started using AI detection methods to remove images like that automatically (read up on that here and here). Also, pict-rs now keeps a log linking uploaded images to the user, so images can now be purged along with the user.
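
With that log, purging one user's uploads can be scripted; a rough sketch (the image_upload table and pict-rs's internal purge endpoint are assumptions here, so check your versions):

```python
# Rough sketch of "purge the user, purge their images": look up the user's
# uploads in the upload log, then ask pict-rs to purge each alias.
# Assumptions: the log lives in Lemmy's image_upload table
# (local_user_id / pictrs_alias), and pict-rs exposes an /internal/purge
# endpoint guarded by its API key.
import psycopg2
import requests

PICTRS_URL = "http://127.0.0.1:8080"
PICTRS_API_KEY = "change-me"        # pict-rs server api_key
BAD_USER_ID = 12345                 # local_user_id being purged

conn = psycopg2.connect("dbname=lemmy user=lemmy host=localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT pictrs_alias FROM image_upload WHERE local_user_id = %s",
        (BAD_USER_ID,),
    )
    aliases = [row[0] for row in cur.fetchall()]

for alias in aliases:
    # An internal purge removes the original plus every processed variant.
    resp = requests.post(
        f"{PICTRS_URL}/internal/purge",
        params={"alias": alias},
        headers={"X-Api-Token": PICTRS_API_KEY},
        timeout=30,
    )
    print(alias, resp.status_code)
```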

[–] [email protected] 9 points 9 months ago* (last edited 9 months ago) (1 children)

Admins can purge posts manually, which actually deletes them, or use tools like db0's lemmy-safety, which tries to automatically scan for CSAM and wipe it.
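
For reference, a manual purge can also be done over the HTTP API instead of clicking through the UI; a rough sketch (endpoint and field names are my reading of the v3 API, so double-check against your instance's version):

```python
# Rough sketch of an admin purge via Lemmy's HTTP API. Unlike a normal
# "remove", the purge endpoints hard-delete the content and its pict-rs files.
# Instance URL, credentials, and post id below are placeholders.
import requests

INSTANCE = "https://lemmy.example.org"
POST_ID = 98765

# Log in as an admin to get a JWT.
login = requests.post(
    f"{INSTANCE}/api/v3/user/login",
    json={"username_or_email": "admin", "password": "hunter2"},
    timeout=30,
).json()

resp = requests.post(
    f"{INSTANCE}/api/v3/admin/purge/post",
    json={"post_id": POST_ID, "reason": "CSAM cleanup"},
    headers={"Authorization": f"Bearer {login['jwt']}"},
    timeout=30,
)
print(resp.status_code, resp.text)
```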

I think the problem here is that the user didn't finish their post, which means the photo was uploaded but never associated with a post, and therefore wasn't purgeable that way.

[–] [email protected] 4 points 9 months ago

That last problem was fixed in an older version of the software. If you upload, but don't post, it will now be deleted after a time.

You can test this pretty easily by just leaving your browser open with an image uploaded and trying to post it later.