this post was submitted on 19 Sep 2023
74 points (88.5% liked)

Lemmy


There's another round of CSAM attacks and it's really disturbing to see those images. They weren't taken down immediately either, which made it worse. There was even a disgusting shithead in the comments who thought it was funny?? the fuck

It's gone now but it was up for like an hour?? This really ruined my day and now I'm figuring out how to download Tetris. It's really sickening.

all 39 comments
[–] [email protected] 46 points 11 months ago (4 children)

Real talk, Lemmy needs some of the basic-ass moderation tools that Reddit had, so mods can be alerted and can recommend that an admin ban an account or domain.

Sure, there are ways that we can scan uploads with AI and do a bunch of other complex magic, but we need the basics first.

[–] [email protected] 32 points 11 months ago (1 children)

One tool that I liked from Reddit was manually approving posts from accounts under a certain age or karma threshold. I hope we can get tools like that one day.

[–] [email protected] 4 points 11 months ago* (last edited 11 months ago)

There's already the ability to restrict by karma with Lemmy bots, but IMO that would just encourage karma farming, which is probably why nobody has done it yet.

I like the sound of the former approach - it seems like a more effective solution and is similar to what Discourse does (manual approval of posts for new accounts, with an accompanying trust level). In a Lemmy implementation it could be managed or set by each instance.

Edit:clarification
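The approval-queue idea above could be sketched roughly like this; the class names and thresholds are hypothetical, not an existing Lemmy feature:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical gates; real values would be per-instance settings.
MIN_ACCOUNT_AGE = timedelta(days=7)
MIN_KARMA = 25

@dataclass
class Account:
    created: datetime
    karma: int

@dataclass
class ModQueue:
    pending: list = field(default_factory=list)

    def submit(self, account: Account, post: str) -> str:
        """Publish posts from trusted accounts; hold the rest for manual review."""
        age = datetime.utcnow() - account.created
        if age >= MIN_ACCOUNT_AGE and account.karma >= MIN_KARMA:
            return "published"
        self.pending.append(post)
        return "held-for-review"
```

This mirrors Discourse's trust-level idea in miniature: new accounts land in a queue a mod can drain, and each instance could tune the two thresholds to its own risk tolerance.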

[–] [email protected] 8 points 11 months ago (1 children)

Lemmy will need a trust and safety team, but those can be expensive, and it would be an operational challenge for every instance to have experienced people. It would probably work best if there were a T&S collective that instances could elect to use as a resource.

[–] [email protected] 5 points 11 months ago

But before we can even get to that, we need those basic mod tools. A volunteer TS team would need that to be effective.

Can’t address a serious report if you don’t know it exists, and if you aren’t empowered to report bad actors to admins to ban them from an instance.

[–] [email protected] 6 points 11 months ago

Better tools will open the door for instance admins who don't come from a network admin/developer background to responsibly host their communities, too.

For the Lemmyverse to truly thrive, Admins should be relatively free to focus their time on the social elements of running an instance, which is a wholly different skillset from systems administration. Right now, to be an effective Admin you need a heaping helping of both (unless, of course, you're interested in running an unmoderated instance).

[–] [email protected] 4 points 11 months ago

Even with fantastic moderation tools, if one malicious user can take down an entire Lemmy instance, then all is for naught.

[–] [email protected] 28 points 11 months ago (2 children)

AI-generated CSAM will be (or already is) the next big DoS/troll tool; all you can really do is delete and block.

[–] nfjvubpnipdpvvx 25 points 11 months ago

im giving up on the internet

[–] [email protected] 6 points 11 months ago

I mean if it has the potential to kill the value of real CSAM that's kinda a win though... Sure, it's disturbing, but I'd rather people don't actually get abused in order to create such content - which will inevitably happen anyway.

[–] [email protected] 19 points 11 months ago (2 children)

AFAIK, it all falls down on moderators' shoulders. I don't envy their jobs one bit :(

[–] [email protected] 7 points 11 months ago (2 children)

How was it handled on Reddit? Did the moderators have to handle it there as well, or did Reddit filter it out beforehand?

[–] [email protected] 3 points 11 months ago (1 children)

Especially as Lemmy has even worse moderator tools than Reddit (without custom tools), and the devs don't give a shit.

[–] [email protected] 1 points 11 months ago

They really don't care, do they?

[–] RvTV95XBeo 17 points 11 months ago

I think the Lemmy dev team could use some help pushing out more moderation controls, if there are any devs out there who want to make the world a little better.

For starters it would be nice to be able to set up rules like:

- You can't comment for 1 day
- You can't comment links for 1 week
- You can't post until you have X comment karma
- You can't post images / links to non-whitelisted sites until you have mod approval / Y karma / whatever

Toss in a rate limit on posting, and it's not perfect, but it may give mods a little more breathing room. Without adequate tools, I understand why certain instances choose the walled-garden approach.
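Gates like those could compose as an ordered rule table; the rule names, thresholds, and user fields below are illustrative, not Lemmy's actual API:

```python
from datetime import timedelta

# Hypothetical per-community gates; each maps an action name to a
# predicate over a user's account age and comment karma.
RULES = [
    ("comment",       lambda u: u["age"] >= timedelta(days=1)),
    ("comment_links", lambda u: u["age"] >= timedelta(weeks=1)),
    ("post",          lambda u: u["comment_karma"] >= 50),
]

def allowed_actions(user: dict) -> set:
    """Return the set of actions this user is currently allowed to perform."""
    return {name for name, ok in RULES if ok(user)}
```

A brand-new account would get an empty set (everything held back), while an established one passes every gate; a posting rate limit would then sit on top of this as a separate counter.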

[–] [email protected] 13 points 11 months ago (2 children)

Set up CloudFlare’s CSAM scanning tool. It’s completely free. It’s not on lemmy devs to secure your instance. Lemmy devs could add better admin and moderating tools, but it’s better to stop it before it even makes it to your server.
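CloudFlare's tool works by matching uploads against databases of known material using fuzzy hashing. As a rough illustration of the "stop it before it reaches the server" idea (not CloudFlare's actual mechanism), an exact-hash blocklist check might look like:

```python
import hashlib

# Crude stand-in for scanners like CloudFlare's CSAM scanning tool, which
# match perceptual/fuzzy hashes against vetted databases. An exact SHA-256
# blocklist only catches byte-identical re-uploads, but it shows the shape:
# reject before the bytes ever reach pictrs or disk.
BLOCKLIST: set[str] = set()  # hex digests, loaded from a vetted source

def should_reject(upload: bytes) -> bool:
    """Return True if the upload matches a known-bad hash."""
    return hashlib.sha256(upload).hexdigest() in BLOCKLIST
```

Real scanners use perceptual hashes precisely because trivial re-encoding defeats an exact-hash check like this one, which is why delegating to a maintained service beats rolling your own.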

[–] sugar_in_your_tea 13 points 11 months ago (1 children)

Imo, lemmy shouldn't allow image uploads at all. All images should be hosted elsewhere on services that can handle scanning content. This would also drastically cut down on hosting costs for lemmy instances.

If lemmy is to host images, it should merely be as a backup. But since lemmy content isn't easy to search as is anyway, that's not a short term concern. And those images should be archived via mod action imo, not user action.

[–] [email protected] 5 points 11 months ago* (last edited 11 months ago) (1 children)

can't you already run Lemmy without image hosting if you just disable the pictrs service?

there's also a new config option to disable caching of remote images
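For reference, the relevant section of `lemmy.hjson` looks roughly like this; option names have changed across versions (this reflects the ~0.19 config from memory), so verify against the `defaults.hjson` shipped with your release before relying on them:

```hjson
{
  pictrs: {
    # Address of the pictrs service; omit or don't run pictrs at all
    # to drop local image hosting entirely.
    url: "http://pictrs:8080/"
    # "None" stops Lemmy from caching/proxying remote images locally.
    image_mode: "None"
  }
}
```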

[–] sugar_in_your_tea 2 points 11 months ago* (last edited 11 months ago)

disable caching of remote images

I'm not exactly sure how Lemmy works here, but are pictrs images considered "remote," or are they copied between instances? AFAIK, each instance has its own pictrs service, but I'm not sure if that's sent along with the post content when federating messages.

But if lemmy can interact with other instances without storing any non-text data, then perhaps the problem is solved.

[–] [email protected] 2 points 11 months ago

Is there an option to delete a single image from your lemmy instance?

[–] [email protected] 5 points 11 months ago (1 children)

Dumb question, but I'm sure I'm not the only one ... What is CSAM? What does the acronym stand for?

[–] [email protected] 2 points 11 months ago