Fuck these trolls
Lemmy.World Announcements
"troll" is too mild a word for these people
How about "pedophile"? I mean, they had to have the images to post them.
"Terrorist". Having the images doesn't mean they liked them, they used them to terrorize a whole community though.
Yeah, this isn't just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don't care if you sought out the CSAM, because it still exists on your device regardless of intent.
The laws about possessing CSAM are written in a way that removes any plausible deniability, specifically to prevent pedophiles from being able to go "oh lol a buddy sent that to me as a joke" and getting acquitted. The courts don't care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they'll prosecute you for it.
That's not a troll, CSAM goes well beyond trolling, pedophile would be a more accurate term for them.
Criminals.
I would like to extend my sincerest apologies to all of the users here who liked Lemmyshitpost. I feel like I let the situation grow too out of control before getting help. Don't worry, I am not quitting. I fully intend on staying around. The other two deserted the community, but I won't. DM me if you wish to apply for mod.
Sincerest thanks to the admin team for dealing with this situation. I wish I had linked up with you all earlier.
@[email protected] this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators, this cannot be stopped. Lemmy needs better moderation tools.
Hopefully the devs will take the lesson from this incident and put some better tools together.
There's a Matrix room for building mod tools; maybe we should bring this issue up there, just in case they aren't already aware.
Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod and stepped up and asked for help when you needed to, lemmy just needs better tools. Please take care of yourself.
Contact the FBI
This isn't as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.
This doesn't sound crazy in the least. It sounds like exactly what should be done.
Yeah, what do people think the FBI is for? This isn't crazy. They can get access to ISP logs, VPN provider logs, etc.
This is good advice; I suspect they're outside of the FBI's jurisdiction, but they could also be random idiots, in which case they're random idiots who are about to become registered sex offenders.
They might be, but I'd imagine most countries have laws on the books about this sort of stuff too.
It is seriously sad and awful that people would go this far to derail a community, and it makes me concerned for other communities as well. Since they have succeeded in having Lemmyshitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and what can be done to help curb CSAM.
- The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.
- The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.
- The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.
- Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.
- Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.
- Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

Here are some tips to prevent CSAM:

- Talk to your children about online safety and the dangers of CSAM.
- Teach your children about the importance of keeping their personal information private.
- Monitor your children's online activity.
- Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.
- Report any suspected CSAM to the authorities immediately.
Not that I'm familiar with Rust at all, but... perhaps we need to talk about this.
The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn't seem to be on the developers' roadmap for the time being. There are just two full-time developers on this project, and they seem to have other priorities. No offense to them, but it doesn't inspire much faith in the future of Lemmy.
Let's be productive. What exactly are the moderation features needed, and which would be easiest to implement in the Lemmy source code? Are you talking about a mass ban of users from specific instances? A ban on new accounts from certain instances? What moderation tool, exactly, is needed here?
Speculating:
- Restricting posting from accounts that don't meet some adjustable criteria, e.g. account age, comment count, prior moderation actions, or average comment length (an upvote quota maybe not, because not all instances use votes).
- Automatic hash comparison of uploaded images against a database of registered illegal content.
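The second idea can be sketched in a few lines. This is a toy illustration, not Lemmy's actual code: production systems match against perceptual-hash databases such as PhotoDNA or Meta's PDQ (an exact hash like SHA-256 is trivially evaded by re-encoding the image), and the "known bad" entry below is just the well-known SHA-256 of the empty byte string, standing in for a real database.

```python
import hashlib

# Stand-in for a registered-content hash database (here: SHA-256 of b"").
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_bad(image_bytes: bytes) -> bool:
    """Return True if the upload's SHA-256 matches a registered hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(is_known_bad(b""))         # True (the stand-in entry)
print(is_known_bad(b"cat.jpg"))  # False
```

The real work is in the database and the perceptual hashing, both of which exist as vetted services precisely so that individual admins never have to handle this material themselves.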
On various old-school forums, there's a simple (and automated) system of trust that starts with new users (who might be spammers): every new user's posts might need a manual "approve post" before they show up. (This existed in some Reddit communities too.)
Full powers are then granted to the user eventually (or, in the case of StackOverflow, automated access to the moderation queue).
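That graduated-trust idea is easy to sketch. The thresholds and fields below are invented for illustration; nothing here reflects Lemmy's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int
    approved_posts: int
    removed_posts: int

def post_disposition(account: Account) -> str:
    """Decide whether a post publishes immediately or waits in a mod queue.
    Thresholds are illustrative, not anything Lemmy actually uses."""
    if account.removed_posts > 0:
        return "queue"      # prior moderation action: back to manual review
    if account.age_days < 7 or account.approved_posts < 10:
        return "queue"      # new or unproven account
    return "publish"        # established account: full posting rights

print(post_disposition(Account(age_days=1, approved_posts=0, removed_posts=0)))    # queue
print(post_disposition(Account(age_days=90, approved_posts=50, removed_posts=0)))  # publish
```

The appeal of this design is that it costs attackers time: a throwaway account never reaches "publish" before a human has seen its first posts.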
The amount of people in these comments asking the mods not to cave is bonkers.
This isn't Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don't have any deep understanding of.
This is flat out disgusting. Extremely questionable someone having an arsenal of this crap to spread to begin with. I hope they catch charges.
> There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn't inspire much faith for the future of Lemmy.
This doesn't seem like a respectful comment to make. People have responsibilities; they aren't paid for this. It doesn't seem fair to criticize something when we aren't doing anything to provide a solution. A better comment would be: "there are just two full-time developers on this project and they have other priorities; we are working on increasing the number of full-time developers."
Imagine if you were the owner of a really large computer with CSAM in it. And there is in fact no good way to prevent creeps from putting more into it. And when police come to have a look at your CSAM, you are liable for legal bullshit. Now imagine you had dependents. You would also be well past the point of being respectful.
On that note, the captain db0 has raised an issue on the LemmyNet GitHub repository, requesting essentially the ability to add middleware that checks the nature of uploaded images (issue #3920, if anyone wants to check). Point being, the ball is squarely in their court now.
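The general shape of such middleware is a wrapper that scans an upload before it is ever stored. This is a hedged sketch of that pattern only; the `store`/`scanner` callbacks are hypothetical and this is not the API proposed in issue #3920:

```python
from typing import Callable

class UploadRejected(Exception):
    pass

def with_image_scan(store: Callable[[bytes], str],
                    scanner: Callable[[bytes], bool]) -> Callable[[bytes], str]:
    """Wrap an upload handler so every image is scanned before it is stored.
    `scanner` returns True when the content must be blocked."""
    def handler(image_bytes: bytes) -> str:
        if scanner(image_bytes):
            # Reject before anything touches disk; real middleware would also
            # file the mandated report rather than silently drop the upload.
            raise UploadRejected("upload blocked by content scan")
        return store(image_bytes)
    return handler

# Toy wiring: block anything containing the marker bytes b"BAD".
stored = []
upload = with_image_scan(store=lambda b: (stored.append(b), "ok")[1],
                         scanner=lambda b: b"BAD" in b)
print(upload(b"harmless bytes"))  # ok
```

The key property is that rejection happens before storage, so the instance never technically hosts the material, even transiently.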
I agree with you, I'd just gently suggest that it's borne of what is probably significant upset at having to deal with what they're having to deal with.
I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network, both in AUPH and subscribers. If taking the community down is the only option here, that's an extremely insufficient response and spells death for the platform at the hands of uncontrolled spam.
Fucking bastards. I don't even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.
Please get some legal advice, this is so fucked up.
Genuine question: won't they just move to spamming CSAM in other communities?
With how slow Lemmy moves anyways, it wouldn't be hard to make everything "mod approved" if it's a picture/video.
We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: since we changed our registration policy, they just post from another instance.
It's likely that we'll see a large number of instances switch to whitelist-based federation instead of the current blacklist-based one, especially niche instances that do not want to deal with this at all (and I don't blame them).
Sounds like the 4chan raids of old.
Batten down, report the offenders to the authorities, and then clean up the mess!
Good job so far ^_^
How does closing lemmyshitpost do anything to solve the issue? Isn't it a foregone conclusion that the offenders would just start targeting other communities or was there something unique about lemmyshitpost that made it more susceptible?
Is it possible to (at least temporarily):
- Turn off instance image hosting (disable pictrs)
- Disallow image and video posts across all communities
- As in Firefish, turn off caching of remote images from other instances.
whilst longer-term solutions are sought? This would at least ensure poor mods aren't exposed to this shit, and an instance could be more confident it's not inadvertently hosting CSAM.
Good thing you did it the way you did; nobody should have to look at awful stuff like this. Keep your mind healthy; nobody should have to deal with that.
Thank you so much for all of the effort and time all of you are putting into this situation. Having to deal with bad actors is one thing, but you are now dealing with images that are traumatizing to view.
Please, for your sanity and overall well-being, PLEASE take care of yourself. Yes, it sucks having to close !lemmyshitpost, but self-care and support are of the utmost importance.
I'm afraid the fediverse will need a CrowdSec-like decentralized banning platform: get banned on one platform for this shit, get banned everywhere.
I'm willing to participate in fleshing that out.
Edit: it's just an idea, I do not have all the answers, otherwise I'd be building it.
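To flesh the idea out a little: each instance could publish a feed of accounts it banned for this, and peers could union those feeds into a local denylist, requiring agreement from more than one instance before honoring a ban. Everything below is invented for illustration (instance names, feed format); CrowdSec itself works differently:

```python
# Sketch of pooling ban feeds across instances. A `quorum` of independent
# instances must have banned an account before we honor the ban, which
# guards against a single malicious or compromised feed.
def merge_ban_feeds(feeds: dict[str, set[str]], quorum: int = 1) -> set[str]:
    counts: dict[str, int] = {}
    for banned in feeds.values():
        for account in banned:
            counts[account] = counts.get(account, 0) + 1
    return {account for account, n in counts.items() if n >= quorum}

feeds = {
    "instance-a.example": {"spammer@evil.example"},
    "instance-b.example": {"spammer@evil.example", "disputed@some.example"},
}
print(sorted(merge_ban_feeds(feeds, quorum=2)))  # ['spammer@evil.example']
```

The quorum knob is the interesting design question: too low and one bad feed bans anyone, too high and the shared list reacts too slowly to be useful.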
I assume you've contacted the FBI, but if not PLEASE DO.
Thank you for your work to keep that despicable trash out of our feeds. Sorry you have to deal with it. Fuck those losers.
Thank you for all your work. It sucks that there are people who would do shit like this. Please don't forget to take care of yourselves as well.
Looks like Google has some tooling available that might help: https://protectingchildren.google/tools-for-partners
Probably other options too.