It’s unacceptable.
For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):
Obligatory nitpick: you can't generate CSAM. That's not what "CSAM" means. The entire point of calling it that is to distinguish actual pictures of actual children suffering real-life sexual abuse... versus things that didn't happen.
It's like someone claiming "I'll generate proof of murder." You sure won't, bucko. Whatever gross image you produce - it will never be that. Especially if the guy in the image is still walking around.
Ahaha true!
Clarifying: my copied comment used “generate” to mean “produce”.
In any case, your point is well taken. Something to think about.
Ahh, yeah, I guess that'd be the definition used prior to... all of this.
Though doing that on-demand for ten bucks is a whole different kind of fucked up.