this post was submitted on 01 Mar 2025
45 points (100.0% liked)
World News
You can't... generate... abuse.
No more than you can generate murder.
The entire point of saying "child abuse images" is to distinguish evidence of rape from, just, drawings.
If you want drawings of this to also be illegal, fine, great, say that. But stop letting people use the language of actual real-world molestation of living human children, when describing some shit a guy made up alone.
How did they train the model? I'd say it's just as problematic if the generator were trained on CSAM.
How can anyone believe these models have a big pile of hyper-illegal go-to-jail images, labeled specifically for text-to-image training?
This technology combines concepts. That's why it's a big fucking deal. Generating a thousand images of Darth Shrektopus riding a horse on the moon does not imply a single example of exactly that. The model knows what children are - the model knows what pornography is. It can satisfy those unrelated concepts, simultaneously.
nootch.
The best punchline would be if you manually Photoshopped this together circa 2012.
And the only clear thing suggesting it's not that is the middle-left tentacle.