this post was submitted on 28 Feb 2025

Technology

[–] sugar_in_your_tea 1 points 1 week ago (1 children)

At that point you have an actual victim and evidence of harm. If the image depicts an actual person, you run into a ton of other laws that punish such things.

But if the child doesn't actually exist, who exactly is the victim?

Yeah, it would be CSAM if it were real. But it's not real, so it's not CSAM, and therefore there's no victim.

[–] otp 2 points 1 week ago (1 children)

I replied to another comment with a definition from a definitive source. Computer-generated CSAM is the preferred term. Call it CSEM (E for Exploitation) if you prefer.

CSAM/CSEM refers to images/material depicting the sexual Abuse/Exploitation of a child.

AI-generated CSAM means the AI produced images that depict sexual exploitation of children.

You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn't mean anybody was killed. It's not illegal to own images of murder scenes, but it's often illegal to own images of CSEM.

Whether the CSEM being AI-generated is enough to protect you in the eyes of the law for crimes of owning CSEM is something to take up with a legal professional in your particular jurisdiction.

[–] sugar_in_your_tea 2 points 1 week ago

Whether the CSEM being AI-generated is enough to protect you in the eyes of the law for crimes of owning CSEM is something to take up with a legal professional in your particular jurisdiction.

And that's where I take issue. It shouldn't be legal to prosecute someone without a victim.

That doesn't change the law, so your advice is good. But if I'm put on a jury in a case like this, I would vote to nullify.