mindbleach 6 points 1 week ago

You cannot generate CSAM. That's the entire point of calling it CSAM!

CSAM stands for "photographic evidence of child rape." If there's no child - then that didn't happen, did it? Fictional crimes don't tend to be identically illegal, for obvious reasons. Even if talking about murder can rise to the level of a crime - it's not the same crime, because nobody died.

distribution of images of minors fully generated by artificial intelligence

What minors? You're describing victims who are imaginary. Zero children were involved. They don't fucking exist.

We have to treat AI renderings the same way we treat drawings. If you honestly think Bart Simpson porn should be illegal - fine. But say that. Say you don't think photographs of child rape are any worse than or different from a doodle of a bright yellow dick. Say you want any depiction of the concept treated as badly as the real thing.

Because that's what people are doing, when they use the language of child abuse to talk about a render.

otp -3 points 1 week ago

Some AI programs remove clothes from images of clothed people. That could cause harm to a child in the image or to their family.

And the reason it can be called AI-generated CSAM is that the images depict something that would be CSAM if it were real. Just like we can say "CGI murder victims" or "prosthetic corpses." Prosthetics can't die, so they can't produce corpses, but we call them prosthetic corpses because they're prosthetics that simulate real corpses.

I'm curious as to why you seem to be defending this so vehemently though. You can call it AI CP if it makes you feel better.

Eezyville 6 points 1 week ago

Photoshop can remove the clothes off a child too. Should we ban that and arrest people that use it? What about scissors and tape? You know, the old-fashioned way. Take a picture of a child and put the face on someone else's body. Should we arrest people for doing that? This is all a waste of money and resources. Go after actual abusers and save real children instead of imaginary AI-generated 1s and 0s.

otp 0 points 6 days ago

Should we ban that and arrest people that use it?

Nobody is saying we should ban AI and arrest the people using it.

Should we arrest people who use Photoshop to edit the clothing off of children to produce CSEM? YES! Why is that your defense of this?...

Take a picture of a child and put the face on someone else's body. Should we arrest people for doing that?

YES! Creating CSEM is illegal in a lot of jurisdictions.

Do you want people doing that to your kids?

Hell, CSEM can make a lot of money. Are you going to do that with your own kids? Help them save up for their education!
