this post was submitted on 01 Mar 2025
44 points (97.8% liked)

World News


Police in 20 countries have taken down a network that distributed images of child sexual abuse entirely generated by artificial intelligence.

The operation – which spanned European countries including the United Kingdom, as well as Canada, Australia, and New Zealand – is "one of the first cases" involving AI-generated child sexual abuse material, Europol, Europe's law enforcement agency, which supported the action, said in a press release.

Danish authorities led the operation, which resulted in 25 arrests, 33 house searches and 173 devices being seized.

[–] mindbleach 10 points 1 month ago (2 children)

How can anyone believe these models have a big pile of hyper-illegal go-to-jail images, labeled specifically for word-to-image training?

This technology combines concepts. That's why it's a big fucking deal. Generating a thousand images of Darth Shrektopus riding a horse on the moon does not imply a single example of exactly that. The model knows what children are - the model knows what pornography is. It can satisfy those unrelated concepts simultaneously.

[–] [email protected] 4 points 1 month ago (1 children)
[–] mindbleach 2 points 1 month ago

The best punchline would be if you manually Photoshopped this together circa 2012.

And the only clear sign that it's not that is the middle-left tentacle.

[–] QBertReynolds 1 points 1 month ago

Because that's exactly what happens. The vast majority of AI porn isn't coming from unmodified models. People use LoRAs and their massive porn collections to retrain and fine-tune models to generate their exact fetish. Pedos are definitely doing that too.