this post was submitted on 29 Mar 2024
341 points (93.4% liked)

A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

[–] brbposting 16 points 10 months ago (1 children)

It’s unacceptable.

We have legal and justice systems to deal with this.

For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

Ill adults and poor kids generate and sell CSAM. It's common to advertise on Instagram and sell on Telegram. Huge problem, as that Stanford report shows.

Telegram got right on it (not). Fuckers.

[–] mindbleach 2 points 10 months ago (1 children)

Obligatory nitpick: you can't generate CSAM. That's not what "CSAM" means. The entire point of calling it that is to distinguish actual pictures of actual children suffering real-life sexual abuse... versus things that didn't happen.

It's like someone claiming "I'll generate proof of murder." You sure won't, bucko. Whatever gross image you produce - it will never be that. Especially if the guy in the image is still walking around.

[–] brbposting 2 points 10 months ago (1 children)

You sure won't, bucko.

Ahaha true!

Clarifying: my copied comment used “generate” to mean “produce”.

In any case your point is well taken. Something to think about.

[–] mindbleach 1 points 10 months ago

Ahh, yeah, I guess that'd be the definition used prior to... all of this.

Though doing that on-demand for ten bucks is a whole different kind of fucked up.