this post was submitted on 03 Nov 2023
286 points (96.1% liked)

Teen boys use AI to make fake nudes of classmates, sparking police probe: Parents told the high school "believed" the deepfake nudes were deleted.

you are viewing a single comment's thread
[–] [email protected] -5 points 10 months ago (3 children)

If you're making porn of real underage people, I have no problem with you being put on the pedo registry.

If no serious harm was done, I'm fine with convicting them and then doing full expungement after 5-10 years.

[–] [email protected] 26 points 10 months ago (2 children)

And you're proof that the pedo registry shouldn't exist as is.

Teenagers being sexually interested in their peers is not pedophilia, and you want to guarantee a ruined decade of their lives, with the """"""promise"""""" of an expungement that would never actually happen, thanks to the permanent nature of the internet.

This misuse of AI is a crime and should be punished and deterred, obviously. But labeling children about to enter the world as pedophiles basically for the rest of their lives?

You're kind of a monster.

[–] [email protected] -1 points 10 months ago (1 children)

What about the fact that the girls who are victims of something like this will have to contend with the pictures being online if someone posts them there? What if people who don't know that the pictures depict minors re-post them to other sites, making them very difficult to remove? That can cause very serious employability problems. It doesn't matter how open-minded people are, they don't want porn coming up if someone googles one of their employees.

[–] [email protected] 4 points 10 months ago

The creation is still a crime, no one said otherwise.

It is just not an act of pedophilia.

[–] [email protected] -5 points 10 months ago (1 children)

If you produce CP, you should be on a registry for producing and distributing CP. If you create CP, you are enabling pedophilia.

[–] [email protected] 6 points 10 months ago (1 children)

They are children. Being horny about classmates.

Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.

[–] [email protected] -4 points 10 months ago (1 children)

Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

[–] [email protected] 3 points 10 months ago* (last edited 10 months ago) (2 children)

So do yearbooks and any other kind of photos that depict children, for that matter.

You can't keep moving the goalposts. By your logic, young people should never date or take photos together, because it could enable pedophiles somewhere, somehow.

These are children with brains still in development. They are discovering themselves, and you want to label them pedophiles forever because they didn't make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah, that's the real threat).

Instead of suggesting a way to help the victims, you are advocating for the creation of yet more victims.

What a pathetic, brain-dead stance you are defending.

[–] [email protected] -2 points 10 months ago (1 children)

A yearbook photo is not porn.

[–] [email protected] 1 points 10 months ago (1 children)

And an AI image with a face photoshopped onto it isn't a photo of a child.

And a teen being sexually interested in other teens isn't a pedophile.

[–] [email protected] 1 points 10 months ago (1 children)

It's still child porn and someone getting off to child porn is a pedophile.

[–] [email protected] 0 points 10 months ago

So, to clarify.

You think two 15-year-olds having sex makes them both pedophiles?

[–] [email protected] 12 points 10 months ago (1 children)

I'd argue that someone making porn of someone their own age is not pedophilia.

[–] [email protected] -3 points 10 months ago

They're still making porn of a minor. That is harmful to them and it enables any pedophiles who find it.

[–] [email protected] 4 points 10 months ago (1 children)

That's an easy enough judgement when the perpetrator is an adult. What do you do when the perpetrator is a minor themselves? As they are in this article.

Of course, there still needs to be some sort of recourse, but for every other crime there is a difference between being tried as a child and being tried as an adult.

I find it tough to consider myself.

[–] [email protected] -1 points 10 months ago

Considering the consequences for a high school student if porn of them gets circulated, I'm fine with putting them on the registry. Expungement can happen later based on the aftermath. Teenage girls have killed themselves over this sort of thing.