this post was submitted on 08 Oct 2024
351 points (91.5% liked)

Technology


Well, this just got darker.

top 50 comments
[–] [email protected] 30 points 2 days ago (9 children)

Ain't that what the tools are there for? I mean, I don't like CP and I don't want to engage in any way with people who like it. But I use those LLMs to describe fantasies that I wouldn't even talk about with other humans.

As long as they don't do it to real humans, nobody is hurt.

[–] [email protected] 2 points 8 hours ago

As a man who was descending into a dark place when I was a teen, I can say this with confidence:

This kind of content, like CP or r*pe-y stuff, even if clearly not real and only a fantasy, feeds these desires, and makes them grow. In time, if you continue to foster it, they will bleed into real life, and then it becomes a real problem. That's why this kind of stuff is scary.

Thankfully, I was able to spot this pattern before it became a problem. It is a dangerous slippery slope.

[–] [email protected] 7 points 1 day ago* (last edited 1 day ago) (1 children)

As long as they don't do it to real humans, nobody is hurt.

Living out the fantasy of having sex with children (or other harmful sexual practices and fantasies) with AI or the like can strengthen the wish to actually do it in real life. It can weaken the resolve to abstain. If you constantly have fantasies where, for example, "the child AI wanted it too", it can desensitize you and make it harder and harder to push that thought aside when in a tempting situation. Instead of replacing the real thing with a fantasy, you are preparing for the real thing. Some pedophiles already interpret children's behavior as sexual when it isn't at all, but the AI might be told to act that way and strengthen these beliefs.

This is still something that has to be studied more to be fully understood. Of course this is difficult because of the stigma. There might be differences between people who are only attracted to children and those who are attracted to both adults and children, and there is just not enough data yet. But even in the communities where pedophiles who do not act on their attraction discuss coping strategies, this is heavily debated and controversial.

If you are interested in the subject a bit more, this is a start: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8419289/

[–] [email protected] 15 points 2 days ago (1 children)

The problem with AI-generated CP is that if it's legal, it opens a new line of defense for actual CP. You would need to prove the content is not AI-generated to convict real abusers. This is why it can't be made legal; it needs to be prosecuted like real CP to make sure actual abusers are convicted.

[–] [email protected] 21 points 1 day ago* (last edited 1 day ago) (3 children)

This is an incredibly touchy and complicated topic, so I will try not to go much further into it.

But prosecuting what is essentially a work of fiction seems bad.

This is not even a topic new to AI. CP has been widely represented in both written and graphical media, and the consensus in most free countries is not to prosecute it, as it is a work of fiction.

I cannot see why AI-written CP fiction is different from human-written CP fiction.

I suppose "AI big bad" justifies it for some. But for me, if we are going to start prosecuting works of fiction, there should be a logical explanation for why some will be prosecuted and others will not. Especially when the one being prosecuted is just regurgitating the human-written stories about CP that are not being prosecuted today.

I essentially think that a work of fiction should never be prosecuted to begin with, no matter the topic. And I also think that an AI writing about CP is no worse than an actual human doing the same thing.

[–] [email protected] 2 points 1 day ago

I'm not claiming it's legally simple, but the difference is that this new "fiction" is very hard, if not impossible, to distinguish from reality. Nowadays AI can even render a regular human hand.

[–] [email protected] 249 points 3 days ago (12 children)

This isn't surprising, it's inevitable.

If you folks knew how common pedophilic fantasies are amongst the general public, you would be shocked. Just look to cultures like Japan and Russia that don't strongly condemn such things, and you'll find it's about 15% of the population. It's only less in the West because of the near homicidal stigma attached to it that makes people vigorously hide that part of themselves.

Fortunately, this also shows that the vast majority of those people don't offend.

We also tend to define pedophilia as "anything sexual involving a minor" while reacting to it as if it means "violent rape of a toddler". So, no shit, we sexualize youth all the time; the 18-year mark is a legal and social formality, not a hard limit on human attraction. Adults will find themselves attracted to teens, and they won't reveal that, because who the fuck ever would?

If anything, the issue isn't that people have these attractions and fantasies, it is that some portion of those people can't separate fantasy from reality and are willing to hurt a child to get what they want, or they are sociopaths that consume child porn without feeling disgust for witnessing horrific child abuse.

[–] [email protected] 71 points 2 days ago (10 children)

I think the common incest fantasy in the West isn't too far removed from this, either. Like, all the actors are above age minimums, but they pretend to be step-kids or babysitters, as if these roles aren't commonly associated with children and older teens. It's clearly a form of deflection IMO.

[–] [email protected] 5 points 1 day ago* (last edited 1 day ago)

Personally I think the rise in incest porn has to do with the rise in isolation. Lots of people, young men especially, are going out less and less and having more of their social interactions online. As a consequence, for a number of these men, the vast majority of the real-life female interactions they get are with women in their own homes. And biology has a way of adapting, so I think a lot of these men are getting confusing feelings about people in their own homes, due largely to a lack of outside exposure to women.

[–] [email protected] 166 points 3 days ago (35 children)

I actually don't think this is shocking or something that needs to be "investigated." Other than the sketchy website that doesn't secure users' data, that is.

Actual child abuse and grooming happen on social media, chat services, and in local churches, not in a one-on-one between a user and an LLM.

[–] [email protected] 4 points 1 day ago (1 children)

Why tf are there so many people okay with people roleplaying child sexual abuse AT ALL??? Real or fake, KEEP AN EYE ON ALL OF THEM.

I don't care if it's a real child or a fucking bot, that shit is disgusting, and AI is the reason some pedos are able to generate CP of children without having to actually get their hands on children.

The fact someone will look at this and go "Yea, but what about the REAL child rapists, huh??" is astounding. Mfcker, if a grown-ass adult is willing to make a bot that is prompted to act like a horny toddler, then what exactly is stopping them from looking at real children that way?

Keep in mind, I'm not talking about Lolicon, fuck that. I'm talking about people generating images of realistic or semi-realistic children to use as profiles for sex bots. I'm talking about AI. I'VE ACTUALLY SEEN PEOPLE DO THIS; someone actually did this with my character recently. They took the likeness of my character and began generating porn with it using prompts like "Getting fcked on the playground, wet psy, little girl, 6 year old, 2 children on playground, slut..."

Digital or not, this shit still affects people; it affects people like me. These assholes deserve to be investigated for even attempting this kinda shit on the clear net.

And before you ask, the character that belonged to me looks really young because I look really young. I got severe ADHD, which makes me mentally stunted or childish, and that gets reflected in my OCs or fursonas. This person took a persona, an extension of me PERSONALLY, lowered her age on purpose, and made porn of her. That fuckin hurts, dude. Especially after speaking about how close these characters are to me. I'm aware it could be a troll, but honest to god, the prompt they used was demonstrably specific and detailed. Some loser online drawing Kanna's feet hurts me way less than someone using AI to generate faux CP and then roleplay with those same bots or prompts. What hurts me more is that there are no restrictions on some AIs to stop people from generating images like this. I don't wanna see shit like this become commonplace or "fine" to do. Keep tabs on individuals like this, cus they VERY WELL could be using the likeness/faces of REAL children for AI CP, and that's just as bad.

[–] [email protected] 1 points 1 day ago (1 children)

I'm not talking about Lolicon, fuck that.

I think that this is ironic and a poor choice of words. It's almost a pun.

[–] [email protected] 1 points 1 day ago

DAMN- YOU RIGHT 😭

[–] [email protected] 27 points 2 days ago

It's the "burn that witch" reaction.

See how they hate pedophiles and not child rapists.

The crowd wants to feel its power by condemning (and lynching if possible) someone.

I'd rather investigate those calling for "investigation" and for further violations of the privacy of people who, for all we know, have committed no crime.

That's about freedom of speech, yelling "fire" in a crowded theater, and Thousand Hills radio; you know the argument.

[–] [email protected] 14 points 2 days ago (2 children)

If there's one thing I trust chronically online incel creeps to do, it's manipulating online tools to access or create CSAM.

[–] [email protected] 2 points 1 day ago

There's people arguing here that AI-gened CP may not be that bad at all cus it's fictional.

I wanna blow my actual fucking head off, genuinely. This ain't even the realm of lolicon anymore. Just straight up, realistic cheese pizza.

If that's the case, then sharing all that AI Taylor Swift porn should be fine too, cus it's fictional. It may be of a real and public figure, but it's not REALLY her nudes!! Idk man, eughhhhhhh, all this rubs me the wrong way, no pun intended.

Imagine just looking at an online AI gallery and seeing literal AI CP, just out there, public, free to use. It gives me the impression that people only care about child abuse or CP once it involves a real child, not that its general existence is an absolute endangerment to real children if they happen to get caught in the crossfire, and that allowing people to fester in that content as a "coping mechanism" instead of getting help may just normalize shit like this or desensitize people to that kinda content. Like imagine stumbling upon that shit and seeing AI-gened porn of someone who looks exactly like you, adult or child. Even worse if the person who made it knows you. Again, not real art, AI. Idk man, again it all just makes me feel sick and queasy...

[–] [email protected] 1 points 1 day ago

It's a subset of Rule 34.

[–] [email protected] 10 points 2 days ago

Wow really, I never would have guessed /s

[–] [email protected] 31 points 2 days ago

Not at all surprising, but also, it is an AI.
