this post was submitted on 05 Oct 2023
37 points (70.3% liked)

Unpopular Opinion


It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think that AI-generated CSAM will somehow reduce the number of real-life child victims.

Like the comments on this post here.

https://sh.itjust.works/post/6220815

I find this argument crazy. I don't even know where to begin to talk about how many ways this will go wrong.

My views (which are apparently not based in fact) are that AI CSAM is not really that different from "actual" CSAM: it still causes harm when viewed, and it is still based on the further victimization of the children involved.

Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real, living victims completely ignores the reality of how AI content is created.

Some have compared pedophilia and child sexual assault to drug addiction, which is dubious at best and pretty offensive, imo.

Using drugs has no inherent victim, and it is not predatory.

I could go on, but I'm not an expert or a social worker of any kind.

Can anyone link me articles talking about this?

[–] mindbleach 8 points 1 year ago (1 children)

There is no such thing as generated CSAM.

That's the entire point of calling it "CSAM."

If you still want those images treated the same - fine. But treating them the same will never make them the same.

It is fundamentally impossible to abuse children who do not exist.

It cannot "further victimize the children involved," because there are no children involved.

We are talking about drawings.

Fancier drawings than with pen and paper, yes - but still made-up images. Renderings. Hallucinations. They depict things that did not actually happen. They can depict persons who have never lived. They can involve characters who are not human. Please stop lumping that together with photographic evidence of child rape.

[–] [email protected] -3 points 1 year ago (1 children)

So having depictions of cartoon Jews being killed in gas chambers, or cartoon black people enslaved and depicted as such, is not harmful, because the Jews and black people in those cartoons aren't real and so weren't harmed, and thus it is entirely okay to do?

You cannot simply ignore the context in which a thing exists. Historically, culturally, or legally, something like this would be very wrong and very harmful because of the messages it sends. There is a good reason why plenty of countries ban not just actual CSAM but depictions of it as well: it normalizes and trivializes something that is anything but harmless.

[–] mindbleach 4 points 1 year ago (1 children)

Arrest the guy who did Maus, I guess.

Unless you can tell the difference between depicting bad things and endorsing them in real life.

[–] [email protected] 1 points 1 year ago (1 children)

But that is just the thing: depictions of CSAM very often suggest that the victims are enjoying what is happening. This isn't about a caricature or a satire-type art piece trying to hold up a mirror. OP argues against generated CSAM because it is, or would be, used to satisfy urges. In this particular case, the depiction equates to an endorsement. I doubt you can be like, "Hey, get your rocks off to this, but remember, it's bad! But have it anyway; we're not endorsing this, though."

[–] mindbleach 4 points 1 year ago (1 children)

If you think all porn says 'do this in real life,' I have terrifying news about some stuff involving adults.

[–] [email protected] 1 points 1 year ago (1 children)

Adults can consent. Children cannot. Big difference.

[–] mindbleach 3 points 1 year ago (1 children)

Adults can also be raped, and there's plenty of porn depicting that.

[–] [email protected] 1 points 1 year ago (1 children)

Yes, but adults can actually consent to being in such scenes, and there are laws in place aimed at preventing actual rape from occurring (whether or not these laws are effective or effectively enforced isn't the question, and there's probably a lot still to be done to ensure the safety of actresses and actors). Actual crimes of that kind should be prosecuted and aren't okay either. A depiction of CSAM cannot depict any legal scenario at all, ever, because children are incapable of consent. Having depictions that normalize it or suggest that it is okay is harmful.

[–] mindbleach 1 points 1 year ago

"Protecting actors" is real life, not a matter of what's being depicted.

Does pornography depicting rape cause harm?