this post was submitted on 05 Oct 2023
37 points (70.3% liked)

Unpopular Opinion


It seems crazy to me, but I've seen this concept floated on several different posts. There seem to be a number of users here who think there is some way AI-generated CSAM will reduce the number of real-life child victims.

Like the comments on this post here.

https://sh.itjust.works/post/6220815

I find this argument crazy. I don't even know where to begin describing how many ways this could go wrong.

My view (which is apparently not based in fact) is that AI CSAM is not really that different from "actual" CSAM. It will still cause harm when viewed, and it is still based on the further victimization of the children involved.

Further, the (ridiculous) idea that making it legal will somehow reduce the number of predators by giving them an outlet that doesn't involve real, living victims completely ignores the reality of how AI content is created.

Some have compared pedophilia and child sexual assault to a drug addiction, which is dubious at best and pretty offensive, IMO.

Using drugs has no inherent victim. And it is not predatory.

I could go on, but I'm not an expert or a social worker of any kind.

Can anyone link me articles talking about this?

top 50 comments
[–] [email protected] 33 points 1 year ago (3 children)

Boy this sure seems like something that wouldn't be that hard to just... do a study on, publish a paper perhaps? Get peer reviewed?

It's always weird to me when people have super strong opinions on topics you could just resolve by studying them and doing science.

"In my opinion, I think the square root of 7 outta be 3"

Well, I mean, that's nice, but you do know there's a way we can find out what the square root of seven is, right? We can just go look and see what the actual answer is and make an informed decision based on that. Then you don't need to have an "opinion" on the matter because it's been put to rest, and now we can start talking about something more concrete and meaningful... like interpreting the results of our science and figuring out what they mean.

I'd much rather discuss the meaning of the outcomes of a study on, say, AI-generated CSAM's impact on proclivity in child predators, hashing out whether it really indicates an increase or decrease, possible flaws in the study, and what to do with the info.

As opposed to just gesturing and hand-waving about whether it would or wouldn't have an impact. It's pointless to argue about what color the sky oughta be if we can just, you know, open the window and go see what color the sky actually is...

[–] [email protected] 15 points 1 year ago

I love your enthusiasm for research, but if only it were that easy. I'm a PhD researcher and my field is sexual violence. It's really not that easy to just go out and interview child sex offenders about their experiences of perpetration.

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago)

[This comment has been deleted by an automated system]

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

While I agree that studies would help, actually performing those studies has historically been very difficult. Because the first step to doing a study on pedophilia is actually finding a significant enough number of pedophiles who are willing and able to join the study. And that by itself is a tall order.

Then you ask these pedophiles (who are for some reason okay with admitting to the researchers that they are, in fact, pedophiles) to self-report their crimes. And you expect them to be honest? Any statistician will tell you that self-reported data is consistently the least reliable data, and that’s doubly unreliable when you’re basically asking them to give you a confession that could send them to federal prison.

Or maybe you try going the court records/police FOIA request route? Figure out which court cases deal with pedos, then figure out if AI images were part of the evidence? But that has issues of its own, because you're specifically excluding all the pedos who haven't offended or been caught; you're only selecting the ones who have been taken to court, so your entire sample pool is biased. You're also missing any pedos who have sealed records or sealed evidence, which is fairly common.

Maybe you go the anonymous route. Let people self-report via a QR code or anonymous mail. But a single 4chan post could ruin your entire sample pool, and there's nothing to stop bad actors from intentionally tainting your study, because there are plenty of people who would jump at the chance to make pedos look even worse than they already do, to try and get AI CSAM banned.

The harsh reality is that studies haven't been done because there simply isn't a reliable way to gather data while controlling for bias. With pedophilia being taboo, pedophiles will be dissuaded from participating, because it means potentially outing yourself as a pedophile. And at that point, your best-case scenario is having enough money to ghost your entire life.

[–] [email protected] 16 points 1 year ago (2 children)

It's so fucked up that anyone thinks enablement is a genuine means of reduction here...

[–] [email protected] 9 points 1 year ago

Go check out how many downvotes I got on that post I linked.

[–] Hanabie 9 points 1 year ago

The way I see it, and I'm pretty sure this will get downvoted, is that pedophiles will always find new material on the net. Just like with actual, normal porn, people will put it out.

With AI-generated content, at least there's no actual child being abused, and it can feed the need for ever new material without causing harm to any real person.

I find the thought of kiddie porn abhorrent, and I think for every offender who actually assaults kids, there are probably a hundred who get off on porn. But this porn has to come from somewhere, and I'd rather it came from an AI.

What's the alternative, hunt down and eradicate every last closeted pedo on the planet? Unrealistic at best.

[–] mindbleach 8 points 1 year ago (8 children)

There is no such thing as generated CSAM.

That's the entire point of calling it "CSAM."

If you still want those images treated the same - fine. But treating them the same will never make them the same.

It is fundamentally impossible to abuse children who do not exist.

It cannot "further victimize the children involved," because there are no children involved.

We are talking about drawings.

Fancier drawings than with pen and paper, yes - but still made-up images. Renderings. Hallucinations. They depict things that did not actually happen. They can depict persons who have never lived. They can involve characters who are not human. Please stop lumping that together with photographic evidence of child rape.

[–] [email protected] 7 points 1 year ago

I agree with you. I saw people on Twitter talking about this once. Pretty disgusting to even consider.

[–] [email protected] 6 points 1 year ago (15 children)

AI CSAM is not really that different from "actual" CSAM

How do you not see how fucking offensive this is? A drawing is not really different from a REAL LIFE KID being abused?

It will still cause harm when viewed

The same way killing someone in a video game will cause harm?

And it is still based on the further victimization of the children involved.

The made up children? What the hell are you talking about?

Some have compared pedophilia and child sexual assault to a drug addiction

No one sane is saying actually abusing kids is like a drug addiction; you're conflating pedophilia and assault. When pedophilia is compared to a drug addiction, it's non-offending pedophiles that are being discussed. Literally no one thinks assaulting kids is like a drug addiction. That's your own misunderstanding.

Can anyone link me articles talking about this?

About what exactly? There's 0 evidence that drawings or fantasies cause people to assault children.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (2 children)

I'm just gonna put this out here and hope not to end up on a list:

Let's do a thought experiment and empathize with the human being behind the predator. Ultimately they are sick, and they feel needs that cannot be met without doing something abhorrent. That is a pretty fucked-up situation to be in. Which is no excuse to become a predator! But understanding why people act the way they do is important to creating solutions.

Most theories about humans agree that sexual needs are pretty important for self-realization. For the pedophile, this presents two choices: become a monster or never reach self-realization. We have to accept that this dilemma is the root of the problem.

Until now there was only one option for a somewhat middle-of-the-road solution: video and image material that the consumer could rationalize as being not as bad. Note that this isn't my opinion; I agree with the popular opinion that this still harms children and needs to be illegal.

Now, for the first time, there is a chance to cut through this dilemma by introducing a third option: generated content. This still uses existing CSAM as a basis, but so does every database used to find CSAM for prevention and policing. The actual pictures and videos aren't stored in the AI model and don't need to be kept after the model has been created. With that model, more or less infinite new content can be created, which IMO harms the children significantly less directly. This is, IMO, different from the actual CSAM material because no one can tell who is and isn't in the base data.

Another benefit of this approach has to do with the reason CSAM exists in the first place. AFAIK most of this material comes from situations where the child is already being abused. At some point the abuser recognizes that CSAM can get them monetary benefits and/or access to CSAM of other children. This is where I will draw a comparison to addiction, because it's kind of similar: people doing illegal things because they have needs they can't fulfill otherwise. If there were a place to get the "clean" stuff, far fewer people would go to the shady corner dealer.

In the end, I think there is a utilitarian argument to be made here. Even with the far-removed damage that generating CSAM via AI still deals to the actual victims, we could help people not become predators, help predators not reoffend, and most importantly prevent, or at least lessen, the amount of further real CSAM being created.

[–] [email protected] 10 points 1 year ago (2 children)

Except there is a good bit of evidence showing that consuming porn actively changes how we behave around sex. By creating CSAM with AI, you create the depiction of a child that is a mere object for sexual gratification. That fosters a lack of empathy and an egocentric, self-gratifying viewpoint. I think that can be said of all porn, honestly. The more I learn about what porn does to our brains, the more problematic I find it.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

I agree with this.

The more I learn about what porn does to our brains, the more problematic I find it

And I agree with this especially. Turns out a brain that was/is at least in part there to get us to procreate isn't meant to get this itch scratched 24/7.

But to answer your concern, I will draw another comparison with addiction: handing out addictive drugs like candy isn't wise, just as it wouldn't be wise to give everyone access to generated CSAM. You'd need a control mechanism so that only people who need access get it. Admittedly this will deter a few people from getting their fix from the controlled instances compared to completely free access. With drugs this seems to lead to a decrease in the amount of street-sold drugs, though, so I see no reason this wouldn't be true, at least to some extent, for CSAM.

[–] [email protected] 2 points 1 year ago (1 children)

I'm an advocate of safe injection sites, so I will agree somewhat here. Safe injection sites work because they identify addicts and aggressively supply them with resources to counteract the need for the addiction in the first place, all while encouraging less and less use. This is an approach that could have merit for pedophiles, but there are some unique issues that pop up with it as well: to consume a drug, the drug must enter the body somehow, where it is metabolized.

CSAM, on the other hand, is taken in simply by looking at it. There is no "gloves on" approach to generating or handling the content without absorbing it; the best that can be hoped for is to have it generated by someone completely 'immune' to it, which raises questions about how "sexy" they could make the content. If it doesn't "scratch the itch," the addicts will simply turn back to the real stuff.

There is a slim argument to be made that you could actually create MORE pedophiles through classical conditioning by exposing non-pedophilic people to erotic content paired with what looks like children. You could of course have it produced and handled by recovering/in-treatment pedophiles, but that sounds like it defeats the point of limited access entirely and is therefore still bad, at least for the ones in charge of distribution.

Additionally, digital content isn't destroyed upon consumption like a drug, and you have a more minor but still real problem of content diversion, where content made for the program is spread to those not getting the help that was meant to be paired with it. This is an issue, of course, but could be rationalized as worth it so long as at least some pedophiles were being treated.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

Yes, there are a lot of open questions around this, especially about the who and how of generation, and TBH it makes me a bit uncomfortable to think about a system like this in detail, because it would have to include rating these materials on a "sexiness" scale, which feels revolting.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (4 children)

[This comment has been deleted by an automated system]

[–] [email protected] 5 points 1 year ago

How is this an unpopular opinion?

[–] [email protected] 3 points 1 year ago

People like that are pedo apologists, and the fact that they're not being banned from the major Lemmy instances tells us all we need to know about those worthless shitholes.

[–] Gorilladrums 1 points 1 year ago

The way I see it is like this: pedos, like everybody else, have sexual urges, and like most people, they will try to do something about them... but they can't, for obvious reasons. Since the stigma against pedophilia is so great, a lot of them are too afraid to come out and get help. Therefore, they tend to feel repressed. They're in a situation where they can't do what they want and they're too afraid to get help, so they bottle things up until they snap.

When that happens, they tend to snap in one of two ways. The first is seeking out CP online; the second is actually molesting a child. Both of these options are terrible, because the former generates demand for more CP content to be created, and thus puts more children in abusive situations, and the latter is just straight-up child rape. I think a lot of them understand the risk of doing either, but they do it anyway, either because they don't care or because they can't help themselves. However, there might be a solution for at least some of them.

If we assume that a certain portion of pedophiles are too afraid to do anything but too repressed not to, then providing them with a legal outlet could push at least some of them away from harming actual kids. That's where I see AI-generated CP coming in. I see it as an outlet, and I don't think it's anything new either. We've had the lolicon/shotacon shit for a while, which is basically just drawn CP. As disgusting as it is, it hasn't resulted in any major spikes in child sexual assault rates as far as I am aware. Therefore, if fake CP, including AI-generated CP, doesn't use any actual CP to generate the images and it can keep pedos away from kids, then I don't see the issue with it. Sure, it is gross, but we have to remind ourselves that we don't ban things based on how gross they are. The reason we ban CP in the first place isn't that it's gross, but that it actually harms kids mentally and physically. If AI-generated CP can keep any pedos from harming kids, then I see that as a win.
