You cannot generate CSAM. That's the entire point of calling it CSAM!
CSAM stands for "photographic evidence of child rape." If there's no child - then that didn't happen, did it? Fictional crimes don't tend to be identically illegal, for obvious reasons. Even if talking about murder can rise to the level of a crime - it's not the same crime, because nobody died.
distribution of images of minors fully generated by artificial intelligence
What minors? You're describing victims who are imaginary. Zero children were involved. They don't fucking exist.
We have to treat AI renderings the same way we treat drawings. If you honestly think Bart Simpson porn should be illegal - fine. But say that. Say you don't think photographs of child rape are any worse than or different from a doodle of a bright yellow dick. Say you want any depiction of the concept treated as badly as the real thing.
Because that's what people are doing, when they use the language of child abuse to talk about a render.
Why not use the actual acronym definition?
Child Sexual Abuse Material
It's just as clear that it's about abuse (more than just rape). You can't abuse someone who doesn't exist...
Nevermind that it's right there in the headline - do you know how a joke works?
Where's the joke?
What you're doing is using hyperbole, and it's completely unnecessary to make your point.
I should not have to explain 'hyperbole that's obviously not literally true can be used to exaggerate and underline a point' when that is also the plain text of your comment. What are we doing, here? Are you okay?
I'm just pointing out that your point completely stands without the hyperbole.
If you're saying CSAM = rape, then you open yourself up to a ton of irrelevant arguments proving that it's not rape, when the point you should be defending is that it's abuse. Taking a picture of a kid from a distance isn't rape, but it is abuse. See the difference? The hyperbole just weakens your position, and that's what I'm trying to point out.
You're worried about my deliberately wrong acronym gag... because some sexual abuse isn't quite rape... when cops don't even care whether these children exist.
Pass.
Some AI programs remove clothes from images of clothed people. That could cause harm to a child in the image or to their family.
And the reason it can be called AI-generated CSAM is because the images are depicting something that would be CSAM if it were real. Just like we could say CGI murder victims. Or prosthetic corpses. Prosthetics can't die, so they can't produce corpses. But we can call them prosthetic corpses because they're prosthetics to simulate real corpses.
I'm curious as to why you seem to be defending this so vehemently though. You can call it AI CP if it makes you feel better.
Photoshop can remove the clothes off a child too. Should we ban that and arrest people who use it? What about scissors and tape? You know, the old-fashioned way: take a picture of a child and put the face on someone else's body. Should we arrest people for doing that? This is all a waste of money and resources. Go after actual abusers and save real children instead of imaginary AI-generated 1s and 0s.
Nobody is saying we should ban AI and arrest the people using it.
Should we arrest people who use Photoshop to edit the clothing off of children to produce CSEM? YES! Why is that your defense of this?...
YES! Creating CSEM is illegal in a lot of jurisdictions.
Do you want people doing that for your kids?
Hell, CSEM can make a lot of money. Are you going to do that with your own kids? Help them save up for their education!
Yeah can't imagine why evidence of child rape would deserve special consideration. We only invented the term CSAM, as opposed to CP, specifically to clarify when it is real, versus the make-believe of 'wow, this would be bad, if it wasn't made up.'
Do you think CGI of a dead body is the same as murder?
That'd be bad if it was real! It'd be murder! But it - fucking - isn't, because it's not real.
I must underline, apparently: the entire reason for using the term "child sexual abuse material" is to describe material proving a child was sexually abused. That's kinda fucking important. Right? That's bad in the way that a human person dying is bad. If you treated James Bond killing someone, in a movie, the same way you treated an actual human person dying, people would think you're insane. Yet every fucking headline about these renders uses the same language as if typing words into a program somehow fucked an actual child.
The term “computer (or digitally) generated child sexual abuse material” encompasses all forms of material representing children involved in sexual activities and/or in a sexualised manner, with the particularity that the production of the material does not involve actual contact abuse of real children but is artificially created to appear as if real children were depicted. It includes what is sometimes referred to as “virtual child pornography” as well as “pseudo photographs”.
[...]
There is nothing “virtual” or unreal in the sexualisation of children, and these terms risk undermining the harm that children can suffer from these types of practices or the effect material such as this can have on the cognitive distortions of offenders or potential offenders. Therefore, terms such as “computer-generated child sexual abuse material” appear better suited to this phenomenon [than virtual child pornography].
Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse, section F.4.ii
There's a reputable source for the terminology usage.
If you want to keep defending CG CSAM, take it up with the professionals
"The professionals are also full of shit" is not much of an argument.
I'm going to hold the words of the people who are actually fighting against child exploration in much higher regard than someone who is defending AI-generated CSAM/CSEM. And honestly, I don't understand why you're defending it. It's weirding me out..lol
As I wrote in another comment,
You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn't mean anybody was killed.
That's all this is.
Or that any child was "explored."
I'm fucking disappointed that anyone professionally engaged in this wants to equate damning evidence of physical abuse with generic representation of the concept - for the exact reasons already described.
There is an insurmountable difference between any depiction of a sex crime involving fictional children - and the actual sexual abuse of real living children. Fuck entirely off about throwing aspersions for why this distinction matters. If you don't think child rape is fundamentally worse than some imagined depiction of same - fuck you.
That's not what it is.
Just like AI-generated murder scenes are not being equated to physical evidence of someone having been murdered.
I think you're getting caught up in semantics. Can we at least agree that those AI-generated images are bad?
"Generated sexual abuse" is explicitly being equated to actual child rape.
These 25 people were not charged with thinkin' real hard about the possibility of murder. The sting is described like they were caught doing some murder.
The 25 people charged may have had other incriminating evidence against them.
If you take issue with the law, take it up with the jurisdictions.
If you think it should be perfectly okay for people to produce AI-generated CSEM, then I'm not really sure we can come to an agreement here.
At that point you have an actual victim and evidence of harm. If the image depicts an actual person, you run into a ton of other laws that punish such things.
But if the child doesn't actually exist, who exactly is the victim?
Yeah, it would be CSAM if it were real. But it's not, so it's not CSAM, therefore no victim.
I replied to another comment with a definition from a definitive source. Computer-generated CSAM is the preferred term. Call it CSEM if you prefer. (E = exploitation)
CSAM/CSEM refers to images/material depicting the sexual Abuse/Exploitation of a child.
AI-generated CSAM means the AI produced images that depict sexual exploitation of children.
You can ask the model to generate murder scenes. You then have AI-generated images of murder scenes. That doesn't mean anybody was killed. It's not illegal to own images of murder scenes, but it's often illegal to own CSEM.
Whether the CSEM being AI-generated is enough to protect you in the eye of the law for crimes of owning CSEM is something to take up with a legal professional in your particular jurisdiction.
And that's where I take issue. It shouldn't be legal to prosecute someone without a victim.
That doesn't change the law, so your advice here is good. But if I'm put on a jury in a case like this, I would vote to nullify.