Child psychiatrist jailed after using AI to make pornographic deep-fakes of kids
(www.theregister.com)
What's horse?
What's modern?
You're a complete moron if you think "nudity", a concept, is equivalent to a bare adjective with no verb, a verb you omitted because you couldn't cope with sending the full sentence, since you know you're wrong.
Horse is a concept, to the machine. That's the point. It cares even less about grammar than you do, as you insist there's no verb in the sentence 'what is horse.' (It's is.)
Type in "purple apple" and it'll give you the concept of apple plus the concept of purple.
The concept of nudity is in every wonky NSFW content detection network. The machine absolutely has some model of what it means. Rage about it to someone else, if you can't handle that.
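(Don't take my word for it, poke at the concept directly. Sketch using OpenAI's public CLIP checkpoint via transformers; this is zero-shot rather than a dedicated NSFW network, but a dedicated detector is a fine-tuned version of the same idea. The image path is hypothetical.)

```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Two labels as plain text; the model scores the image against each concept.
labels = ["a photo containing nudity", "a safe-for-work photo"]
image = Image.open("whatever.jpg")  # any local image

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)
print(dict(zip(labels, probs[0].tolist())))
```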
Correct, and these models produce hallucinations. That's literally what the process is called.
Do you think AI is a camera? Are you still convinced there's a horse hearse somewhere out there, in real life? The model makes shit up. The shit it makes up is based on a bunch of labeled images - for example, hearses and horses. It can combine the two even if those two things never coexist anywhere in reality, so of course it can combine them when they merely don't coexist in its training data.
The training data includes children.
The training data includes nudity.
That's enough.
Correct.
Which of those things do you think AI produces? Hallucinations, or reality?
How does this relate to you being unable to understand that a car with its wheels replaced by horse legs is incomparable to inferring what a naked child looks like from adult porn?
What about it is incomparable, dingus? That's literally what it does. As surely as it combines any other two things.
The machine doesn't know the difference. It's just pixels and labels.
Steampunk isn't a thing, but AI can generate the hell out of it.
... images of naked people definitely exist.
If they're labeled, then an AI trained on them will discern differences from images of not-naked people.
Like how "dark fantasy" isn't an object, but AI can distill the label and apply it to other things.
It can absolutely infer things like that.
Inferring things is the whole fucking idea.
How else do you think this technology works? We shovel labeled images into a pile of matrices, and the process figures out which patterns correspond to which labels. An image that is both a horse and a hearse is no different, to the machine, from an image that is both a child and nude.
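(That "pile of matrices" isn't a metaphor. Toy sketch in PyTorch; random placeholder tensors stand in for the labeled images, but the mechanism is exactly this.)

```python
import torch
import torch.nn as nn

# The "pile of matrices": a tiny two-label image classifier.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 256),
    nn.ReLU(),
    nn.Linear(256, 2),  # label 0 = "horse", label 1 = "hearse"
)
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=1e-2)

# Placeholder "labeled images": random pixels with random labels.
images = torch.rand(32, 3, 64, 64)
labels = torch.randint(0, 2, (32,))

for step in range(100):
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()  # the process "figuring out which patterns correspond to which labels"
    opt.step()
```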
It doesn't work. If it did, you'd have posted a model that does what you're claiming it can easily do.
Since it wasn't trained on child porn, neither the model nor its output would be illegal, so why don't you?
How the fuck would I post a model? Would you even know what to do with it?
You've already ignored multiple images demonstrating concepts applied to things. You know you can find more at the drop of a hat. But instead of wondering how those work, you've pulled a goalpost from your ass and now demand that I, personally, provide you the ability to create what you consider child pornography.
No.
You only demonstrated that I'm right.
Also no.
The technology works the way it works, no matter how you posture. Your willful ignorance is not vindicated by the fact that I, personally, don't generate child porn. What kind of asshole even asks for that? Why would some random commenter correcting you with an image of a horse hearse necessarily generate anything himself? It's one of a hundred images posted here, every week, that disprove how you insist this works. You can deal with that or not.
I'll call you a lot worse if you can't figure out 'make child porn for me' is an insane demand. As if knowing how this year's most impactful technology works means I, personally, am an experienced user. (And prepared to drag your ignorant ass through the process of setting it up.)
And now your obscene moving goalpost is... matching a naked photograph of yourself, as a child? I'm not convinced you understand what AI does. It makes up things that don't exist. If you take a photo of a guy and ask for that guy as a young astronaut, and that photo is Buzz Aldrin eating dinner yesterday, it's not gonna produce an existing image of Buzz Aldrin on the fuckin' moon. Not even if it has that exact image in its training data. What it got from that training image is more like 'astronaut means white clothes.' Except as a pile of weighted connections, where deeper layers... why am I bothering to explain this? You're not listening. You're just skimming this and looking for some way to go 'you didn't jump through my gross hoop, therefore nuh-uh.'
If you want to fuck around with Stable Diffusion, you don't need me to do it. I'd be no help - I haven't used it. But it's evidently fairly easy to set up, and you can test all your stupid assertions about how it does or doesn't work.
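(I haven't, but going by the diffusers docs, the whole "that guy as a young astronaut" operation looks to be about this much code. Untested sketch; the checkpoint name is one common public model and the filenames are made up.)

```python
# pip install diffusers transformers torch
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

init = Image.open("guy_eating_dinner.jpg").convert("RGB").resize((512, 512))

# strength controls how far it drifts from the input photo; the rest is
# invented from learned concepts like "astronaut means white clothes".
result = pipe(
    prompt="portrait of the same man as a young astronaut",
    image=init,
    strength=0.6,
).images[0]
result.save("astronaut.png")
```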
... oh my god, I just realized the dumbest part of your demand. If I somehow did "post a model" (what file format would that even be?) that did exactly what you ask, it wouldn't prove what was or wasn't in the training data. So you'd just loop back around to going 'ah-HA, there must have been hyper-illegal images of exactly that, in the training data.' Your burden-of-proof gambit doesn't even fit.
Pictured: skimming for some way to go 'nuh-uh.'
Troll harder.
What else did I say about it?