this post was submitted on 21 May 2024
512 points (95.4% liked)
Technology
Lol you don't need to train it ON CSAM to generate CSAM. Get a clue.
https://purl.stanford.edu/kh752sm9123
I don't know if we can say for certain it needs to be in the dataset, but I do wonder how many of the other models used to create CSAM are also trained on CSAM.
You can't generate CSAM because there's no C to A.
It should be illegal either way, to be clear. But you think they're not training models on CSAM? You're trusting in the morality/ethics of people creating AI-generated child pornography?
The use of CSAM in training generative AI models is an issue no matter how these models are being used.
The training doesn't use CSAM; there's a 0% chance big tech would use that in their datasets. The models are somewhat able to link concepts like "red" and "car", even if they had never seen a red car before.
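That "link concepts" point is easy to check with any off-the-shelf text-to-image model. Here's a minimal sketch using the diffusers library, assuming Stable Diffusion v1.5 and a GPU; the prompt is just an arbitrary example of a combination the model is unlikely to have seen verbatim:

```python
# Minimal sketch: compositional prompting with a text-to-image model via diffusers.
import torch
from diffusers import StableDiffusionPipeline

# Model choice is an assumption; any text-to-image checkpoint works the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Familiar concepts composed into a combination the training data probably
# never contained as an actual photo.
image = pipe("a photo of an armchair shaped like an avocado").images[0]
image.save("avocado_armchair.png")
```

The model composes "armchair" and "avocado" from separate examples; it doesn't need to have seen that exact object in its training data.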
Well, with models like SD at least, the datasets are large enough and the teams are small enough that it's impossible to have a human review every image. They scrape images from the web and try to filter them with AI, but there's still a chance of bad images getting through. This is why most companies also put filters on the model's output, not just in the training pipeline.
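To give a concrete idea of what "try to filter them with AI" means, here is a rough sketch of a zero-shot CLIP-based safety check run over scraped images before training. The labels and threshold are made up for illustration; real pipelines (e.g. the NSFW scoring used for LAION) use purpose-trained classifiers:

```python
# Rough sketch: zero-shot image safety filter using CLIP via Hugging Face transformers.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Illustrative labels only; production filters use trained classifiers and far
# finer-grained categories.
LABELS = ["a safe, benign photo", "explicit or unsafe content"]

def is_probably_safe(path: str, threshold: float = 0.9) -> bool:
    """Return True if CLIP puts high probability on the 'safe' label."""
    image = Image.open(path)
    inputs = processor(text=LABELS, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]
    return probs[0].item() >= threshold

# Images failing the check get dropped from the dataset or flagged for human review.
```

It's automated and probabilistic, run over billions of scraped images, so some bad material inevitably slips through, which is exactly what the Stanford report linked above documents.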
We're trusting that billion-dollar corporate efforts don't possess and label hyper-illegal images, specifically so people can make more of them. Because why the fuck would they.
If there were more money to be made than the cost of defending it, they most definitely would.
'Google would love to be in the child pornography business' is quite a fucking take.
These assholes are struggling to stop their networks from generating Mickey Mouse even when someone specifically asks for Mickey Mouse. Why would any organization that size want radioactive criminal-to-possess inputs stirred into their venture-capital cash cow?
They're fine with platforming fascists for a buck. Why would they have a problem with kid porn, especially if they can maintain a veneer of plausible deniability?