You can punish the results and demand takedowns, but anything more would really overstep what's feasible. These networks are not going away. Even if they somehow did, the cat's out of the bag on how they're trained. They even used consumer hardware. Lots of it... but a lot of money has gone toward using less. Some people already have terabytes of images downloaded. New protections literally cannot stop these programs from existing.
And frankly I refuse to get mad about that. I want the technology where you type "Santa Claus riding a velociraptor" and get an image of that thing. There's no version of that which won't also handle "woman, but naked." Nor is there a version of Photoshop that can stop you from doing it yourself.
That's kind of a defeatist attitude. Just because the problem can't be fixed entirely doesn't mean that it's not worthwhile to do anything about it.
Legislation could force strict logging requirements on cloud image generation services, and make the logs accessible only with court orders. It would be nice if strict criteria were given for what counts as bad use. The easiest route for this case would probably be to lean on copyright or likeness rights, since Taylor Swift, much like the actors from those wrestling TV shows, probably has a likeness that someone owns. But that doesn't help the average Joe at all, so it would need to be more open somehow, which is likely to get us into speech issues. I'm not a lawyer, so hopefully someone more experienced than me can iron that one out. (A rough sketch of what one log record might hold is below.)
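To make the logging idea concrete, here's a minimal sketch of what one court-order-gated log record might hold. This is purely illustrative: the class name, every field, and the values in the usage example are assumptions, not any real service's schema.

```python
# Hypothetical log record for one generation request. All names and
# fields here are illustrative assumptions, not a real service's API.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class GenerationLogRecord:
    record_id: str                   # unique ID for this generation event
    user_id: str                     # account that requested the generation
    timestamp: datetime              # when the request was served
    prompt: str                      # exactly what the user typed
    model_version: str               # which model produced the output
    output_hashes: tuple[str, ...]   # hashes of the generated output


# Example record (all values made up):
record = GenerationLogRecord(
    record_id="r-000001",
    user_id="u-12345",
    timestamp=datetime.now(timezone.utc),
    prompt="Santa Claus riding a velociraptor",
    model_version="example-model-v1",
    output_hashes=("9f2c1a...",),  # placeholder digest
)
```

Storing the prompt and the user ID together in one record is what later lets you tie a found image back to an account.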
Lay out fines for failure to comply at maybe a year's gross income, so that no one decides that storing all the images and information is too expensive. For bonus points, you could also fine US-based payment processors for providing payment services to non-compliant generation services; that would add a barrier to entry for using non-compliant services hosted outside the US.
These images are generated so quickly that it'd be impossible for a human to search them all, so maybe require the logging system to create hashes of every 10x10 pixel group in an image, or of every generated sentence in a paragraph. Stuff like that; maybe also make some of it random, or based on a value in the logs. That way someone who is lazy or slips up while editing their data gets found faster. With that, plus storage of whatever the user typed to generate the images and which user generated which item, it could really narrow down results and might make it feasible to figure out who did what. (See the sketch below for the tile-hashing part.)
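Here's a minimal sketch of the tile-hashing part, assuming Python with Pillow. Exact hashes of raw pixel blocks break under re-encoding, so a real system would probably want perceptual hashes instead, but this shows the idea: a lazy editor who only crops or recaptions an image leaves most tiles untouched, so most tile hashes still match the logs.

```python
# Sketch: hash every 10x10 pixel block of a generated image so edited
# copies can still be matched against the service's logs. Illustrative
# only; the file name and the exact-hash choice are assumptions.
import hashlib
from PIL import Image

TILE = 10  # tile edge length in pixels


def tile_hashes(path: str) -> list[str]:
    """Return a SHA-256 digest for every full 10x10 tile in the image."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    digests = []
    for y in range(0, h - TILE + 1, TILE):
        for x in range(0, w - TILE + 1, TILE):
            tile = img.crop((x, y, x + TILE, y + TILE))
            digests.append(hashlib.sha256(tile.tobytes()).hexdigest())
    return digests


if __name__ == "__main__":
    # Compare a suspect image's tile digests against logged ones.
    print(tile_hashes("generated.png")[:5])
```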
Combined with payment information and such, it would be easy to find out who created which images and hand out punishment. That might slow the spread of such images by creating a fear of retribution for making them, or by forcing users onto slower consumer hardware instead of faster commercial infrastructure. It doesn't fix the problem, because the cat is out of the bag, but it helps.
This is just one way to approach the issue. It doesn't fix it, but it helps, so it's a step in the right direction. And it's just what I thought of on my own in a few minutes; imagine what a room full of people really thinking hard could accomplish. Something even better! It's not so hopeless a problem that we should just give up on solving it.
However, there's another issue on the table, one I've heard a lot of folks raise about these kinds of things in the past. It's not "How could you solve this?" but "Why should we solve this?" Just because the White House thinks this is wrong doesn't mean the general public does. And even if the general public thinks this is wrong, that doesn't necessarily mean their elected representatives do too. So we may never even see anyone try to solve this one...
I think it might be good to dip our legal toes into this uncharted computer-generation territory soon, though, before it gets too far away from us.
'The government could use this to demand more surveillance' is why impossible goals have to be called out. The next step is always 'that didn't do anything, let's spy harder.'
And local models are subject to precisely none of that. Unless, of course, you demand we spy harder and lock down this nascent technology, because drugs, porn, terrorism, abracadabra, your rights disappeared.