this post was submitted on 19 Sep 2023
622 points (98.0% liked)

Europe


Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: β€œUndress anybody with our free service!”

top 50 comments
[–] [email protected] 246 points 11 months ago (5 children)

This was just a matter of time - and there isn't really that much those affected can do (and in some cases, should do). Shutting down that service is the correct move - but that'll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.

So in the long term we'll see that shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most AI image-generation tools.

At least in Germany, the laws dealing with child/youth pornography were badly botched by incompetent populists in the government - they would send any of those parents to jail for at least a year if they took possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be enough to get a prosecution against them started.

There's one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying "they're AI generated" is becoming a plausible way out.

[–] [email protected] 127 points 11 months ago (41 children)

There's one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying "they're AI generated" is becoming a plausible way out.

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

Ironically, in a sense we will revert to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

[–] [email protected] 57 points 11 months ago (1 children)

Politics is about to get WILD

[–] [email protected] 19 points 11 months ago* (last edited 11 months ago) (1 children)

Dwayne Elizondo Mountain Dew Herbert Camacho approves!

Shit's going to get real emotional

[–] [email protected] 37 points 11 months ago (1 children)

To verify if something is real, we might have to rely on witness testimony.

This is not going to work. Just because images and videos become less reliable doesn't mean we will forget that eyewitness testimony is very unreliable.

[–] [email protected] 26 points 11 months ago

You say "forget" like it's not still incredibly common as evidence.

There's lots of data showing that eyewitnesses aren't reliable, but that doesn't mean courts have actually stopped relying on them. AI making another form of evidence untrustworthy will just result in eyewitness testimony taking its place.

[–] [email protected] 30 points 11 months ago (4 children)

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

This just isn't true. They will still be used to sexualise people, mostly girls and women, against their consent. It's no different from AI-generated child pornography. It does harm even if no 'real' people appear in the images.

Fucking horrible world we're forced to live in. Where's the fucking exit?

[–] [email protected] 13 points 11 months ago (1 children)

It is different than AI-generated CSAM because real people are actually being harmed by these deepfake images.

[–] [email protected] 10 points 11 months ago* (last edited 11 months ago) (8 children)

I was replying to someone who was claiming they aren't harmful as long as everyone knows they're fake. Maybe nitpick them, not me?

Real kids are harmed by AI CSAM normalising a problem its consumers should be seeking help for, not getting off on.

[–] [email protected] 18 points 11 months ago (8 children)

A bit off topic, but I wonder if the entertainment industry as a whole is going to be completely destroyed by AI when it gets good enough.

I can totally see myself prompting "a movie about love in the style of Star Wars, with Ryan Gosling and Audrey Hepburn as the leads, directed by Alfred Hitchcock, written by Victor Hugo." And then what? It's game over for any content creation.

Curious if I’ll see that kind of power at home (using open source tools) in my lifetime.

[–] [email protected] 10 points 11 months ago

Holy shit, I never thought of the whole witness testimony aspect. For some reason my mind was just like β€œwell, nothing we see in videos or pictures is real anymore, guess everyone is just gonna devolve into believing whatever confirms their bias and argue endlessly about which pictures are fake and which are real.”

Witness testimony and live political interactions are going to become incredibly important for how our society views β€œthe truth” in world events in the near future. I don’t know if I love or hate that.

[–] [email protected] 18 points 11 months ago

Same goes for any deepfake. People are losing their shit because we won't know what's real and what's not!

We should have been teaching critical thinking a generation ago. Sagan was pleading for reform in the 90s. We can start teaching the next generation how to navigate the Information Age. What we can't do is make the world childproof.

[–] [email protected] 107 points 11 months ago (1 children)

At least now you can claim it's AI if your real nudes leak

[–] [email protected] 75 points 11 months ago (1 children)

In the long term that might even lead to society stopping their freak-outs every time someone in some semi-sensitive position is discovered to have nude pictures online.

[–] [email protected] 25 points 11 months ago

I hope so. We shouldn't be ashamed of our bodies or sexuality.

[–] [email protected] 59 points 11 months ago* (last edited 11 months ago) (5 children)

Interesting. Replika AI, ChatGPT etc. crack down on me for writing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head...

I wonder why they have no address etc. on their website and the app isn't available in any of the proper app stores.

Obviously the police should ask Instagram who is blackmailing all these girls... Teach them a proper lesson. And then stop this company. Fine them a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.

[–] [email protected] 41 points 11 months ago (2 children)

Yes, let's name the tool in the article so everybody can participate in the abuse

[–] [email protected] 31 points 11 months ago (1 children)

I doubt not naming it will do much of anything.

[–] [email protected] 11 points 11 months ago (1 children)

Considering that AI services typically cost money, especially those advertising adult themes, naming it kinda does support the hosts of such services.

[–] [email protected] 12 points 11 months ago (2 children)

Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

[–] [email protected] 33 points 11 months ago (6 children)

The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop x 10.

[–] [email protected] 45 points 11 months ago (15 children)

This isn't about nude photos, it's about consent.

[–] [email protected] 48 points 11 months ago (11 children)

I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.

[–] [email protected] 32 points 11 months ago

Photoshopped nude pictures of celebrities (and people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned, just the ease of doing it changed a bit.

[–] [email protected] 28 points 11 months ago (4 children)

Banning diffusion models doesn't work; the tech is already out there and you can't put it back in the box. Fake nudes used to be made with Photoshop, and the current generative AI models only make them faster to produce.

This can only be stopped on the distribution side, and any new laws should focus on that.

But the silver lining of this whole thing is that nude scandals for celebs aren't really possible any more if you can just say it's probably a deepfake.

[–] [email protected] 28 points 11 months ago (2 children)

Can this come full circle so I can shirtcock it and later say, "dog, that's AI" when people post pictures?

[–] [email protected] 27 points 11 months ago* (last edited 11 months ago)

The only thing new about this is that the photos are probably more realistic, but still fake. Apps to do this existed before GenAI was a thing.

[–] [email protected] 21 points 11 months ago (2 children)

Maybe something will change as soon as people start creating and distributing fake AI nudes of that country’s leaders.

[–] [email protected] 16 points 11 months ago (1 children)

Honestly surprised this didn't happen first.

Be a great way to discredit politicians in homophobic states, by showing a politician taking it up the arse.

[–] [email protected] 14 points 11 months ago (1 children)

That's really, really sad, EU, please try to regulate AI.
