this post was submitted on 20 Sep 2023
322 points (97.6% liked)

World News

[–] [email protected] 19 points 11 months ago (5 children)

You can't stop them from being made; they're the same kind of deepfakes people have been making all along. It's important to note that they're not photos of real people, they're guesses made by an algorithm.

[–] [email protected] 62 points 11 months ago (1 children)

While you're completely right, that's hardly a consolation for those affected. The damage is done even if the images aren't real, because they will be convincing enough for at least some people.

[–] [email protected] 5 points 11 months ago (2 children)

While I understand your point, what consolation can be provided?

[–] [email protected] 12 points 11 months ago

I think the people who made the pictures have to face consequences. Otherwise it sends the message that behaving this way is fair game.

[–] [email protected] 19 points 11 months ago* (last edited 11 months ago)

The faces are not generated, and that is where the damage comes from. It targets the girls for humiliation by implying that they allowed nudes to be taken of them. Depending on the location and circumstances, this could get the girls murdered; think of "honor killings" by fundamentalists. It makes them targets for further sexual abuse, too. Anyone distributing the photos is at fault, as well as the people who made them.

The problem goes deeper, though. We can never trust a photo as proof of anything again. Let that sink in, and consider what that means for society.

[–] [email protected] 14 points 11 months ago (1 children)

To push back against your attempt to minimise what's going on here ...

Yes, they're not actually photos of the girls. But then, a photo of a naked person isn't actually the same as that person standing in front of you naked either.

If being seen naked is unwanted and embarrassing, why should a photo of you naked be embarrassing, and, to make my point, what difference would it make if the photo were more or less realistic? An actual photo can be processed, taken under certain lighting or with a certain lens, or have been taken some time in the past ... all factors that lessen how close it is to the subject's current naked appearance. How unrealistic can a photo be before it's no longer embarrassing?

Psychologically, I'd say it's pretty obvious that the embarrassment of a naked image is that someone else now has a relatively concrete image in their minds of what the subject looks like naked. It is a way of being seen naked by proxy. A drawn or painted image could probably have the same effect.

There's probably some range of realism within which there's an embarrassing effect, and I'd bet AI is very capable of getting in that range pretty easily these days.

The technology being out there now doesn't mean that our behaviours with it are automatically acceptable. Society adapts to the uses and abuses of new technology, and it seems pretty obvious that we've yet to culturally curb the abuses of this one.

[–] [email protected] 6 points 11 months ago (2 children)

Exactly, the technology is out there and will not cease to exist. Maybe in the future we'll digitally sign our photos so that deepfakes can be sorted out that way.
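
A minimal sketch of what per-photo signing could look like, assuming Python's `cryptography` package and a hypothetical `photo.jpg`; a real scheme would also need trusted, camera-held keys and some way to distribute and verify the public keys:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The camera (or photographer) holds a private key; viewers only need the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Sign the raw image bytes at capture time.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()
signature = private_key.sign(image_bytes)

# Later, anyone with the public key can check the file wasn't altered or fabricated.
try:
    public_key.verify(signature, image_bytes)
    print("Signature valid: bytes match what the key holder signed.")
except InvalidSignature:
    print("Signature invalid: the image was modified or never signed by this key.")
```

Of course, that only proves the bytes came from whoever holds the key, not that the scene they show ever happened, and any edit (even a crop) breaks the signature.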

[–] [email protected] 17 points 11 months ago

Omg it's NFTs' time to shine!!!!

/S

[–] [email protected] 5 points 11 months ago

Will everyone be expected to have some kind of official PGP key?