this post was submitted on 01 Feb 2024
49 points (78.2% liked)


After Nine blamed an 'automation' error in Photoshop for producing an edited image of Georgie Purcell, I set out to find out what the software would do to other politicians.

[–] [email protected] 14 points 10 months ago (1 children)

The issue is also present in DALL-E and Bing image generation. My hypothesis is that the sheer amount of porn being generated is affecting the models.

When I tried to create a joke Tinder profile with some friends, I prompted "woman eating fried chicken". Blocked result. "Man eating fried chicken" worked.
Tried "man T-posing on beach": clothed. "Woman T-posing on beach": blocked.
"Woman T-posing at sunset on beach" returned a nearly silhouetted nude image. The same prompt for a guy came back clothed.

Went back to the first prompt, and I had to specify that the woman was wearing clothes to get it to return an image. Sometimes I had to specify particular articles of clothing.

[–] [email protected] 7 points 10 months ago (1 children)

Your hypothesis makes no sense?

People generating porn would make no change to its training data set.

[–] [email protected] 4 points 10 months ago (2 children)

You wouldn't feed the images people generate and save back into the system to improve it?

[–] [email protected] 4 points 10 months ago (1 children)

This actually doesn't work to improve the model, generally. It's not new information for it.

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago)

Yup. But they would logically have bots out trawling for new posts, and so would be consuming social media posts containing their own generated data.

Also, they would absolutely feed successful posts back into the system. You'd be stupid not to refine successful generations to further help the model.

[–] [email protected] 3 points 10 months ago* (last edited 10 months ago)

Not after the initial training, no.

That would make it less effective, because instead of being trained on known real things, it's being further reinforced on its own hallucinations.