Honestly, it's just not being used correctly. I believe this is user error.
These AI image generators rely on the base models they were trained on, and those models were more than likely fed way more images of Caucasians than of anyone else. You can add weights to the things you'd rather see in your prompts, so while I haven't used the exact program she used, the basics should be the same.
You usually have two fields: the main prompt (things you want to see) and a secondary prompt for negatives (things you don't want to see). An example could be a positive prompt of "perfect headshot for LinkedIn using supplied image, ((Asian:1.2))" with a negative prompt of "((Caucasian)), blue eyes, blonde, bad eyes, bad face", etc. There's a code sketch of the same idea below.
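For anyone who'd rather script this than use a web UI, here's a minimal sketch of the same positive/negative split using Hugging Face's diffusers library. The checkpoint name is just an assumption (any Stable Diffusion model works), and note that the ((word:1.2)) weighting syntax above is an Automatic1111 convention, not something diffusers parses on its own:

```python
# Minimal sketch: positive + negative prompts with Hugging Face diffusers.
# The checkpoint below is an assumption; substitute whatever SD model you use.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    # Main prompt: what you want to see.
    prompt="professional headshot photo of an Asian woman, studio lighting, sharp focus",
    # Negative prompt: what you don't want to see.
    negative_prompt="Caucasian, blue eyes, blonde hair, bad eyes, bad face, deformed",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]

image.save("headshot.png")
```

The two fields map directly onto the two boxes in most web UIs, so the same positive/negative reasoning applies either way.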
If the tool she used didn't have a secondary field for negatives, I could see this being a bit more difficult, but then there are better systems to use. If she didn't like the results from the one she picked, instead of jumping to "AI racism!" she could have looked up what other tools exist. Hell, with the model I use in Automatic1111, I have to put Asian in my negatives because it defaults to that often.
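As a side note, if you're driving diffusers from code rather than Automatic1111, weighted prompts like (Asian:1.2) need a helper; the compel library is one option. A sketch under those assumptions (compel's explicit-weight syntax is (phrase)weight rather than A1111's (phrase:weight), and the checkpoint is again just a placeholder):

```python
# Sketch: prompt weighting outside Automatic1111, via the compel library.
# compel writes weights as (phrase)1.2 instead of A1111's (phrase:1.2).
from diffusers import StableDiffusionPipeline
from compel import Compel

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
compel = Compel(tokenizer=pipe.tokenizer, text_encoder=pipe.text_encoder)

# Up-weight the term the base model tends to under-represent.
positive = compel("professional headshot photo, (Asian)1.2, studio lighting")
negative = compel("blue eyes, blonde hair, bad eyes, bad face")

image = pipe(
    prompt_embeds=positive,
    negative_prompt_embeds=negative,
    num_inference_steps=30,
).images[0]
image.save("weighted_headshot.png")
```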
Edit: figures I wrote all this, then scrolled down and noticed all the comments saying the same thing, lol. At least we're on the same page.