this post was submitted on 07 Nov 2023
145 points (82.2% liked)

[–] [email protected] 27 points 1 year ago (2 children)

That’s how GANs are trained, and I haven’t seen anything about GPT-4 (or DALL-E) being trained this way. It seems like current generative AI research is moving away from GANs.

[–] [email protected] 4 points 1 year ago

I know it’s intrinsic to GANs, but I think I read that this was a flaw in the entire “detector” approach for LLM-generated text as well. I can’t remember the source, unfortunately.

[–] [email protected] 4 points 1 year ago

Also, one very important aspect of this is that it must be possible to backpropagate through the discriminator. If you only have inference access to a detector of some kind, but not its weights and architecture, you can’t perform backpropagation, and therefore can’t compute the gradients needed to update your generator’s weights.
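
To make the gradient-flow point concrete, here’s a minimal sketch (assuming PyTorch; the tiny generator/discriminator networks are placeholder toys, not any real model). The generator’s loss is computed from the discriminator’s output, so backpropagation has to run through the discriminator’s layers to reach the generator’s weights:

```python
# Minimal sketch: why a generator update needs gradients *through* the discriminator.
# G and D below are hypothetical toy networks for illustration only.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))               # generator
D = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())  # discriminator

opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)

z = torch.randn(4, 16)   # batch of latent noise
fake = G(z)              # generated samples
score = D(fake)          # discriminator's "looks real" probability

# Generator wants the discriminator to output "real": loss shrinks as score -> 1.
loss_G = -torch.log(score + 1e-8).mean()

opt_G.zero_grad()
loss_G.backward()  # gradients flow backward THROUGH D's layers into G's weights
opt_G.step()

# If D were only a black-box API returning a float score, there would be no
# computation graph through it, so backward() would have nothing to
# differentiate and the generator couldn't be updated this way.
```

With a detector you can only query, you’d be stuck with gradient-free tricks (or training your own surrogate discriminator), which is exactly the limitation described above.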

That said, yes, GANs have somewhat fallen out of favor due to their relatively poor sample diversity compared to diffusion models.