this post was submitted on 13 Aug 2023
1072 points (96.0% liked)


As the AI market continues to balloon, experts are warning that its VC-driven rise is eerily similar to that of the dot com bubble.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

They are, but training models is hard and inference (actually using them) is relatively cheap. If you make a GPT-3-sized model, you don't always need a full H100 with 80+ GB to run it: techniques like quantization show you can keep ~99% of its quality at roughly a quarter of the memory footprint.
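To make the "quarter of the memory" claim concrete, here's a back-of-the-envelope sketch of weight-only quantization: compressing fp32 weights to int8 cuts storage 4x while keeping the reconstructed weights close to the originals. This is a toy illustration (random weights, simple symmetric per-tensor scaling), not any particular library's implementation.

```python
import numpy as np

# Hypothetical fp32 weight matrix standing in for one layer of a big model.
rng = np.random.default_rng(0)
w = rng.standard_normal((4096, 4096)).astype(np.float32)

# Symmetric per-tensor int8 quantization: pick a scale so the largest
# absolute weight maps to 127, then round everything onto the int8 grid.
scale = np.abs(w).max() / 127.0
w_q = np.round(w / scale).astype(np.int8)

# At inference time, dequantize back to float to approximate the originals.
w_dq = w_q.astype(np.float32) * scale

print(w.nbytes // w_q.nbytes)        # 4x smaller in memory (fp32 -> int8)
print(float(np.abs(w - w_dq).max())) # worst-case rounding error, about scale/2
```

Real deployments push further (4-bit and below, per-channel or per-group scales), which is how multi-billion-parameter models end up fitting on consumer cards at all.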

Hence NVIDIA selling this at ~$3k as an 'AI' card, even though it won't be as fast. If they need top speed for inference, though, yeah, the H100 is still the way they'd go.