this post was submitted on 28 Feb 2025
311 points (99.1% liked)

Technology

[–] [email protected] 32 points 2 days ago (2 children)

While the GPUs produced aren't used for gaming, the wafers their dies are made on could have been allocated to produce dies for consumer-level graphics cards, right?

[–] [email protected] 2 points 1 day ago

So with datacenter GPUs (accelerators is really the more accurate term), historically they were the exact same architecture as Nvidia's gaming GPUs (usually about half to a full generation behind). But in the last five years or so they've moved to their own dedicated architectures.

But more to your question: the actual silicon that got etched and burned into these datacenter GPUs could've been used for anything. It could've become cellular modems, networking ASICs, SDR controllers, mobile SoCs, etc. More importantly, these high-dollar datacenter GPUs are usually produced on the newest, most expensive process nodes, so whatever got produced instead would be similarly high-dollar hardware, not the basic logic controllers used in dollar-store junk.
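
To put rough numbers on the wafer-allocation point, here's a quick back-of-the-envelope sketch using the standard dies-per-wafer approximation. The die areas below are illustrative assumptions for a big datacenter accelerator die versus a mid-size consumer GPU die, not actual product figures, and the formula ignores defect yield and scribe lines.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate of die candidates on a round wafer.

    Gross area divided by die area, minus a correction for partial
    dies lost at the wafer edge. Ignores defect yield and scribe lanes.
    """
    radius = wafer_diameter_mm / 2
    return int(
        math.pi * radius**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

# Illustrative (assumed) die sizes only:
for name, area_mm2 in [("datacenter accelerator (~800 mm^2)", 800),
                       ("consumer GPU (~300 mm^2)", 300)]:
    print(f"{name}: ~{dies_per_wafer(area_mm2)} die candidates per 300 mm wafer")
```

So the same 300 mm wafer yields only a few dozen huge accelerator dies versus a couple hundred smaller consumer dies, which is why capacity allocated to datacenter parts eats into consumer supply disproportionately.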

[–] [email protected] 4 points 2 days ago