Good. That asshole doesn't deserve any. Those should be used for PC gaming, not creating the torment nexus.
You wouldn't be using these for gaming (well, not of the 3D graphics sort).
They run in the tens of thousands of dollars each, as I recall.
Probably more correct to call them "parallel compute accelerator" cards than "GPUs". I don't think they even have a video out.
What they do have is a shit-ton of on-board RAM.
EDIT: Oh, apparently those are whole servers containing multiple GPUs.
https://www.trgdatacenters.com/resource/nvidia-dgx-buyers-guide-everything-you-need-to-know/
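For a sense of how much on-board RAM we're talking about, here's a minimal sketch (assuming a machine with PyTorch and a visible CUDA device; the numbers in the comments are ballpark, not from the thread):

```python
import torch  # assumes a PyTorch build with CUDA support

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)  # first visible device
    # Datacenter parts like an H100 report ~80 GiB here;
    # consumer gaming cards are typically in the 8-24 GiB range.
    print(f"{props.name}: {props.total_memory / 2**30:.0f} GiB on-board")
else:
    print("no CUDA device visible")
```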
For comparison, the most powerful electric space heater I have draws about a tenth of that.
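Rough arithmetic, assuming a typical ~1.5 kW consumer space heater (my number, not from the thread):

```latex
% heater draw is about a tenth of the server's draw
P_{\text{server}} \approx 10 \times P_{\text{heater}}
                  \approx 10 \times 1.5\,\mathrm{kW}
                  = 15\,\mathrm{kW}
```

That's in the same ballpark as published figures for a fully loaded DGX-class server, which are on the order of 10 kW.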
While the GPUs created aren't used for gaming, the wafers the dies are made on could've been allocated to produce dies for consumer-level graphics cards, right?
So, with datacenter GPUs ("accelerators" is the more accurate term, honestly): historically they were the exact same architecture as Nvidia's gaming GPUs (usually about half to a full generation behind). But in the last 5 years or so they've moved to their own dedicated architectures.
But more to your question: the actual silicon that got etched and burned into these datacenter GPUs could've been used for anything. It could've become cellular modems, networking ASICs, SDR controllers, mobile SoCs, etc. More importantly, though, these high-dollar datacenter GPUs are usually produced on the newest, most expensive process nodes, so the hardware that would otherwise be produced on them would be similarly high-dollar, not basic logic controllers destined for dollar-store junk.
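To make the wafer-allocation point concrete, here's a toy sketch using the standard first-order dies-per-wafer approximation (it ignores yield, scribe lines, and edge exclusion). The die areas are illustrative guesses (roughly an H100-class datacenter die vs. a mid-range consumer die), not figures from the thread:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(
        math.pi * radius**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

# Assumed die areas (approximate, for illustration only):
print(dies_per_wafer(814))  # big datacenter die     -> ~63 candidates per wafer
print(dies_per_wafer(295))  # mid-range consumer die -> ~200 candidates per wafer
```

Same 300 mm wafer either way; the consumer die just yields roughly three times as many candidates before binning.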
Absolutely.
This makes me want a game called "Torment Nexus." That shit sounds like a dope-ass Soulslike.
There's an old CRPG called Planescape: Torment that's pretty brutal. And its sequel, Torment: Tides of Numenera.
Planescape is awesome. The only game I've played that comes close to its story is Disco Elysium, which is quite a feat, given how praised DE is everywhere.
I loved Disco Elysium.
They say art will make you feel things. I say Disco Elysium is very good art.
Hello, I heard there is a Planescape club here. Recently replayed it. Can confirm it's not nostalgia; old games are better.
It's my favorite RPG from the '90s.
I haven't played Tides of Numinera though.
I highly recommend it. It's not a direct sequel, but I can't explain further because spoilers. I can say it's definitely worth a try.
He's getting 10,000 more next week
He doesn't deserve them.
Bet he's not even using them for anything except a govt-subsidized crypto farm.