this post was submitted on 25 Feb 2025
633 points (98.5% liked)
Technology
To anyone complaining about non-replaceable RAM: this machine is built for AI, and that's why.
Think of it like a GPU with a CPU on the side, rather than the other way around.
Inference needs very high RAM bandwidth, and that is (currently) only achievable with soldered memory. Even this machine is fairly slow at 256 GB/s, but being able to hand 96 GB of RAM to the GPU makes it interesting for larger models.
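To make the bandwidth point concrete: a common back-of-envelope for local inference is that decode speed is memory-bandwidth-bound, since every generated token has to stream the full set of weights from RAM once. A quick sketch of that upper bound (the model sizes below are my own illustrative picks, not specs of this machine):

```python
# Back-of-envelope ceiling for memory-bandwidth-bound LLM decoding.
# Assumption: each generated token streams all model weights from RAM once;
# real throughput is lower (KV cache reads, activations, scheduling overhead).

def max_tokens_per_sec(bandwidth_gb_s: float, weights_gb: float) -> float:
    """Upper bound: tokens/s ~= memory bandwidth / bytes read per token."""
    return bandwidth_gb_s / weights_gb

BANDWIDTH_GB_S = 256.0  # the soldered-RAM figure quoted above

# Illustrative model footprints (roughly params x bytes per weight)
for name, weights_gb in [
    ("8B @ FP16 (~16 GB)", 16),
    ("70B @ 4-bit (~40 GB)", 40),
    ("weights filling all 96 GB", 96),
]:
    print(f"{name}: ~{max_tokens_per_sec(BANDWIDTH_GB_S, weights_gb):.1f} tokens/s ceiling")
```

So even at a "slow" 256 GB/s, a quantized 70B-class model stays usable, and the 96 GB of GPU-addressable RAM is what lets a model that size fit at all.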
I'm excited to see how these GPU-centric machines also get used for non-AI work, because it means fully embracing parallelism. I want video encoders limited by disk speed.