this post was submitted on 26 Feb 2025
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
Memory bandwidth is 256GB/s, much less than the M4 Max (546GB/s) or the M2 Ultra (800GB/s). Expect performance to reflect that.
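To see why bandwidth matters so much, here's a rough back-of-the-envelope sketch (an assumption about bandwidth-bound decoding, not a benchmark): if token generation is memory-bandwidth-limited, each generated token has to stream the active weights once, so tokens/sec is roughly bandwidth divided by model size in bytes. The 40GB figure below is a hypothetical ~4-bit quantized 70B model.

```python
# Rough estimate, assuming decoding is purely memory-bandwidth-bound
# and the full set of active weights is read once per generated token.

def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound tokens/sec if generation is bandwidth-limited."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical ~40 GB of weights (e.g. a 70B model at ~4-bit quantization)
for name, bw in [("256 GB/s", 256), ("M4 Max (546 GB/s)", 546), ("M2 Ultra (800 GB/s)", 800)]:
    print(f"{name}: ~{est_tokens_per_sec(bw, 40):.1f} tok/s")
```

Real-world numbers will be lower (compute overhead, KV cache reads), but the ratio between the machines tracks the bandwidth ratio pretty closely.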
It's comparable to the M4 Pro in memory bandwidth but has way more RAM for the price.
Good point. You can't even get an M* Pro with 128GB. Only the Max and Ultra lines go that high, and then you'll end up spending at least twice as much.