this post was submitted on 03 Dec 2024
Technology
An LLM card with QuickSync would be the kick I need to turn my N100 mini into a router. Right now, my only drive to move is that my storage is connected via USB, and SATA alone is just not enough value to justify a whole new box. £300 for Ollama, much faster ML in Immich, etc., and all the transcodes I could want would be a "buy now, figure the rest out later" moment.
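(As a rough illustration of what "£300 for Ollama" would buy on a home box: once the server is up, anything on the LAN can hit its HTTP API. This is a minimal sketch against Ollama's documented /api/generate endpoint; the hostname, model tag, and prompt are placeholders, not anything from this thread.)

```python
# Minimal sketch: query an Ollama server running on a home mini PC from
# another machine on the LAN. Hostname, model tag, and prompt are
# illustrative placeholders.
import time

import requests

OLLAMA_URL = "http://n100-mini.local:11434"  # hypothetical LAN hostname
MODEL = "llama3.2:3b"                        # small model that fits a budget box


def generate(prompt: str) -> str:
    """Send one non-streaming generation request and return the text."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    start = time.time()
    answer = generate("Summarise why hardware transcoding matters on a home server.")
    print(f"{time.time() - start:.1f}s: {answer[:200]}")
```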
Oh also you might look at Strix Halo from AMD in 2025?
Its iGPU is beefy enough for LLMs, it will be WAY lower power than any dGPU setup, and it has enough VRAM to be "sloppy" and run stuff in parallel with a good LLM.
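(To make "run stuff in parallel" concrete, here is a rough sketch of hitting two different models at once on a box with enough memory to keep both resident. It assumes an Ollama server started with OLLAMA_MAX_LOADED_MODELS set to at least 2; the model names and prompts are placeholders.)

```python
# Rough sketch: concurrent requests to two models kept resident on one box.
# Assumes the Ollama server was started with OLLAMA_MAX_LOADED_MODELS >= 2;
# model names and prompts are illustrative placeholders.
from concurrent.futures import ThreadPoolExecutor

import requests

OLLAMA_URL = "http://localhost:11434"


def chat(model: str, prompt: str) -> str:
    """Send one non-streaming chat request to the given model."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


jobs = [
    ("llama3.1:8b", "Draft a short reply to a support ticket."),
    ("qwen2.5-coder:7b", "Write a shell one-liner to count errors in a log file."),
]

with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
    futures = [pool.submit(chat, model, prompt) for model, prompt in jobs]
    for (model, _), fut in zip(jobs, futures):
        print(f"[{model}] {fut.result()[:120]}")
```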
*adds to wishlist
You could get that with 2x B580s in a single server, I guess, though you could have already done that with the A770s.
... That's nuts. I only just graduated from a Pi to a mini; I didn't consider a dual-GPU setup. Arbitrary budget aside, I should have added an "idle power" constraint too. It's reasonable to assume that as soon as LLMs get involved, all concept of "power efficient" goes out the window. Don't mind me, just wishing for a unicorn.
Strix Halo is your unicorn; idle power should be very low (assuming AMD VCE is OK compared to QuickSync).