this post was submitted on 17 Dec 2024
211 points (97.7% liked)

[–] [email protected] 21 points 4 days ago (11 children)

This is why I love Zotac. Well, not for this reason specifically, but it adds to the small pile of smiles they've given me.

That being said, I'm skipping this Nvidia generation and might break for AMD next. My 3090 is still trucking along fine, and I feel like Nvidia lost its value proposition after the debacles of the 40 series.

[–] [email protected] 5 points 4 days ago (6 children)

You should think about selling it, TBH. 3090 prices are shooting up like crazy, and may be at a peak, because it's the last affordable card for self-hosting LLMs.

[–] WolfLink 3 points 4 days ago (1 children)

Can’t you run LLMs on a 4090/5090, maybe a 5080? Basically any Nvidia card with 24GB+ of VRAM?

[–] [email protected] 9 points 4 days ago* (last edited 4 days ago) (1 children)

Yeah, but they're not worth it.

The 4090 is basically just as good as the 3090 for this because it has the same amount of VRAM, but at twice the price... so you might as well get 2x 3090s.

The 5090 will be hilariously expensive, and 24GB -> 32GB is not that big a jump, as you still can't fit 70B-class models in that pool... again, you might as well get 2x 3090s. I would not even bother trading my single 3090 for a 5090.

If AMD sold a 48GB consumer card, you would see them dominate the open-source LLM space within a month, because every single backend dev would buy one and get their projects working on it. Same with Intel. VRAM is basically the only thing that matters, and 24GB is kinda pitiful at the 4090's price.
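The "can't fit 70B in 32GB" point is just back-of-the-envelope arithmetic: weights take parameter count times quantization width, plus some headroom for the KV cache and activations. A rough sketch (the 4-bit width and ~20% overhead figure are illustrative assumptions, not measurements):

```python
# Rough VRAM estimate for self-hosting an LLM.
def vram_gb(params_billions: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Weights (params * quantization width) plus ~20% for KV cache/activations.
    1B params * 1 byte/param ~= 1 GB, so units work out directly."""
    return params_billions * bytes_per_param * overhead

# A 70B model at 4-bit quantization (~0.5 bytes/param):
need = vram_gb(70, 0.5)
print(round(need, 1))        # 42.0 GB
print(need <= 32)            # False -> doesn't fit a single 32GB 5090
print(need <= 48)            # True  -> fits 2x 24GB 3090s (or one 48GB card)
```

So even a heavily quantized 70B model overflows 32GB but sits comfortably in 48GB, which is why the 2x 3090 (or hypothetical 48GB consumer card) setup keeps coming up.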

[–] [email protected] 1 points 3 days ago (1 children)

Halo has me hopeful that AMD will keep going down this path of APUs that use system RAM instead of requiring dedicated VRAM. It'd be great to just be able to upgrade my RAM rather than replace a whole-ass GPU.

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago)

It uses soldered LPDDR5X, so it will not be upgradeable unless the mobo/laptop maker uses LPCAMMs.

And... that's kinda how it has to be. Laptop SO-DIMMs are super slow due to the design of the DIMMs, and they already need crazy voltages to hit the speeds/timings they run at now.
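The bandwidth gap is easy to see from the spec sheet numbers: peak bandwidth is just bus width times transfer rate. A quick sketch comparing typical dual-channel SO-DIMMs against a wide soldered LPDDR5X package (the 256-bit/8000 MT/s figures are an assumption about a Halo-style part, not a confirmed spec):

```python
# Peak theoretical memory bandwidth from bus width and transfer rate.
def bandwidth_gb_s(bus_bits: int, mt_per_s: int) -> float:
    """bus width in bits -> bytes per transfer, times megatransfers/s, in GB/s."""
    return (bus_bits / 8) * mt_per_s / 1000

# Dual-channel DDR5-5600 SO-DIMMs (2x 64-bit channels):
print(bandwidth_gb_s(128, 5600))  # 89.6 GB/s
# A hypothetical 256-bit LPDDR5X-8000 soldered package:
print(bandwidth_gb_s(256, 8000))  # 256.0 GB/s
```

Nearly 3x the bandwidth from the same generation of DRAM, purely from the wider bus and higher clocks that soldered memory allows.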
