
AMD


For all things AMD; come talk about Ryzen, Radeon, Threadripper, EPYC, rumors, reviews, news and more.

[–] [email protected] 1 points 9 months ago (1 children)

It's for AI acceleration. In AI training (and, with LLMs, inference), VRAM is basically a hard limit on the size of the model you can run on the GPU. 20 GB is enough to train some small LLMs; 10 GB is not.
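As a rough illustration of that limit, here is a back-of-envelope sketch in Python. The function name and the multipliers (fp16 weights, ~4x overhead for gradients, optimizer states, and activations during training) are assumptions and rules of thumb, not exact figures for any particular model or framework.

```python
# Rough, back-of-envelope VRAM estimate for fitting an LLM on a single GPU.
# All multipliers below are common rules of thumb (assumptions), not exact figures.

def estimate_vram_gb(n_params_billion: float,
                     bytes_per_param: int = 2,        # fp16/bf16 weights
                     training_overhead: float = 4.0   # grads + optimizer states + activations (rough)
                     ) -> dict:
    """Return rough VRAM needs (in GB) for inference vs. training."""
    weights_gb = n_params_billion * 1e9 * bytes_per_param / 1024**3
    return {
        "inference_gb": weights_gb * 1.2,             # weights plus some KV-cache headroom
        "training_gb": weights_gb * training_overhead,
    }

if __name__ == "__main__":
    for size in (1.0, 3.0, 7.0):   # model sizes in billions of parameters
        est = estimate_vram_gb(size)
        print(f"{size:.1f}B params: ~{est['inference_gb']:.1f} GB inference, "
              f"~{est['training_gb']:.1f} GB training")
```

Under these assumptions, training even a ~1-3B parameter model lands in the 8-25 GB range, which is why 20 GB of VRAM opens up small-model training that 10 GB does not.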

[–] [email protected] 1 points 9 months ago

Brother, nobody is running AI acceleration on a 580.