this post was submitted on 28 Nov 2023
AMD
It's for AI acceleration. In AI training (and, with LLMs, inference too), VRAM capacity is basically a hard limit on the size of the model you can run on the GPU: 20 GB is enough to train some small LLMs; 10 GB is not.
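A rough sketch of why VRAM acts as a hard limit (this back-of-envelope rule is my own illustration, not from the comment above; real usage also needs room for activations, the KV cache, and framework overhead):

```python
# Back-of-envelope VRAM estimate for holding an LLM's weights.
# Assumption: memory ≈ parameter count × bytes per parameter; everything
# else (activations, KV cache, CUDA/ROCm overhead) comes on top of this.

def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GB of VRAM needed just to store the model weights."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A 7B-parameter model in fp16 (2 bytes/param) needs ~14 GB for weights
# alone, so it fits on a 20 GB card but not a 10 GB one.
print(weights_vram_gb(7, 2))    # fp16
print(weights_vram_gb(7, 0.5))  # 4-bit quantized
```

Quantizing to 4 bits cuts the weight footprint to about 3.5 GB, which is how smaller cards can run such models at all.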
Brother, nobody is running AI acceleration on a 580.