I guess it depends on what you mean by usable. I think people have had success with ROCm; it's not as solid as CUDA, of course, but it's been more than usable.
this post was submitted on 20 Jul 2023
LocalLLaMA
Before buying a GPU for this, evaluate models using your CPU and system memory first. The only real difference is speed; note that CPUs can be acceptably fast, and the responses are the same.
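As a sketch of what CPU-only evaluation can look like, here's roughly how you'd run a quantized model with llama.cpp (the model filename and thread count are placeholders; adjust for whatever model you download and how many physical cores you have):

```sh
# Build llama.cpp from source (CPU-only by default, no GPU toolkit needed)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run a quantized model entirely on CPU and system RAM.
# -m  path to the model file (placeholder name here)
# -t  number of threads; match your physical core count
# -n  number of tokens to generate
./main -m models/model.q4_0.bin -t 8 -n 128 -p "Hello, how are you?"
```

Watch the tokens-per-second figure it prints at the end; if that speed is acceptable for you, a GPU purchase may not be necessary at all.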