this post was submitted on 27 Jan 2024
24 points (96.2% liked)
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
This link is misleading. For example, the Radeon RX 6800 IS supported, because it uses the same chip (gfx1030) as one of the Radeon Pro cards. Many others are too… though support does not go very far back.
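If you want to see which GFX target your own card reports, here is a minimal sketch assuming a ROCm build of PyTorch is installed (the `gcnArchName` field is only exposed by ROCm builds, and may be missing in older versions, hence the fallback):

```python
# Sketch: print the GFX target (e.g. "gfx1030") of the first visible GPU
# under a ROCm build of PyTorch. Assumes torch was installed with ROCm support.
import torch

if torch.cuda.is_available():  # ROCm devices show up through the cuda namespace
    props = torch.cuda.get_device_properties(0)
    # gcnArchName reports the GFX target on ROCm builds; fall back gracefully if absent
    print(getattr(props, "gcnArchName", "gcnArchName not exposed by this build"))
else:
    print("No ROCm/HIP device visible to PyTorch")
```

If the reported target matches one of the officially supported Radeon Pro chips, the consumer card generally works the same way.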