this post was submitted on 02 Jul 2023
10 points (100.0% liked)

LocalLLaMA

2265 readers

Community to discuss about LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago

So what is currently the best and easiest way to use an AMD GPU? For reference, I own an RX 6700 XT and want to run a 13B model, maybe SuperHOT, but I'm not sure my VRAM is enough for that. Until now I've always stuck with llama.cpp since it's quite easy to set up. Does anyone have any suggestions?

[–] actuallyacat 3 points 1 year ago (1 children)

Not sure what happened to this comment... Anyway, ooba (text-generation-webui) works with AMD on Linux, but ROCm is super jank at the best of times, and the 6700 XT is not officially supported, so it might be hopeless.

llama.cpp has some GPU acceleration support on AMD in CLBlast mode; if you aren't already using it, it might be worth trying.
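For anyone finding this later, a rough sketch of a CLBlast build, assuming the 2023-era llama.cpp Makefile (the build flag, model filename, and layer count here are illustrative; check your own checkout and hardware):

```shell
# Build llama.cpp with CLBlast (OpenCL) acceleration -- works on AMD without ROCm
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make LLAMA_CLBLAST=1

# Offload layers to the GPU with -ngl; how many fit depends on VRAM
# (an RX 6700 XT has 12 GB, enough for a quantized 13B with partial offload)
./main -m models/llama-13b.ggmlv3.q4_0.bin -ngl 32 -p "Hello"
```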

[–] [email protected] 2 points 1 year ago (1 children)

How do you use ooba with ROCm? I looked at the Python file where you can select AMD, and it just says "AMD not supported" and exits. I guess it just doesn't update webui.py when I update ooba? I heard somewhere that llama.cpp with CLBlast wouldn't work with ooba, or am I wrong? Also, is koboldcpp worth a shot? I've heard of some success with it.

[–] actuallyacat 5 points 1 year ago (1 children)

I can recommend kobold; it's a lot simpler to set up than ooba and usually runs faster too.
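A minimal koboldcpp launch on AMD might look like the sketch below. The `--useclblast` flag takes an OpenCL platform index and device index; the build flag and model path are assumptions based on the same-era llama.cpp conventions, so verify them against koboldcpp's README:

```shell
git clone https://github.com/LostRuins/koboldcpp
cd koboldcpp
make LLAMA_CLBLAST=1

# Platform 0, device 0 -- adjust to whatever clinfo reports for your GPU
python koboldcpp.py --useclblast 0 0 models/llama-13b.ggmlv3.q4_0.bin
```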

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

I will try that once I'm home! Ty for the suggestions. Can I also use kobold in SillyTavern? IIRC there was an option for KoboldAI or something; is that koboldcpp, or what does that option do?
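(For context: SillyTavern's "KoboldAI" option points at a Kobold-compatible HTTP endpoint, which koboldcpp also serves. A hedged sketch of the same kind of request from the command line, assuming koboldcpp's default port of 5001 and the standard /api/v1/generate route; the payload fields shown are the common ones:)

```shell
# Query a locally running koboldcpp server the same way a frontend would
curl -s http://localhost:5001/api/v1/generate \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello, how are you?", "max_length": 80, "temperature": 0.7}'
```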

EDIT: I got it working and it's wonderful, thank you for suggesting it :) I had some difficulties setting it up, especially with opencl-mesa, since I had to install opencl-amd and then find out the device ID and so on, but once it was working it's great!
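The device-ID hunt mentioned above usually comes down to clinfo. A sketch assuming Arch-style package names (opencl-amd is an AUR package; substitute your distro's equivalents):

```shell
# opencl-mesa (Clover) is often too limited; the AMD ICD tends to work better
yay -S opencl-amd clinfo

# List OpenCL platforms and devices, then note your GPU's indices
clinfo -l
# Pass those indices to koboldcpp, e.g. --useclblast <platform> <device>
```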