this post was submitted on 06 Sep 2023

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
How usable are AMD GPUs? (lemmy.dbzer0.com)
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/localllama
 

Heyho, I'm currently on an RTX 3070 but want to upgrade to an RX 7900 XT.

I see that AMD installers are there, but is it all smooth sailing? How well do AMD cards compare to NVidia in terms of performance?

I'd mainly use oobabooga but would also love to try some other backends.

Anyone here with one of the newer AMD cards that could talk about their experience?

EDIT: To clear things up a little bit: I am on Linux, and I'd say I am quite experienced with it. I know how to handle a card swap and I know where to get my drivers from. I know of the gaming performance difference between NVidia and AMD; those are the main reasons I want to switch to AMD. Now I just want to hear from someone who ALSO has Linux + AMD what their experience with Oobabooga and Automatic1111 is when using ROCm, for example.
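[Editor's note: since the question hinges on whether Oobabooga/Automatic1111 work over ROCm, a quick way to verify the setup is to check whether the installed PyTorch is a ROCm build — ROCm wheels set `torch.version.hip`, while CUDA wheels leave it as `None`. A minimal sketch, with `rocm_backend_status` being a hypothetical helper name:]

```python
# Sketch: report whether the installed PyTorch is a ROCm (HIP) build.
# Assumption: ROCm wheels expose torch.version.hip; CUDA wheels set it to None.
import importlib.util


def rocm_backend_status() -> str:
    # Check for torch without importing it unconditionally,
    # so the script still runs on machines without PyTorch.
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch
    hip_version = getattr(torch.version, "hip", None)
    if hip_version:
        return f"ROCm build (HIP {hip_version})"
    return "non-ROCm build"


print(rocm_backend_status())
```

On a working ROCm install, `torch.cuda.is_available()` also returns `True`, because ROCm builds of PyTorch reuse the `torch.cuda` namespace for the HIP device.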

[email protected] 1 point 1 year ago

Thanks for the concern, but no worries. I did my fair share of optimization for my config and I believe I got everything out of it. I will 100% switch to AMD, so my question basically just aims at: can I sell my 3070, or do I have to keep it and put it into a "server" on which I can run StableDiffusion and oobabooga, because AMD is still too wonky for that?

That's all. My decision doesn't depend on whether this AI stuff works, but it would accelerate things if AMD can run it, because then I can sell my old card to get the money sooner.