this post was submitted on 19 Nov 2023

AMD


For all things AMD; come talk about Ryzen, Radeon, Threadripper, EPYC, rumors, reviews, news and more.


https://www.youtube.com/watch?v=QEbI6v2oPvQ

I had a lot of trouble setting up ROCm and Automatic1111. I tried first with Docker, then natively, and failed many times. Then I found this video. It gives a good overview of the setup and covers a couple of critical bits that really helped me: reinstalling a compatible version of PyTorch, and how to test that ROCm and PyTorch are actually working. I still had a few of those Python problems that crop up when updating A1111, but a quick search in the A1111 bug reports turned up workarounds for those. A strange HIP hardware error also came up at startup, but a simple reboot solved that.
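For reference, the "test that ROCm and PyTorch are working" step boils down to something like the sketch below. It assumes PyTorch was reinstalled from a ROCm wheel index (check the PyTorch install page for the exact index URL for your ROCm version) and that it's run from inside the A1111 venv.

```python
# Minimal ROCm + PyTorch sanity check, run from inside the A1111 venv.
# Assumes torch was reinstalled from a ROCm wheel index, e.g. something like:
#   pip install --force-reinstall torch torchvision --index-url https://download.pytorch.org/whl/rocm5.7
import torch

print("torch version:", torch.__version__)                  # ROCm builds are tagged e.g. "+rocm5.7"
print("HIP runtime:", getattr(torch.version, "hip", None))  # None means this is not a ROCm build
print("GPU visible:", torch.cuda.is_available())            # ROCm is exposed through the torch.cuda API

if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))          # should show the Radeon card
    x = torch.rand(1024, 1024, device="cuda")
    print("matmul OK:", float((x @ x).sum()))                 # forces an actual kernel launch
```

If the HIP runtime prints None or the GPU isn't visible, the wrong (CPU or CUDA) wheel is still installed in the venv.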

Also, he says he couldn't get it working with ROCm 5.7, but for me, two months later, ROCm 5.7 with a 7900 XTX on Ubuntu 22.04 worked.

And coming from a Windows DirectML setup, the speed is heavenly.

14 comments
[–] [email protected] 1 points 10 months ago (1 children)

I did this setup for native use on Fedora 39 Workstation about a week and a half ago. The amount of dicking about with Python versions and venvs to get a compatible Python + PyTorch + ROCm combination together was a nightmare: three setups that the PyTorch site said were "supported" before it finally worked with ROCm 5.7. It was my first experience setting it up natively; I've run a Docker version in the past and will probably stick to that in the future.
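For anyone going through the same juggling, here's a rough pre-flight check of the venv before trying the next combination. It's only a sketch; the note about Python 3.10 is just what A1111 recommends upstream.

```python
# Pre-flight check of the interpreter and venv before installing another PyTorch/ROCm combination.
import sys
import platform

print("python:", platform.python_version())          # A1111 upstream recommends 3.10.x
print("inside venv:", sys.prefix != sys.base_prefix)  # True when running inside the venv

try:
    import torch
    # ROCm wheels report a HIP version; None here means a CPU/CUDA wheel slipped in.
    print("torch:", torch.__version__, "| hip:", getattr(torch.version, "hip", None))
except ImportError:
    print("torch is not installed in this venv yet")
```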

[–] [email protected] 1 points 10 months ago

Just use a Podman container with Distrobox

[–] [email protected] 1 points 10 months ago

Does it work out of the box with vladmandic's SD.Next?

[–] [email protected] 1 points 10 months ago (1 children)

I am looking for a guide based on the latest versions of ROCm, PyTorch, etc. Is one available? Is it running well?

[–] [email protected] 1 points 10 months ago

So far I'm pretty happy with it. I can do 1024 resolution gens at 2.7 it/s. If I try to put more of them in a batch, then I might run out of memory, but compared to Windows and DirectML this is quite a bit faster and has better memory management.

I also tried some animatediff for the first time on this, but only managed to render a 256 resolution gif. Even 512 resolution caused a crash.

I also managed to get ComfyUI set up as a Stable Diffusion backend for Krita, but I only just got it working and don't have the first clue how to use it properly yet. I used this plugin: https://github.com/Acly/krita-ai-diffusion.

[–] [email protected] 1 points 10 months ago (1 children)

It should not be this involved. It is still a cluster of a process. But I hope some folks can get this to work.

[–] [email protected] 1 points 10 months ago (1 children)

It could be way easier with proper Docker images. That's what I tend to do for all these projects.

The ROCm team had the good idea of releasing an Ubuntu image with the whole SDK & runtime pre-installed. But that's simply not enough to conquer the market and gain trust. Ideally, they'd release images bundled with some of the most popular FLOSS ML tools ready to use and the latest stable ROCm version.

[–] [email protected] 1 points 10 months ago (2 children)

Are there reliable Docker images for Oobabooga, A1111, Silly Tavern etc?

[–] [email protected] 1 points 10 months ago

You'll often find repos with Docker scripts, though, some of them with ROCm.

For example.

[–] [email protected] 1 points 10 months ago

Not that I've seen, at least not with ROCm pre-installed.

[–] [email protected] 1 points 10 months ago (1 children)

Maybe he should switch to Fedora, since ROCm is in the official repositories: https://i.imgur.com/wvdXZdl.png

And nope, I didn't have to install AMD drivers; everything runs on Mesa inside a Distrobox Fedora container: https://i.imgur.com/5t3ucSu.png

[–] [email protected] 1 points 10 months ago

Looking good!
How do you get out of the Python dependency hell in Fedora?
When I set up the Python 3.10 env and try to install the requirements, torchsde always complains about a specific numpy, Python, or pip version.
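For reference, this is roughly how I check which pins it's fighting. It's just a sketch, and the package names are a guess at the usual suspects; swap in whatever your error message actually mentions.

```python
# Sketch: print the installed versions of the packages the resolver keeps complaining about.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("pip", "numpy", "torch", "torchsde"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```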

[–] [email protected] 1 points 10 months ago (1 children)

Save yourself the trouble and go team green if you're serious about AI.

[–] [email protected] 1 points 10 months ago

You still need to install their crappy drivers that take your system down with them on a daily basis :>