this post was submitted on 01 Feb 2024

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

top 22 comments
[–] [email protected] 69 points 6 months ago* (last edited 6 months ago)

I prefer ROCM:
R -
O -
C -
M -

  • Fuck me, it didn't work again
[–] [email protected] 39 points 6 months ago (2 children)

I program 2-3 layers above (Tensorflow) and those words reverberate all the way up.

[–] [email protected] 19 points 6 months ago (1 children)

I program and those words reverberate.

[–] [email protected] 7 points 6 months ago

Recently, I've just given up trying to use CUDA for machine learning. Instead, I've been using (relatively) CPU-intensive activation functions and architectures to make up the difference. It hasn't worked, but I can at least consistently inch forward.

[–] [email protected] 37 points 6 months ago (1 children)

Oh cool, I got the wrong Nvidia driver installed. Guess I'll reinstall Linux 🙃
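
(For reference, a minimal sketch and not from the thread: assuming the CUDA toolkit and nvcc are installed, this prints the driver-side and runtime-side CUDA versions, which is usually the mismatch behind this kind of pain.)

// version_check.cu -- compile with: nvcc version_check.cu -o version_check
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int driverVersion = 0, runtimeVersion = 0;

    // Version supported by the installed kernel driver (libcuda).
    cudaDriverGetVersion(&driverVersion);
    // Version of the CUDA runtime this program was built against.
    cudaRuntimeGetVersion(&runtimeVersion);

    // Versions are encoded as 1000*major + 10*minor.
    printf("Driver supports CUDA %d.%d, runtime is CUDA %d.%d\n",
           driverVersion / 1000, (driverVersion % 100) / 10,
           runtimeVersion / 1000, (runtimeVersion % 100) / 10);

    // If the runtime is newer than the driver, kernels typically refuse to launch.
    if (runtimeVersion > driverVersion) {
        printf("Runtime is newer than the driver -- expect 'CUDA driver version "
               "is insufficient for CUDA runtime version' errors.\n");
    }
    return 0;
}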

[–] [email protected] 4 points 6 months ago

Yum downgrade.

[–] [email protected] 16 points 6 months ago

Some numbnut pushed Nvidia driver code with compilation errors, and now I have to use an old kernel until it's fixed.

[–] [email protected] 13 points 6 months ago

Nvidia: I have altered the deal, pray I do not alter it further.

[–] [email protected] 9 points 6 months ago

Not a hot dog.

[–] [email protected] 8 points 6 months ago (1 children)

Pretty much the exact reason containerized environments were created.

[–] [email protected] 3 points 6 months ago (1 children)

Yep, I usually make Docker environments for CUDA workloads because of these things. Much more reliable.
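
(Side note, a minimal sketch rather than anything from the thread: a tiny device-query program is a handy sanity check to run inside the container, assuming the NVIDIA Container Toolkit is set up so the container can reach the host driver. If it lists your GPU, the environment is wired up; if not, nothing above it will work either.)

// devices.cu -- compile with: nvcc devices.cu -o devices
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        // Typical failure mode when the container can't see the host driver.
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("GPU %d: %s (compute capability %d.%d)\n",
               i, prop.name, prop.major, prop.minor);
    }
    return 0;
}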

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago) (1 children)

You can't run a different Nvidia driver in a container, though.

[–] [email protected] 1 points 6 months ago

When you hit that config need, the next step is a lightweight VM + PCIe passthrough.

[–] [email protected] 7 points 6 months ago (2 children)

I've been working with CUDA for 10 years and I don't feel it's that bad...

[–] [email protected] 11 points 6 months ago

I started working with CUDA at version 3 (so maybe around 2010?) and it was definitely more than rough around the edges at that time. Nah, honestly, it was a nightmare - I discovered bugs and deviations from the documented behavior on a daily basis. That kept up for a few releases, although I'll mention that NVIDIA was/is really motivated to push CUDA for general purpose computing and thus the support was top notch - still was in no way pleasant to work with.

That being said, our previous implementation was using OpenGL and did in fact produce computational results as a byproduct of rendering noise on a lab screen, so there's that.

[–] [email protected] 2 points 6 months ago (1 children)

I don't know wtf CUDA is, but the sentiment is pretty universal: please just fucking work, I want to kill myself.

[–] [email protected] 3 points 6 months ago

CUDA turns a GPU into a very fast CPU for specific operations. It won't replace the CPU, just assist it.

Graphics are just maths: plenty of operations to display beautiful 3D models with beautiful lights, shadows, and shine.

The same maths used to display 3D can be used to calculate other stuff, like ChatGPT's engine.
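
(To make that concrete, here's a minimal sketch, not from the thread, of the kind of thing CUDA does: the CPU copies data over, roughly a million lightweight GPU threads each do one tiny piece of the maths in parallel, and the result comes back to the CPU.)

// vector_add.cu -- compile with: nvcc vector_add.cu -o vector_add
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one element of the arrays.
__global__ void add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;              // one million elements
    const size_t bytes = n * sizeof(float);

    // Plain CPU-side buffers.
    float *a = (float *)malloc(bytes), *b = (float *)malloc(bytes), *c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // GPU-side buffers; the CPU copies data over and the GPU does the maths.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);

    // Launch ~1M threads, 256 per block; each adds one pair of numbers.
    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %.1f (expect 3.0)\n", c[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(a); free(b); free(c);
    return 0;
}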

[–] Justas 7 points 6 months ago

I don't know what any of this means, upvoted everything anyway.

[–] Socsa 2 points 6 months ago

Just be happy your toolchain doesn't require Bazel.

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

Insert JavaScript joke here

Spoiler: Error: joke is undefined