this post was submitted on 14 May 2025
1025 points (98.9% liked)

memes

[–] [email protected] 104 points 6 days ago (1 children)
[–] [email protected] 37 points 6 days ago (1 children)

The trick is to not play games with a rootkit

[–] [email protected] 16 points 6 days ago (1 children)

I only still have a Windows install (that hasn't been booted for months at this point) just for my VR headset. I just can't get good VR performance out of Linux with it.

Just wanted to mention there are a few other reasons besides rootkits to still use Windows, unfortunately.

[–] [email protected] 0 points 6 days ago* (last edited 6 days ago) (1 children)

I was able to play HL Alyx with an Index, on... I think it was an AMD 5950X CPU and an AMD 6900 XT GPU... on Pop!_OS... back in... 2022.

Not saying you are any kind of 'wrong', just saying it is possible now, and was even possible back in '22, to get very good VR performance out of a Linux system... full res, maxed out settings, 90hz/fps good.

You could replicate my old build now for roughly... 65% of what I paid for the same parts back then. The 5950X's current sale price is less than half what it once was, and the 6900 XT's is a bit more than half, though you have to look a bit harder... the GPU market is just generally insane right now.

EDIT: For a long while, generally speaking... AMD stuff has been better supported and more performant on Linux... because many of the drivers are much, much more open source.

Also... AMD CPUs and GPUs synergize and perform better when paired with each other than pairing an Nvidia GPU with an AMD CPU, or an Intel CPU with an AMD GPU.

If Intel can pump up its GPU game, it may be able to achieve a similar result, but so far their GPUs... while honestly pretty good for the performance/price range they're in... just don't come close to the high end of GPU performance yet.

[–] [email protected] 2 points 4 days ago* (last edited 4 days ago) (1 children)

Good for you. I tried only a few months ago, couldn't get it to run well.

It's not the hardware's capabilities (though it could still be Nvidia-related), seeing as the same hardware runs VR just fine on Windows.

Unfortunately I am still stuck with an Nvidia GPU until I can justify spending that kind of money on an AMD one. But even so, the Nvidia drivers have been working really well for me ever since around when explicit sync got merged (or something like that, I don't remember exactly). It's just VR that doesn't perform well.

Also… AMD CPUs and GPUs synergize and perform better when paired with each other than pairing an Nvidia GPU with an AMD CPU, or an Intel CPU with an AMD GPU.

Do you have a source for that? Because that sounds like a stretch, and if it were true, I'd imagine it would result in an antitrust lawsuit.

[–] [email protected] 1 points 4 days ago* (last edited 4 days ago) (1 children)

Sadly, I no longer have that computer, nor the Index.

Yay burglary!

But disregarding that... Nvidia, at least in the past, in the timeframe I built that PC and before... just tended to not work as well on Linux, because AMD has always open sourced... almost all? literally all?... of their driver code, and has done so for far longer and to a greater degree than Nvidia has.

That means it can much more rapidly be integrated into working well with the Linux kernel and various GPU driver libraries... whereas with Nvidia, a lot of their drivers were closed source for quite a long time, and only worked at their max efficiency/performance on proprietary OSes... until someone working on Linux driver compatibility reverse engineered some new feature or optimization in a new driver/hardware design paradigm.

However, it does seem that more recently, as in basically 6 months after I built that PC... Nvidia has been better supporting Linux with more open source code, though the open-source Nvidia drivers still don't seem to have backwards compatibility with many older or more niche Nvidia GPUs.

https://www.howtogeek.com/805004/nvidia-releases-open-source-linux-gpu-drivers-with-a-catch/

https://www.guru3d.com/story/nvidia-announces-transition-to-open-source-gpu-kernel-modules/

I've been dabbling with Linux and gaming on Linux for... a little over a decade, and for the vast majority of that time, before Proton was a thing, you basically only had WINE or VMs, and you had to install Nvidia's proprietary, closed-source kernel module and drivers to get an Nvidia GPU to do basically anything 3D on Linux... and uh, whew, nothing like fucking up a kernel migration to teach you how little you actually understand about computers, lol.

Either way, I am genuinely glad Nvidia + Linux is generally working well for you and totally understand the expense of any new GPU not being justifiable right now... yay tariffs and supply shortages and third party mfgs hiking up specs and prices, yay scalpers, etc etc...

... though yes, it is odd that VR isn't working as well. That could possibly be from... some Linux Nvidia driver feature that hasn't yet been implemented or optimized... or it could possibly be the drivers for your VR headset itself? Or maybe the game you're trying to run in Linux just needs a patch, or Proton needs to catch up to it?

Could be a bunch of things.

...

As to the AMD CPU + GPU synergy... I'm not sure how it could really qualify for an antitrust lawsuit... but either way:

Smart Access Memory has evolved since this article, but it explains a bit about CPU GPU synergy:

https://graphicsreport.com/smart-access-memory/

And then here's a bunch of AMD's own brief descriptions of many other 'Smart' features. Many of these are geared toward AMD laptops... though some of them also help with making even desktops a bit more power-efficient (as in wattage draw), as well as helping to better manage heat, and thus also fan noise, and thus also OC capabilities.

https://www.amd.com/en/gaming/technologies/smart-technologies.html

All these things together admittedly add up to fairly minor improvements vs., say, an AMD CPU + Nvidia GPU system, or an Intel CPU + Nvidia GPU system of approximately equivalent cost... but they can become significant factors if you are trying to squeeze the absolute max performance out of a strict budget limit, or a smaller form factor, or maximize something like 'fps in a given game at given graphical settings per watt draw' or 'per dollar spent'.

[–] [email protected] 2 points 3 days ago (1 children)

You're absolutely right, Nvidia used to be a nightmare on Linux. Not just 'unstable' bad, oftentimes just unusably bad for me. Even the closed drivers are a lot better nowadays, though, but I think older Nvidia support is still not great.

I've also been using Linux for over a decade at this point, but only switched to it as my main gaming machine fairly recently. I always had issues with stupid Nvidia bullshit before, until I finally found Bazzite, which worked great even in the period when Nvidia drivers were still a bit unstable, especially on Wayland. Honestly, I'm not even sure why I still got an Nvidia GPU last time.

I’m not sure how it could really qualify for an antitrust lawsuit

To me it sounds like they'd somehow unlock more performance with their own GPU than is available for other brands, so they're giving their own hardware an unfair advantage.

But thinking about it a bit more now, I realise that could probably be covered by trademarking or patenting the technology they use to do that, or something.

[–] [email protected] 1 points 2 days ago* (last edited 2 days ago)

Hah, I too am using Bazzite, at least on a Steam Deck.

And yeah, if... having proprietary hardware tech with proprietary software that utilizes it was somehow a legal argument for 'you are a monopoly'...

Then Nvidia would have a very obvious monopoly on the real time ray tracing tech, and should thus be broken up.

Then MSFT should be broken up for owning DirectX, and not open sourcing it. Proton and Vulkan wouldn't need to exist.

[–] [email protected] 53 points 6 days ago (1 children)

Me: Hey bank AI, how much money is in my account?
Bank AI: (long pause) It looks like there are five banks in your area.

[–] [email protected] 18 points 6 days ago

“I found a PayDay Loan office near you.”

[–] [email protected] 24 points 6 days ago* (last edited 6 days ago)

Fuck, I just tried to log into office.com to email a coworker that my stupid Windows updates were going to make me late to a meeting, and apparently I must use the Copilot app instead of the web browser.

Fuck no, guess they will find out when I get there.

[–] [email protected] 23 points 6 days ago (1 children)

Whenever a chatbot is added to a product I use, the first and only thing I ask is how to disable it

[–] 6nk06 12 points 6 days ago (2 children)

Unless you want to disable Gemini, which tells you to go to the settings to find a hallucinated feature. Thanks, Google, but you're drunk!

[–] [email protected] 8 points 6 days ago (1 children)

When they forced that shit this week, the first and only thing I asked Gemini for was directions on how to disable Gemini, and it provided incorrect names for the settings to disable itself. Close enough to find the right ones, but just a great example that it doesn't even have the right answers about its maker's own products.

AI being forced into everything is such a shitshow.

[–] [email protected] 3 points 6 days ago

I saw a comment from a store owner having to defend himself against a potential customer who was convinced he sold something AI said he did. She would not believe him when he said he did not sell such a thing. Kept quoting the AI to him lmao

[–] [email protected] 2 points 6 days ago

I don't understand why Gemini is such a disaster. DeepMind Gemma works better and that's a 27B model. It's like there are two separate companies inside Google fucking off and doing their own thing (which is probably true)

[–] [email protected] 14 points 6 days ago (1 children)

You missed the other key detail: “and raised the price regardless of whether we use AI”.

I finally migrated my work email from Google workspace specifically because they jacked the subscription price and justified it due to Gemini, a tool I never used.

[–] [email protected] 2 points 5 days ago

In the EU you can still have google one without gemini?

Anyways I switched to Infomaniak just to get rid of google tbh :))

[–] [email protected] 14 points 6 days ago (3 children)

I don't mind AI now that I've vastly lowered my expectations of what it can do, and am aware that "AI" isn't actually real. An LLM might be useful in some situations where you can reasonably expect it to give a decent answer and where your task isn't particularly important.

Problem is, most of it is forced on you and is not privacy friendly.

[–] [email protected] 10 points 6 days ago (2 children)

could be useful if it wasn't for capitalism smh

[–] [email protected] 6 points 6 days ago

There are uses for AI in science and other fields: using it for pattern matching to help find areas to focus on, then thoroughly doing the real work where accuracy matters, is where current AI really excels.

https://news.berkeley.edu/2022/05/24/ai-reveals-unsuspected-math-underlying-search-for-exoplanets/

Artificial intelligence (AI) algorithms trained on real astronomical observations now outperform astronomers in sifting through massive amounts of data to find new exploding stars, identify new types of galaxies and detect the mergers of massive stars, accelerating the rate of new discovery in the world’s oldest science.

It didn't just spit out answers they treated as correct, it did stuff they looked into and found ways to improve their methods. That is the real benefit of AI.
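That "pattern matching narrows the search, humans do the real vetting" workflow can be sketched with a toy example. This is purely illustrative, not any real astronomy pipeline: a hand-rolled nearest-centroid classifier on made-up 2-D features flags which observations are worth a human's time.

```python
# Toy "candidate flagging" sketch: a nearest-centroid classifier scores
# observations, and only the flagged ones go to a human for real vetting.
# Purely illustrative -- real pipelines use far richer features and models.

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def flag_candidates(known_boring, known_interesting, observations):
    """Return indices of observations closer to the 'interesting' centroid."""
    boring_c = centroid(known_boring)
    interesting_c = centroid(known_interesting)
    return [i for i, obs in enumerate(observations)
            if dist2(obs, interesting_c) < dist2(obs, boring_c)]

# Fake 2-D features (e.g. brightness change, rise time) for labelled examples.
boring = [[0.1, 0.2], [0.2, 0.1], [0.0, 0.3]]
interesting = [[2.0, 1.8], [1.9, 2.1], [2.2, 2.0]]
new_obs = [[0.15, 0.25], [2.05, 1.95], [1.8, 2.2]]

print(flag_candidates(boring, interesting, new_obs))  # -> [1, 2]
```

The key point is the division of labour: the model only prioritizes, and every flagged candidate still gets checked for real, so a wrong flag costs a little time rather than a wrong discovery.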

[–] [email protected] 4 points 6 days ago

After dabbling with AI for years, I think it should be called out for what it is: machine learning. We're not in the ballpark of intelligence, and barely close to mimicking (artificial) intelligence.

Though ML has a lot going for it, my most successful results are with voice synthesis. Almost flawless, and pretty amazing considering the yield.

[–] [email protected] 2 points 6 days ago

I use it as an advanced rubber duck for coding.

I know the answer is wrong, but it gets my brain going on finding the right one.

Like: "This is a ridiculous approach to make this. It would be much easier to just..."

Getting the wrong answer sometimes speeds up the process, like some kind of dialectics.

[–] [email protected] 8 points 6 days ago
[–] [email protected] 6 points 6 days ago
[–] [email protected] 6 points 6 days ago
[–] [email protected] 4 points 6 days ago

I'm so glad I use Linux. No AI bullshit!

[–] [email protected] 2 points 6 days ago

Every time there's an update on VS Code

[–] [email protected] 1 points 6 days ago

I've been having a heated discussion with Meta AI, on and off, for a while now.