this post was submitted on 03 Dec 2024
868 points (99.2% liked)
Technology
Our old asses are over here learning Mint and Ubuntu on new machines. That wasn't on our 30s-40s disco card.
It's fun. Everything looks good, then you attach the external monitor to the laptop and it isn't detected. There's a workaround, there's almost always a workaround, but basics that Windows gets right are in pieces in Linux.
The basic expectations from Windows, like monitor detection, aren't necessarily there.
Spite is a hell of a fuel though. Oh, and I still have my Win 10 disc and put a fresh install on another machine.
Mint and Ubuntu are Debian based.
Try something Fedora-based. I've had far fewer issues with it when it comes to hardware.
I've tried quite a few distros on an MSI laptop I got, and it wouldn't recognize dual monitors with the Nvidia drivers on any of them. I tried Fedora, Debian-based ones, KDE spins, etc., and none worked. Had to go back to Windows on that laptop.
Ah, my work laptop had the same issue, but as soon as I saw it didn't work I just switched back to Windows and it worked.
The only laptop I keep permanently Linuxed I use as a VPS lol. Got Nextcloud on it and a few bots.
Ah, yeah, MSI + Nvidia does have issues in general for some reason. At that point basically only Arch or something similarly advanced would fix the problem, and then it does make sense for most users to stick with Windows.
In those cases I'd recommend what others here say, though: get an IoT version of Windows, or use a Rufus-customized install, to avoid all the telemetry etc.
I would but my cares are pretty much gone rn. I don't have enough time to do anything nowadays except work, doomscroll and sleep. Much less to start messing with weird stuff and breaking my $2800 laptop for fun hahah. I think I'll keep it as it came. I hope Bill Gates one day wakes up and looks at a sneak pic of my balls. If I get fired I'll boot up my work laptop and install Arch on it though. Always wanted to try it!
Should clarify: I meant the IoT LTSC version of Windows, since it sounded like you reinstalled Windows anyway. It gets support for much longer too. Plus games and RAM-heavy software run snappier on those cleaner, more minimal versions of Windows. It made a difference even on my $7.5k water-cooled desktop. You'd think a computer with 128GB of DDR5 RAM, a 7900X3D, and a 3090 wouldn't have any slowdown, but base Windows is REALLY bloated - enough that even at those specs you can notice a difference on a Gen 5 M.2 SSD. I still use Windows for some modded games and a specific audio program. Oh, and CAD software.
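Side note: if you're not sure which edition a given install actually is, Python's standard library can tell you. Rough sketch, assuming Python 3.8+ running on the Windows machine in question:

```python
# Rough sketch: print the Windows release, build, and edition using only the
# standard library. Assumes Python 3.8+ running on the Windows install itself.
import platform

release, version, csd, ptype = platform.win32_ver()
print(f"Windows release: {release}, build: {version}")

# Edition string, e.g. 'Core', 'Professional', 'Enterprise', or an LTSC/IoT
# variant depending on the install (returns None on non-Windows systems).
print(f"Edition: {platform.win32_edition()}")
```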
Same with my girlfriend's $2k gaming laptop. Startup and such is way faster now.
Plus no telemetry or ads as a bonus of course.
Will intensely think about it. Last I heard, no BitLocker. Will research this week.
That's switchable graphics for you. Nvidia refuses to spill their secret sauce, so all the effort in supporting it over the past 10 years has been clean-room reverse engineering. The only way it will ever get any good is if Nvidia does it themselves, or if they open it up.
Hmm. Switchable graphics. Do you mean like integrated & discrete GPU? I didn't think that could affect a dual-screen setup. Guess maybe it could? Idk.
Most laptops with discrete Nvidia or AMD GPUs also have onboard/integrated graphics, and only use the discrete GPU when something graphically intensive is happening (playing a game, video editing or encoding/decoding, etc.). They call this "hybrid graphics".
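(If you want to see which GPU is actually doing the rendering at any given moment, here's a rough sketch - it assumes an X session, the proprietary Nvidia driver with PRIME render offload, and that glxinfo is installed:)

```python
# Rough sketch: compare the default OpenGL renderer with the one you get when
# requesting NVIDIA PRIME render offload. Assumes `glxinfo` (mesa-utils) is
# installed and the proprietary NVIDIA driver supports the offload env vars.
import os
import subprocess

def renderer(extra_env=None):
    env = dict(os.environ, **(extra_env or {}))
    out = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True, env=env)
    for line in out.stdout.splitlines():
        if "OpenGL renderer" in line:
            return line.strip()
    return "unknown"

print("default:", renderer())
print("offload:", renderer({
    "__NV_PRIME_RENDER_OFFLOAD": "1",
    "__GLX_VENDOR_LIBRARY_NAME": "nvidia",
}))
```

If hybrid graphics is working, the first line should name the integrated GPU and the second should name the Nvidia card.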
However, the HDMI port on the laptop (as well as the USB-C graphics output) is wired directly to the Nvidia GPU (I'll call this the "dGPU" from now on). This means that when an external monitor is plugged in but nothing graphically intense is being done, the screen is rendered on the integrated GPU (the "iGPU"), then handed to the dGPU to output over the HDMI port.
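(You can actually verify which GPU each port hangs off of from sysfs. Rough sketch, Linux-only, assuming the usual /sys/class/drm layout:)

```python
# Rough sketch: list DRM connectors grouped by GPU, so you can see whether the
# HDMI/DP ports belong to the iGPU or the dGPU. Linux-only; assumes the usual
# /sys/class/drm layout exposed by the kernel.
from pathlib import Path

drm = Path("/sys/class/drm")
cards = sorted(p for p in drm.iterdir()
               if p.name.startswith("card") and p.name[4:].isdigit())
for card in cards:
    drv_link = card / "device" / "driver"
    driver = drv_link.resolve().name if drv_link.exists() else "unknown"  # e.g. i915, amdgpu, nvidia
    print(f"{card.name} (driver: {driver})")
    for conn in sorted(drm.glob(f"{card.name}-*")):
        status = (conn / "status").read_text().strip()  # connected / disconnected
        print(f"  {conn.name}: {status}")
```

If the HDMI connector shows up under the card bound to the nvidia driver, the port is wired to the dGPU as described above.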
The hand-off between the dGPU and iGPU (called "reverse PRIME") is basically voodoo magic. People have tried to get it working in Linux, but there are a bunch of issues with it.
To get dual monitors working properly on my work laptop (Lenovo X1 Extreme Gen 5 with an RTX 3050), I have to go into the BIOS and force it to only use the dGPU (disable the hybrid mode). If I don't do that, the external monitor renders at maybe 5fps? A coworker got it working by instead forcing the Nvidia card to always use a high clock speed for the RAM instead of reducing it to save power, but I haven't tried that.
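(If you want to check whether the card is downclocking its VRAM like that, here's a rough sketch, assuming nvidia-smi is on the PATH:)

```python
# Rough sketch: poll the NVIDIA card's current memory/graphics clocks and
# performance state via nvidia-smi, to see whether it downclocks when idle.
# Assumes the proprietary driver and that `nvidia-smi` is on the PATH.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=clocks.mem,clocks.gr,pstate",
         "--format=csv,noheader"]

for _ in range(5):
    out = subprocess.run(QUERY, capture_output=True, text=True)
    print(out.stdout.strip())  # e.g. "405 MHz, 300 MHz, P8" when idle
    time.sleep(2)
```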
This is a laptop-specific problem, only for laptops with hybrid graphics. I have no problems using three monitors on a desktop PC.
I didn't know basically anything in your entire comment yet you explained it pretty clearly. Thanks for a learning experience 😊
Each GPU has a limited number of display outputs (also called display pipelines or display controllers). As an example, the MacBook Air can only support the built-in display and one external display; that's a hardware limitation of its GPU architecture. On laptops that support multiple displays, some systems can use the integrated GPU and discrete GPU simultaneously to drive different displays.
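On an X11 session you can see the available providers (roughly one per GPU that can drive or source outputs). Quick sketch, assuming xrandr is installed:

```python
# Quick sketch: list the X11 display providers. On a hybrid-graphics laptop you
# typically see two, e.g. "modesetting" for the iGPU and "NVIDIA-G0" for the
# dGPU. Assumes an X11 session with `xrandr` installed (won't work on Wayland).
import subprocess

out = subprocess.run(["xrandr", "--listproviders"], capture_output=True, text=True)
print(out.stdout)

providers = [l for l in out.stdout.splitlines() if l.startswith("Provider ")]
print(f"{len(providers)} provider(s) that can drive or source displays")
```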