this post was submitted on 19 Apr 2024
341 points (98.6% liked)

Linux

top 28 comments
[–] [email protected] 114 points 4 months ago (2 children)

That's not to say the two men don't think AI will be helpful in the future. Indeed, Torvalds noted one good side effect already: "NVIDIA has gotten better at talking to Linux kernel developers and working with Linux memory management," because of its need for Linux to run AI's large language models (LLMs) efficiently.

[–] [email protected] 87 points 4 months ago* (last edited 4 months ago) (1 children)

so THAT'S why we're getting better NVIDIA support.

I knew it just couldn't be from the goodness of their newly converted hearts.

[–] [email protected] 3 points 4 months ago (1 children)

You didn't know GPUs are used for training/running DNNs?

[–] [email protected] 21 points 4 months ago

Sure, but am I surprised that this is the only reason they improved the desktop drivers for us? No.

Am I disappointed? Yeah, a little.

[–] [email protected] 60 points 4 months ago* (last edited 4 months ago)

Hahaha. I love it. Fuck closed source hardware gatekeepers.

Nice to see them groveling for performance.

Kneel!!

C'mon, I can joke. Such a cathartic paragraph to read. Intractable cunts.

[–] [email protected] 82 points 4 months ago* (last edited 4 months ago) (1 children)

Did not know the thing about purposefully adding rogue tabs to kconfig files to catch poorly written parsers. That's fucking hilarious and I'd love to have the kind of clout to get away with something like that rather than having to constantly work around other people's mistakes.
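
For illustration, here's a toy Python sketch of the failure mode a rogue tab exposes; the "parsers" and the Kconfig-style lines are made up, not the actual kernel tooling:

```python
# A naive Kconfig-ish scanner that assumes exactly one space after the keyword,
# versus a tolerant one that splits on any whitespace (spaces or tabs).
NAIVE_PREFIX = "config "

def parse_naive(lines):
    """Silently misses 'config\tBAR' because it only matches a literal space."""
    return [line[len(NAIVE_PREFIX):].strip()
            for line in lines if line.startswith(NAIVE_PREFIX)]

def parse_tolerant(lines):
    """Splits on any run of whitespace, so a rogue tab is handled too."""
    symbols = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "config":
            symbols.append(parts[1])
    return symbols

sample = ["config FOO", "config\tBAR", '\tbool "example"']
print(parse_naive(sample))     # ['FOO']         -- BAR silently dropped
print(parse_tolerant(sample))  # ['FOO', 'BAR']
```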

[–] [email protected] 83 points 4 months ago (5 children)

I write a lot of scripts that engineers need to run. I used to really try to make things 'fail soft' so that even if one piece failed the rest of the script would keep running and let you know which components failed and what action you needed to take to fix the problem.

Eventually I had so many issues with people assuming that any error which didn't halt the script was safe to ignore, and crucial manual steps were being missed. I had to start making the scripts 'fail hard' and stop completely when a step failed, because that was the only way to get people to reliably perform the required manual step.

Trying to predict and account for other people's behavior is really tricky, particularly when a high level of precision is required.
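
For what it's worth, a minimal Python sketch of the fail-hard pattern described above; the step names are hypothetical stand-ins for the real engineering tasks:

```python
import sys

# Each step returns True on success. The point is the control flow:
# stop dead at the first failure instead of logging it and carrying on.
def check_prerequisites():
    return True

def apply_migration():
    return True

def restart_service():
    return False  # simulate the step that needs a manual fix

STEPS = [check_prerequisites, apply_migration, restart_service]

def run_fail_hard(steps):
    for step in steps:
        if not step():
            print(f"FAILED at '{step.__name__}'. Fix this and re-run; "
                  "nothing after it was attempted.", file=sys.stderr)
            sys.exit(1)
        print(f"ok: {step.__name__}")

run_fail_hard(STEPS)
```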

[–] [email protected] 29 points 4 months ago

It's a developer milestone :) when you learn that a resilient application is about recovering only from the situations you perfectly understand, and failing fast on everything else. Repeat that 1000 times and you have something.

[–] [email protected] 9 points 4 months ago* (last edited 4 months ago)

Soft failures add complexity and ambiguity to your system, since they create many paths and states you have to consider. It's generally a good idea to keep exception handling simple by failing fast and hard.

Here is a nice paper that highlights some exception-handling issues in complex systems:

https://www.usenix.org/system/files/conference/osdi14/osdi14-paper-yuan.pdf
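
To make the contrast concrete, a minimal Python sketch of the over-broad handler pattern that paper calls out versus just failing fast; load_config and its path argument are hypothetical:

```python
import json

def load_config_swallowing(path):
    # Anti-pattern: the handler hides the real error and returns a "default"
    # that every caller now has to know about and check for.
    try:
        with open(path) as f:
            return json.load(f)
    except Exception:
        return {}

def load_config_fail_fast(path):
    # Fail fast and hard: a missing or malformed config blows up right here,
    # where the stack trace actually explains what went wrong.
    with open(path) as f:
        return json.load(f)
```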

[–] [email protected] 4 points 4 months ago

Always fail soft in underlying code and hard in user space IMHO

[–] [email protected] 4 points 4 months ago* (last edited 4 months ago) (1 children)

This is why I enjoy programming libraries only I will ever use. "Do I need to account for user ignorance and run a bunch of early exit conditions at the beginning of this function to avoid throwing an exception? Naww, fuck it, I know what I'm doing."
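
As a rough illustration (the function is made up), the difference between the guarded "public" style and the trusting "only I call this" style:

```python
def mean_public(values):
    # Early-exit guards give callers clear errors for bad input.
    if values is None:
        raise ValueError("values must not be None")
    if len(values) == 0:
        raise ValueError("values must not be empty")
    return sum(values) / len(values)

def mean_private(values):
    # Trust the caller; bad input surfaces as whatever exception it causes
    # (TypeError, ZeroDivisionError, ...).
    return sum(values) / len(values)
```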

[–] [email protected] 3 points 4 months ago

It's the quickest way to prove to yourself that you know what you're doing... Most of the time, anyway...

[–] [email protected] 2 points 4 months ago

Sounds familiar, haha.

[–] [email protected] 45 points 4 months ago

Hohndel agreed but added that the industry needs to support these smaller projects -- and not only with money. "Companies need to engage with these projects. Have your company adopt a couple of such projects and just participate. Read the code, review the patches, and provide moral support to the maintainers. It's as simple as that."

Really glad he said this. I keep seeing posts about how all these big companies could solve the problem by just throwing money at small projects, and while that is better than nothing, it would help way more to have their own developers reviewing patches and fixing issues.

[–] [email protected] 19 points 4 months ago (3 children)

Is there a link to this talk (or interview, or whatever this is) in video format, or at least as text without all those «SEE ALSO» self-ads?

[–] [email protected] 5 points 4 months ago* (last edited 4 months ago) (2 children)

Maybe it's this one (I'm in a rush here 🙂)? https://youtube.com/watch?v=VHHT6W-N0ak Someone in the comments writes that the full interview is on the Linux Foundation's channel: https://piped.video/channel/UCfX55Sx5hEFjoC3cNs6mCUQ

[–] [email protected] 3 points 4 months ago (1 children)

Sadly no, that one is three months old. Hopefully they'll publish it on the Linux Foundation yt channel or something.

[–] [email protected] 1 points 4 months ago

Here is an alternative Piped link(s):

Linux Foundation yt channel

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] [email protected] 2 points 4 months ago

Here is an alternative Piped link(s):

https://piped.video/watch?v=VHHT6W-N0ak

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] [email protected] 4 points 4 months ago

Not yet; you should run an ad blocker.

[–] [email protected] 2 points 3 months ago (1 children)

If you're still interested, it seems they've uploaded the keynote; see the link in my comment:
https://lemmy.deedium.nl/comment/115389

[–] [email protected] 2 points 3 months ago
[–] [email protected] 8 points 4 months ago

Wait a minute... BS stands for "Beautiful Science" now?

[–] [email protected] 6 points 4 months ago (1 children)

In addition, hardware developers reinvent old ways of doing things and only learn by making all the same mistakes that have been made before. It's sad, but true. 

This same criticism is validly launched at software devs all the time lol.

One thing I've anecdotally seen and heard is hardware guys indicating that something is rock solid and solved because it's old, so building on top of it isn't a problem. Obviously we have to build on the old to get to the new, but if we just skip auditing hardware due to age, we end up deploying vulnerable hardware globally. Spectre and Meltdown are an interesting example where I've heard from at least one distinguished professor that "everyone" believed branch prediction design/algorithms were essentially done. Was it adequately assessed from a security POV? Clearly not, but was it assessed from a security POV in general? I have no idea, but it would be nice as a tech enthusiast and software guy to see the other side of the fence take these things seriously in a more public way, in particular when it comes to assessing old hardware for new attack vectors.

[–] [email protected] 4 points 4 months ago

Spectre and Meltdown are an interesting example where I’ve heard from at least one distinguished professor that “everyone” believed branch prediction design/algorithms were essentially done.

Interesting to hear this.

Was it adequately assessed from a security POV? Clearly not, but was it assessed from a security POV in general? I have no idea, but it would be nice as a tech enthusiast and software guy to see the other side of the fence take these things seriously in a more public way, in particular when it comes to assessing old hardware for new attack vectors.

Right.

[–] [email protected] 1 points 3 months ago (2 children)
[–] [email protected] 2 points 3 months ago

Cool, thanks for sharing!

[–] [email protected] 2 points 3 months ago

Here is an alternative Piped link(s):

Keynote: Linus Torvalds, Creator of Linux & Git, in Conversation with Dirk Hohndel

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.