this post was submitted on 01 Apr 2025
25 points (96.3% liked)
Linux
It's true that many applications normal users run won't, but on the flip side, creative types may actually be really familiar with applications that do. You also have users who consume HDR without doing anything creative, through players like MPV and, I think, madVR too.
You talked about how laptops have a brightness configuration that desktops don't.
It's true that it can seem random for sure, but this is to be expected. While it's useful to keep in mind that PQ is an absolute metric, displays are very much intended to pick and choose how they treat the light. The mastered content is a reference, which is why we always talk in "reference nits" when we refer to grading. This behavior is very much to be expected, and the user should be able to compensate for it via their own controls. I think that handling PQ as an absolute value is useful on one hand but fundamentally flawed on the other. Indeed, this is one of the shortcomings of modern operating systems for sure.
Personally, I believe the way to handle this is after all the other processing is done: PQ should be treated as absolute when doing anything like colorspace conversion, and once your "reference" looks correct, you can compensate for display issues. Though perhaps if you have a user-supplied chart of the display's luminance response, other behavior should be considered.
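To make the "PQ is absolute" point concrete, here's a minimal sketch of the SMPTE ST 2084 (PQ) EOTF, which maps a normalized code value straight to absolute nits regardless of the display (constants are from the spec; the function name is my own):

```python
def pq_eotf(e: float) -> float:
    """SMPTE ST 2084 EOTF: normalized PQ signal (0..1) -> absolute luminance in nits."""
    m1 = 2610 / 16384       # 0.1593017578125
    m2 = 2523 / 4096 * 128  # 78.84375
    c1 = 3424 / 4096        # 0.8359375
    c2 = 2413 / 4096 * 32   # 18.8515625
    c3 = 2392 / 4096 * 32   # 18.6875
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000.0 * y      # PQ is anchored to an absolute 0..10000-nit range

# Full-scale signal is always the absolute 10,000-nit ceiling:
print(pq_eotf(1.0))  # 10000.0
print(pq_eotf(0.0))  # 0.0
```

The absoluteness is the whole point of the design: a given code value means one specific luminance, which is exactly why displays that can't reach it have to decide how to roll it off, and why user-side compensation belongs after the reference is rendered correctly.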
I'm not sure we can divorce SDR/HDR from bitdepth; most colorspaces' transfers do specify a specific bitdepth. But even then, say you have a 3,000-nit video, which is not actually uncommon thanks to Apple: a transfer like sRGB/G2.2 or bt.1886/G2.4 will still be inadequate to display it appropriately. This of course includes doing inverse tonemapping from "SDR" to "HDR" without appropriate debanding.
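As a rough illustration of why stretching a gamma-type transfer over a 3,000-nit range falls apart (the helper names and the 3,000-nit/10-bit setup are my own assumptions, and this ignores proper Barten-threshold modelling entirely): compare the relative luminance jump between adjacent 10-bit codes near 1 nit for a pure gamma 2.2 curve scaled to a 3,000-nit peak versus PQ.

```python
def gamma_eotf(v, peak=3000.0, g=2.2):
    # naive power-law transfer stretched to a hypothetical 3000-nit peak
    return peak * v ** g

def pq_eotf(e):
    # SMPTE ST 2084 EOTF: normalized signal -> absolute nits
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    ep = e ** (1 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

def rel_step(eotf, code, levels=1024):
    # relative luminance jump between two adjacent code values
    lo, hi = eotf(code / (levels - 1)), eotf((code + 1) / (levels - 1))
    return (hi - lo) / lo

# find the 10-bit code landing closest to ~1 nit under each transfer
gamma_code = min(range(1, 1023), key=lambda c: abs(gamma_eotf(c / 1023) - 1.0))
pq_code = min(range(1, 1023), key=lambda c: abs(pq_eotf(c / 1023) - 1.0))

print(rel_step(gamma_eotf, gamma_code))  # roughly an 8% jump per code step
print(rel_step(pq_eotf, pq_code))        # several times finer for PQ
```

Per-code steps of several percent in the shadows are well into banding territory, which is also why inverse tonemapping "SDR" sources into an HDR container without debanding just promotes the coarse quantization along with the pixels.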
I don't think one should try to separate bitdepth and a transfer's intended reference peak luminance from the terms SDR and HDR, because they do play an important role in the display of "SDR" and "HDR" content. Then again, I'm an ardent believer in the death of the terms HDR and SDR, especially when it comes to anything even remotely technical. To me, SDR and HDR as they relate to video are useful for nothing but marketing. Colorspaces are absolute, and every video has one (even if it's unknown, which is its own can of crap to open).
It's true that higher bitdepth can alleviate banding in lower-luminance regions, but even at a given bitdepth of 8 bits, HLG will still have less banding in the shadows than sRGB will. This can indeed be noticeable on high-luminance displays that can really pump up the brightness, or especially in viewing environments with no ambient light aside from the bounce light of the TV and small-lumen stuff like phones and night lights, which is an extremely common viewing environment, even if a bad one.
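A quick back-of-the-envelope for the near-black claim (helper names are mine, and this deliberately ignores the HLG OOTF/system gamma, which would scale both sides but not reorder the comparison): look at the normalized linear-light jump from code 0 to code 1 at 8 bits under the BT.2100 HLG inverse OETF versus the sRGB EOTF.

```python
import math

def hlg_inverse_oetf(e):
    # BT.2100 HLG inverse OETF: normalized signal -> normalized scene light
    a = 0.17883277
    b = 1 - 4 * a
    c = 0.5 - a * math.log(4 * a)
    if e <= 0.5:
        return e * e / 3.0
    return (math.exp((e - c) / a) + b) / 12.0

def srgb_eotf(v):
    # IEC 61966-2-1 sRGB EOTF: normalized signal -> normalized light
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

step = 1 / 255  # first code above black at 8 bits
print(hlg_inverse_oetf(step))  # ~5.1e-06
print(srgb_eotf(step))         # ~3.0e-04, dozens of times larger
```

Mapped to the same display peak, HLG's first step above black is far smaller than sRGB's, i.e. finer shadow quantization at the same bitdepth — which is exactly where banding shows up in a dark room on a bright display.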
Also, very much agree about the bitrate crap. AV1 and VVC do help... too bad YT killed off its good AV1 encodes...