The video https://www.youtube.com/watch?v=NzhUzeNUBuM goes more in depth, but it's a very true statement to say that "some displays decode with the inverse OETF and some don't". This issue has been plaguing displays for decades now.
You are 100% right in saying "the reference display is gamma 2.2". However, we can only wish that this is what displays actually do; Color.org themselves got this wrong (https://www.color.org/srgb.pdf) and it leads people astray.
I don't actually believe this to be the case; if it were, people who use custom ICCs would get extremely wonky results, which doesn't typically happen. On the other hand, it is very true that doing it the way they do gives the "least offensive" results. Though IMO the best solution would simply be to default to pure 2.2 and allow users to override the transfer. The color protocol allows for explicit piecewise sRGB anyway, so doing this should fit right into a fleshed-out color-managed setup.
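To make the piecewise-vs-pure-2.2 difference concrete, here is a minimal Python sketch (my own, not from the video or any spec text) comparing the two decoding curves with the standard sRGB constants; the divergence is largest near black, which is exactly where the "does the display decode with the inverse OETF or not" question becomes visible:

```python
# Minimal sketch: piecewise sRGB decoding vs. a pure gamma 2.2 power curve.
# Constants are the standard sRGB ones (IEC 61966-2-1).

def srgb_piecewise_eotf(v: float) -> float:
    """Decode a non-linear sRGB value (0..1) to linear light with the piecewise curve."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def pure_gamma_eotf(v: float, gamma: float = 2.2) -> float:
    """Decode with a pure power-law curve, as a gamma 2.2 reference display would."""
    return v ** gamma

if __name__ == "__main__":
    # The two curves agree closely in the mid-tones but diverge near black.
    for code in (8, 16, 32, 64, 128, 255):
        v = code / 255.0
        print(f"code {code:3d}: piecewise={srgb_piecewise_eotf(v):.6f}  "
              f"gamma2.2={pure_gamma_eotf(v):.6f}")
```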
I think I am a bit confused about the laptop analogy then; could you elaborate on it?
How monitors typically handle this is beyond me, I will admit, but I have seen some really bonkers ways of handling it, so I can't really comment on whether or not this holds true one way or another. Just so I am not misinterpreting you, are you saying that "if you feed it 300 nits of PQ, the monitor will not allow it to go above 300 nits"? If so, that is not what happens on my TV unless I am in "creator/PC" mode. In other modes it will allow the image to go brighter or dimmer.
My current monitor is only a 380-nit display, so I can't really verify that (nor do I have the hardware to at the moment).
Ah, I see, I was a bit confused about what you meant then. My apologies.
Keep in mind this was based on my misinterpretation above of what you meant.
With libjxl, it doesn't really default to the "SDR white == 203" reference that comes from the common "reference white == SDR white" choice (not sure how to word it). Anyway, libjxl defaults to "SDR white = 255" or something along those lines, I can't quite remember. The reasoning for this was simple: that was what they were tuning butteraugli on.
I think this is an issue of terminology: reference white is something the colourist often decides. Assuming that HDR graphics white == SDR white actually causes more problems than it solves. I would say it is a "good default", but not a safe value to assume, and something the user may often need to override. Personally, even when just watching movies in MPV, this is something I very often need to play with to get a good experience, and that is not even counting professionally done work.
This actually isn't really true. It is indeed the case that users won't know what transfer function the content is using, but they absolutely do see a difference other than "HDR gets brighter than SDR", and that is "it's smoother in the dark areas", because that is equally true.
Users have a lot of different assumptions about HDR, but they all follow some sort of trend: "it makes the content look smoother across a greater range of luminance". If I were to give a "technical definition that follows general user expectations", it would be something along the lines of "a transfer that provides perceptually smooth steps of luminance at a given bit depth up to at least 1000 nits in a given reference environment", which is bad for sure, but at the very least it more closely aligns with general expectations of HDR, given its use in marketing.
(I really hate the terms HDR and SDR, btw; I wish they would die in a fire for any technical discussion, and I really wish we could dissuade people from using them.)
They wouldn't, because applying ICC profiles is opt-in for each application. Games and at least many video players don't apply ICC profiles, so they do not see negative side effects of it being handled wrong (unless they calibrate the VCGT to follow the piece-wise TF).
With Windows Advanced Color, of course, that may change.
What analogy?
Yes, that's exactly what happens. TVs do random nonsense to make the image look "better", and one of those image optimizations is to boost brightness. In this case it's far from always nonsense of course (on my TV it was though, it made the normal desktop waaay too bright).
Almost certainly just trying to copy what monitors do.
Heh, when it came to merging the Wayland protocol and we needed implementations for all the features, I was searching for a video or image standard that did exactly that. The protocol has a feature where you can specify a non-default reference luminance to handle these cases.
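As a rough illustration of why that reference luminance knob matters (my own sketch in Python, not the protocol text or any compositor's actual code; 100 and 203 nits are just the commonly quoted reference-white choices, not anything mandated here):

```python
# Sketch (my own illustration, not protocol or compositor code) of why a
# configurable reference luminance matters when mixing SDR and HDR content.

def sdr_to_nits(linear_value: float, reference_white_nits: float) -> float:
    """Place relative SDR linear light (0..1) on an absolute scale by choosing
    what luminance SDR/graphics white should land on."""
    return linear_value * reference_white_nits

def rescale_reference(nits: float,
                      content_reference_nits: float,
                      output_reference_nits: float) -> float:
    """Remap content graded against one reference white onto an output using a
    different reference white, by scaling with the ratio of the two."""
    return nits * (output_reference_nits / content_reference_nits)

if __name__ == "__main__":
    # The same SDR white ends up at very different absolute brightness
    # depending on the assumed reference:
    print(sdr_to_nits(1.0, 100.0))   # 100.0 nits
    print(sdr_to_nits(1.0, 203.0))   # 203.0 nits
    # Content mastered against a 100-nit reference, shown on an output whose
    # reference luminance is 203 nits, gets scaled up rather than looking dim:
    print(rescale_reference(100.0, 100.0, 203.0))  # 203.0 nits
```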
That is technically speaking true, but no one actually sees that. People do often get confused about bit depth vs. HDR, but that's more to do with marketing conflating the two than with people actually noticing a lack of banding in HDR content. With the terrible bitrates videos often use nowadays, you can even get banding in HDR videos too :/
When you play an HDR and an SDR video on a desktop OS side by side, the only normally visible differences are that the HDR video sometimes gets a lot brighter than the SDR one, and that (with a color-managed video player...) the colors may be more intense.
It's true that many applications normal users use won't, but on the flip side, creative types may actually be really familiar with applications that do. You also have users who consume content in applications that do apply them without doing creative things, like MPV and I think madVR too.
You talked about how laptops have a brightness configuration that desktops don't.
It is true that it can be random for sure, but this is to be expected. While it is useful to keep in mind that PQ is an absolute metric, it is very much intended for displays to pick and choose how they treat the light. The mastered content is a reference, which is why we always talk in "reference nits" when we refer to grading. This behavior is very much to be expected, and the user should be able to compensate for it via their own controls. I think that handling PQ as an absolute value is useful on one hand, but fundamentally flawed on the other. Indeed, this is one of the shortcomings of modern operating systems for sure.
Personally I believe the way to handle this is after all the other processing is done: PQ should be treated as absolute when doing anything like colorspace conversion, and once your "reference" looks correct, then you can compensate for display issues. Though perhaps, if you have a user-supplied chart of the display's luminance response, other behavior should be considered.
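For reference on the "PQ is absolute" part, here is a small Python sketch (my own) of the ST 2084 (PQ) EOTF using its published constants; a code value maps straight to an absolute luminance in nits, which is what makes treating it as absolute during colorspace conversion so convenient:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: a non-linear code value in [0, 1]
# maps directly to an absolute luminance in cd/m² (nits), up to 10000 nits.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(e: float) -> float:
    """Decode a PQ-encoded value (0..1) to absolute luminance in nits."""
    p = e ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

if __name__ == "__main__":
    # A full-range signal reaches 10000 nits; a value of ~0.58 lands around the
    # 203-nit figure often used as graphics/SDR white in HDR.
    for e in (0.0, 0.25, 0.5, 0.58, 0.75, 1.0):
        print(f"PQ {e:.2f} -> {pq_eotf(e):8.2f} nits")
```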
I'm not sure we can separate SDR/HDR from bit depth; most colorspaces' transfers do specify a specific bit depth. But even then, say you have a 3k-nit video, which is not actually uncommon thanks to Apple: a transfer like sRGB/G2.2 or bt.1886/G2.4 will still be inadequate to display it appropriately. This of course also applies if you were to do inverse tonemapping from "SDR" to "HDR" without appropriate debanding.
I don't think one should try to separate bit depth and a transfer's intended reference peak luminance from the terms SDR and HDR, because they do play an important role in the display of "SDR" and "HDR" content. Then again, I am an ardent believer in the death of the terms HDR and SDR, especially when it comes to anything even remotely technical. To me, SDR and HDR as they relate to video are useful for nothing but marketing. Colorspaces are absolute, and every video has one (even if it is unknown, which is its own can of crap to open).
It's true that higher bit depth can alleviate banding in lower-luminance regions, but even at a given bit depth of 8 bits, HLG will still have less banding in lower-luminance regions than sRGB will. This can indeed be noticeable on high-luminance displays that can really pump up the brightness, or especially in viewing environments with no ambient light aside from the bounce light of the TV and small-lumen stuff like phones and night lights, which is an extremely common viewing environment, even if a bad one.
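As a rough, relative-only illustration (my own sketch using the published sRGB and BT.2100 HLG curves, ignoring the HLG OOTF/system gamma and the very different peak luminances of real displays), you can count how many 8-bit codes each transfer devotes to the darkest fraction of its signal range:

```python
# Rough, relative-only comparison of how many 8-bit code values each transfer
# devotes to the darkest part of its signal range. This ignores the different
# peak luminances of real HDR/SDR displays and the HLG OOTF (system gamma),
# so it only illustrates relative code allocation, not absolute banding.

import math

def srgb_inv(v: float) -> float:
    """Piecewise sRGB EOTF: non-linear value (0..1) -> relative linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# BT.2100 HLG inverse OETF constants
A = 0.17883277
B = 1 - 4 * A                    # 0.28466892
C = 0.5 - A * math.log(4 * A)    # 0.55991073

def hlg_inv_oetf(v: float) -> float:
    """HLG inverse OETF: non-linear value (0..1) -> relative scene linear light."""
    return (v * v) / 3 if v <= 0.5 else (math.exp((v - C) / A) + B) / 12

if __name__ == "__main__":
    for frac in (0.001, 0.01, 0.05):
        srgb_codes = sum(1 for k in range(256) if srgb_inv(k / 255) < frac)
        hlg_codes = sum(1 for k in range(256) if hlg_inv_oetf(k / 255) < frac)
        print(f"codes below {frac:.1%} of peak: sRGB={srgb_codes:3d}  HLG={hlg_codes:3d}")
```

HLG spends noticeably more of its code range on the darkest signal levels; how that translates into visible banding of course still depends on the display's peak luminance and the viewing environment.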
Also, I very much agree about the bitrate crap. AV1 and VVC do help... too bad YT killed off its good AV1 encodes...