Plasma 6.0 HDR success report

So far so good with a 4070 Super using the latest 550.54.14 driver on kernel 6.7! Here’s what I found:

  • HDR works. It engages the HDR mode of the monitor and enables high bit depth output. It isn’t yet fully realized, but the basics are there and will undoubtedly only get better as the plumbing for color management continues to be hooked up.
  • HDR color won't be correct on nearly all monitors, as it assumes (I think) REC2020 primaries, which will likely be way off from your monitor's actual coordinates. There isn't a way to change these or manually enter them, so we're stuck with the (likely) wrong primaries. Fortunately, it appears the developers are already at work on this issue and will grab the self-reported primaries from the monitor. Hopefully this solution gets prioritized and shipped soon!
  • While we wait for the above to get fixed, there are a couple of options to try to get things good enough for now: there’s a slider in the menu for SDR Color Intensity and a sort of hidden kscreen-doctor command to keep the display in HDR mode but force an sRGB gamut. Between the two, I found that the kscreen-doctor command worked best for my monitor. Specifically, the command is “kscreen-doctor output.1.wcg.disable” (or “kscreen-doctor output.1.wcg.enable” to switch back to wide gamut).
  • High bit depth works, tested visually with a VK_FORMAT_R16G16B16A16_SFLOAT surface.
  • Developers can experiment with HDR today by using greater-than-1.0f values on a VK_FORMAT_R16G16B16A16_SFLOAT surface. 1.0f appears to correspond to the SDR Brightness level as specified in the Display Configuration page or via the kscreen-doctor output.1.sdr-brightness.N command. To be clear, it appears that if you had the desktop SDR brightness set to 100 nits, a pixel with a luminance value of 1.0f would be around 100 nits, and 4.0f should scale to roughly 400 nits, provided KDE got accurate information from the monitor. A simple slider to let the user find their clipping point, while sticking with the sRGB gamut, might be sufficient to get things up and running until more explicit gamma / gamut surfaces are available (a small code sketch follows this list).
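
If you want to try this yourself, here's a minimal sketch in C of finding such a surface format, assuming an existing VkPhysicalDevice and VkSurfaceKHR (the helper name pick_hdr_float_format is mine, not from any real codebase):

```c
#include <stdbool.h>
#include <stdlib.h>
#include <vulkan/vulkan.h>

/* Look for the 16-bit float surface format described above. With it, a
 * fragment shader output of 1.0f should land near the configured desktop
 * SDR brightness (e.g. ~100 nits) and 4.0f near 4x that, per the post. */
static bool pick_hdr_float_format(VkPhysicalDevice phys, VkSurfaceKHR surface,
                                  VkSurfaceFormatKHR *out)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfaceFormatsKHR(phys, surface, &count, NULL);
    if (count == 0)
        return false;

    VkSurfaceFormatKHR *formats = malloc(count * sizeof(*formats));
    vkGetPhysicalDeviceSurfaceFormatsKHR(phys, surface, &count, formats);

    bool found = false;
    for (uint32_t i = 0; i < count; ++i) {
        if (formats[i].format == VK_FORMAT_R16G16B16A16_SFLOAT) {
            *out = formats[i]; /* use format + colorSpace for the swapchain */
            found = true;
            break;
        }
    }
    free(formats);
    return found;
}
```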

To recap, Plasma 6’s signature HDR features (such as they are in base 6.0) appear to be working as expected on the latest 550.54.14 driver. This is a huge step forward and provides tangible benefits for users right now.

Congrats to all the developers who have made this possible, and thanks to NVIDIA for having day 1 support.

An issue with SDR/sRGB content in HDR mode:

  • It appears that SDR content is being converted using a 2.2 gamma function instead of the proper piecewise sRGB formula. This means that PC applications that are correctly following the standard will be rendered improperly and suffer from crushed dark regions.

  • Here's a great image showing what happens when using mismatched functions: https://artoriuz.github.io/blog/images/gamma_correction/gamma_comparison.png The middle should be a smooth gradient all the way through the darkest patches (correct conversion), while in the leftmost image they'll be slightly too dark and in the rightmost slightly too bright (both incorrect conversions, one in each direction). If the image is a little hard to see on your PC monitor, try an iPhone if handy (OLED is perfect for this purpose). Plasma 6.0 appears to be converting in a way that matches the leftmost case, where the dark patches get crushed. This implies it is doing something like 2.2 -> linear -> PQ rather than sRGB -> linear -> PQ; the two decode curves are sketched in code after this list.

  • For now, the only ways around this are to hope the application outputs non-standard 2.2-encoded content, or to use Plasma 6 in SDR mode for normal desktop use. Hopefully the developers reconsider using 2.2 directly for everything and instead decode sRGB content with the piecewise function, or at least provide some sort of option (global, per-application, whatever).
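
To make the mismatch concrete, here's a quick sketch in C of the two decode curves in question; the piecewise constants are straight from IEC 61966-2-1, and this is purely illustrative, not Plasma's actual code:

```c
#include <math.h>

/* Piecewise sRGB decode per IEC 61966-2-1, i.e. the transfer function that
 * standards-following applications encode against. */
static double srgb_to_linear(double v)
{
    return (v <= 0.04045) ? v / 12.92
                          : pow((v + 0.055) / 1.055, 2.4);
}

/* Pure power-law decode, which is what the compositor appears to apply. */
static double gamma22_to_linear(double v)
{
    return pow(v, 2.2);
}
```

For a dark code value like 0.05, srgb_to_linear gives about 0.0039 in linear light while gamma22_to_linear gives about 0.0014, so content encoded with the piecewise function but decoded with the pure power comes out nearly three times darker in the shadows, which is exactly the crush described above.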

Pretty much all of the so-called "sRGB content" has been made for gamma 2.2 displays, so in practice that's the right choice.
Actually, there isn't an sRGB decode curve, only an encode curve, which some might think should simply be inverted for decoding, but that wasn't the intention of the creators of the standard. It was meant to include a correction between the viewing conditions of content creators and end users. In practice, though, it usually wasn't used that way: content was both made on and meant to be watched on gamma 2.2 displays. This is a complicated and old issue, but I think the Linux guys made the right decision here, and I hope Windows will offer a way to manually switch between the inverse sRGB encode and gamma 2.2, because 99% of content looks overly bright in the shadow region with the inverse sRGB encode.

Thanks for chiming in - I had considered updating my previous post to make it clearer that the sRGB/2.2 issue wasn't really a mistake on their part, but rather a choice in how to handle the mess that is the sRGB "standard". Given that GPUs have used IEC 61966-2-1 for linearizing sRGB textures, as have many other implementations for various purposes, I think the best way to put it is that all the uncertainty surrounding this legacy standard perfectly illustrates why building a better color-managed future is so important.

Hi there, I was wondering: were you able to get Gamescope with HDR to actually enable HDR in the supported games, per the report here: English - Planet KDE? I haven't had success.
The only way I can enable HDR is through the display settings, and neither of the games stated to work at that link works for me.

I’ve tried VK_hdr_layer with quake2rtx and it prints to the console that it couldn’t find a valid surface. Haven’t tried gamescope.

Not sure why that example doesn’t work, desktop applications seem to work fine though. If you launch an application with “ENABLE_HDR_WSI=1 app” it’ll expose VK_COLOR_SPACE_HDR_ST2084_EXT, VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT, and VK_COLOR_SPACE_BT709_LINEAR_EXT. I’ve tried out ST2084 and it works as expected, so if you’re developing HDR applications you can get going using that today. It will be really exciting when this just works without the shim.
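
One note for anyone going the ST2084 route: picking VK_COLOR_SPACE_HDR_ST2084_EXT means your application is responsible for PQ-encoding its output. Here's a self-contained sketch in C of the ST 2084 inverse EOTF, with constants from the published standard (not tied to the layer in any way):

```c
#include <math.h>

/* SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0,1]
 * code value. The constants below are the published ones. */
static double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;         /* 0.1593017578125 */
    const double m2 = 2523.0 / 4096.0 * 128.0;  /* 78.84375 */
    const double c1 = 3424.0 / 4096.0;          /* 0.8359375 */
    const double c2 = 2413.0 / 4096.0 * 32.0;   /* 18.8515625 */
    const double c3 = 2392.0 / 4096.0 * 32.0;   /* 18.6875 */

    double y  = fmin(fmax(nits / 10000.0, 0.0), 1.0);
    double ym = pow(y, m1);
    return pow((c1 + c2 * ym) / (1.0 + c3 * ym), m2);
}
```

As a sanity check, pq_encode(100.0) comes out around 0.51, the expected code value for 100-nit white.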

Gamescope HDR works in some games, but for me it requires the git version; otherwise it just freezes after a couple of seconds.
Successfully got HDR working in:
Ghostwire
Returnal
Cyberpunk (sometimes fails to start though)
Elden Ring
Guardians of the Galaxy
Spider-Man Remastered
Horizon Zero Dawn
Need for Speed: Heat

Failed (freezes when enabling or launching game):
Death Stranding
Medium
Forza Horizon

I was also able to get HDR working in Quake 2 RTX through Gamescope.

Are you guys able to get those examples working with Flatpak?

I can't say about Flatpak; unfortunately I don't use it at all. Everything is from the Arch repos or the AUR for me. With that said, I also use wlroots with the nvidia patch from the AUR, and the gamescope nvidia patch git package from the AUR as well.

Not sure this is the right thread to ask, but is anyone else getting a black screen on KDE Plasma 6 when using version 6.9-rc2/6.9-rc3 (did not try 6.9-rc1) of the Linux kernel?

Normally, on kernels 6.8.x, I get to the SDDM login screen in non-HDR mode, then I log in and the screen flashes black and then comes back in HDR mode.

On 6.9 I get to the login screen just fine, but once I log in the screen flashes black and never comes back.

I am using the NVIDIA 550.67 driver.