We are facing an issue similar to the one described in "Full range RGB strange behavior on Nvidia Jetson AGX Xavier Developer Kit".
When we try to output full-range RGB data over the DisplayPort output of the Jetson, we notice that the output is not bit-perfect.
We also output a gradient color bar and notice that some values of the gradient are incorrect at the receiving end. We verify this by capturing the DP output with a frame grabber.
However, in our case /sys/kernel/debug/tegradc.common/tegra_win.1/degamma/degamma is already set to 0, and disabling the CMU by setting /sys/class/graphics/fb1/device/cmu_enable to 0 leads to an even more distorted image.
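For reference, this is roughly how we toggle those nodes (a minimal sketch, assuming root access on the Jetson; the node paths are the ones from our setup and may differ between boards and L4T releases):

```python
#!/usr/bin/env python3
# Minimal sketch: write the degamma and CMU nodes mentioned above and read
# them back to confirm the value took effect. Must be run as root.

DEGAMMA = "/sys/kernel/debug/tegradc.common/tegra_win.1/degamma/degamma"
CMU_ENABLE = "/sys/class/graphics/fb1/device/cmu_enable"

def write_node(path: str, value: str) -> None:
    """Write a value to a sysfs/debugfs node and print it back for verification."""
    with open(path, "w") as f:
        f.write(value)
    with open(path) as f:
        print(f"{path} = {f.read().strip()}")

if __name__ == "__main__":
    write_node(DEGAMMA, "0")      # bypass the degamma LUT for window 1
    write_node(CMU_ENABLE, "0")   # disable the color management unit (CMU)
```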
How can all processing blocks in the display controller be bypassed, so that frames in the framebuffer are sent to DisplayPort without any modification?
The previous issue is on HDMI, not DP. As I remember, full range vs. limited range also matters. Have you checked whether your case uses limited range or full range?
Yes, the previous issue is on HDMI, but most of the output processing pipeline (including pre-comp and post-comp) is the same.
Our case is also on full-range.
It’s on a custom board. I will also try to reproduce on the devkit.
For example, when the framebuffer contains the value (R,G,B) = (230,0,0), on the DP input I read back the value (231,0,0).
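For context, this is roughly how we compare the captured frame against the gradient we write into the framebuffer (a minimal sketch; the file names, resolution and raw RGB24 layout are placeholders for illustration only):

```python
#!/usr/bin/env python3
# Sketch of the bit-exactness check: compare the reference gradient written to
# the framebuffer against the frame recorded by the grabber on the DP input.

import numpy as np

WIDTH, HEIGHT = 1920, 1080  # assumed mode

def load_rgb24(path: str) -> np.ndarray:
    """Load a raw RGB24 dump as an (H, W, 3) uint8 array."""
    data = np.fromfile(path, dtype=np.uint8)
    return data.reshape(HEIGHT, WIDTH, 3)

reference = load_rgb24("reference_gradient.raw")  # what we put in the framebuffer
captured = load_rgb24("grabber_capture.raw")      # what the frame grabber recorded

diff = reference.astype(np.int16) - captured.astype(np.int16)
bad = np.argwhere(np.any(diff != 0, axis=-1))

print(f"{len(bad)} of {WIDTH * HEIGHT} pixels differ")
for y, x in bad[:10]:  # show the first mismatches, e.g. (230,0,0) -> (231,0,0)
    expected = tuple(int(v) for v in reference[y, x])
    got = tuple(int(v) for v in captured[y, x])
    print(f"({x},{y}): expected {expected}, got {got}")
```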
Hi,
Your observation is correct that there is a minor deviation. It looks negligible and should not be noticeable to the naked eye. We will check with our team and see if we can make further enhancements in the future.
Thank you.
It is indeed a minor deviation, but we use some exact colors for encoding purposes and therefore need to send the pixel data untouched to the DP output.