I am using L4T 32.4.4 on a Jetson Nano based product. I’m adding support for transcoding JPEG images using NvJPEGDecoder::decodeToFd, and I notice that the buffers coming out of the resulting H.264/H.265 encoding are much darker than my source JPEG data. I copy/scale the data out of the fd into a separate buffer via NvBufferTransform, before passing the buffer to the encoder(s).
My existing software already handles full-range vs. limited-range buffers by calling setExtendedColorFormat on the H.264/H.265 encoders, and I create the buffers I feed to the encoders with the appropriate _ER buffer formats. However, the buffer format of the frame output via decodeToFd is a non-extended-range format (e.g. NvBufferColorFormat_YUV420), even though JPEG is theoretically always full range. When I feed this image to either the H.264 or H.265 encoder, I have two bad options. If I feed it an extended-range buffer, my NvBufferTransform call appears to perform the darkening. If I feed it a limited-range buffer, the colors are darkened at decode time in my client software, because the H.264/H.265 stream signals limited range when the underlying pixels are in fact full range.
Do you have any idea how I can force NvJPEGDecoder to output me a buffer fd with a full range format so I can use this buffer correctly in further stages of my video pipeline?
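For reference, the stage in question looks roughly like the sketch below, modeled on the jpeg_decode sample from the L4T Multimedia API (NvJpegDecoder.h / nvbuf_utils.h). This is not runnable off-device, error handling is trimmed, and the exact NvBufferCreateParams fields may differ by L4T release:

```cpp
// Sketch only: decode a JPEG to a dmabuf fd, then blit it into a buffer
// created with the full-range (_ER) format. Based on the jpeg_decode sample.
#include "NvJpegDecoder.h"
#include "nvbuf_utils.h"

int decode_jpeg_to_er_buffer(NvJPEGDecoder *dec,
                             unsigned char *jpeg_data, unsigned long jpeg_size,
                             int *out_fd /* caller destroys */) {
    int dec_fd = -1;
    uint32_t pixfmt = 0, width = 0, height = 0;
    if (dec->decodeToFd(dec_fd, jpeg_data, jpeg_size, pixfmt, width, height) < 0)
        return -1;
    // decodeToFd hands back a buffer tagged NvBufferColorFormat_YUV420 even
    // though the JPEG pixel data is full range -- the mismatch described above.

    NvBufferCreateParams cp = {0};
    cp.width = width;
    cp.height = height;
    cp.payloadType = NvBufferPayload_SurfArray;
    cp.layout = NvBufferLayout_Pitch;
    cp.colorFormat = NvBufferColorFormat_YUV420_ER;  // full-range tag
    if (NvBufferCreateEx(out_fd, &cp) < 0)
        return -1;

    NvBufferTransformParams tp = {0};
    tp.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    tp.transform_filter = NvBufferTransform_Filter_Smart;
    // This is where the unwanted full->limited "darkening" happens, because
    // the source fd is tagged limited range.
    return NvBufferTransform(dec_fd, *out_fd, &tp);
}
```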
Hi,
We support limited-range YUV in the encoder and decoder. Please check if you can convert your full-range data to limited range and then send it to the JPEG encoder. After decoding, the data is limited range and you can convert it back to full range. You can do the conversion through the hardware converter (by calling NvBufferTransform()).
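For reference, the remapping in both directions is the standard BT.601 "studio swing" mapping: luma compresses 0..255 into 16..235 and chroma into 16..240, and the inverse expands a decoded limited-range frame back to full range. An illustrative per-sample CPU version of what the converter does:

```cpp
#include <cstdint>

// Full range -> limited range ("studio swing") per BT.601.
inline uint8_t full_to_limited_luma(uint8_t y) {
    return static_cast<uint8_t>(16 + (y * 219 + 127) / 255);  // 0..255 -> 16..235
}
inline uint8_t full_to_limited_chroma(uint8_t c) {
    return static_cast<uint8_t>(16 + (c * 224 + 127) / 255);  // 0..255 -> 16..240
}

// Inverse: limited range -> full range, with clamping of out-of-range samples.
inline uint8_t limited_to_full_luma(uint8_t y) {
    int v = ((static_cast<int>(y) - 16) * 255 + 109) / 219;
    return static_cast<uint8_t>(v < 0 ? 0 : (v > 255 ? 255 : v));
}
inline uint8_t limited_to_full_chroma(uint8_t c) {
    int v = ((static_cast<int>(c) - 16) * 255 + 112) / 224;
    return static_cast<uint8_t>(v < 0 ? 0 : (v > 255 ? 255 : v));
}
```

Applying one direction to data already in that range is exactly what produces the darkening described in the question.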
Thank you for your time on this and for responding. Indeed the H.264/H.265 encoders/decoders do support limited/full range input and output and everything there is working well. The flow in my device is to take input RTP streams, decode them, re-encode them and then output them again via RTP. For combinations of H.264 and H.265 this works fine.
In this case, I just take an encoded JPEG frame in a buffer and send it to NvJPEGDecoder via decodeToFd. I can’t convert the full-range source data to limited range here, because the source data is JPEG-encoded. And the buffer I receive from the JPEG decoder is marked as YUV420 rather than YUV420_ER, even though the source pixel data is full range. So I can’t do any meaningful conversion on that output with NvBufferTransform either, because the fd’s format says limited range while the pixel data is full range.
Thinking further about it, I could probably manually copy the data out of that buffer into another buffer that was created with the full range format, but this seems excessive and unnecessary and my device has many other functions to perform at the same time.
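Concretely, the manual copy I have in mind is just a pitch-aware row copy per plane. On the device the plane pointers and pitches would come from NvBufferMemMap()/NvBufferGetParams() (or NvBuffer2Raw/Raw2NvBuffer); this stand-in shows only the copy itself:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Copy one plane between two buffers whose row strides (pitches) differ.
// Only width_bytes of each row carries pixel data; the rest is padding.
void copy_plane(const uint8_t* src, size_t src_pitch,
                uint8_t* dst, size_t dst_pitch,
                size_t width_bytes, size_t height) {
    for (size_t row = 0; row < height; ++row)
        std::memcpy(dst + row * dst_pitch, src + row * src_pitch, width_bytes);
}
```

Doing this for the Y, U and V planes would move the pixels into a buffer created with the _ER format, at the cost of a CPU copy per frame.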
If there is a way to get a correctly marked buffer out of the decoder, that would completely solve this issue. I’m even happy to make code changes or library changes. Whatever is necessary.
I did try to manually copy the buffer and used NvBufferTransform to do the range conversion from limited to extended. However, as you note in another thread (NvBuffer Transform fails with YUV420_ER format. Ratio of downsampling 2), the NvBufferTransform function does not actually support performing the range conversion and so this workaround fails as well.
When you say I should use NvBufferTransform to do the conversion, which formats do you suggest I should use? NV12 and NV12_ER seem to show the same error as YUV420/YUV420_ER:
Hi,
You are right: there is an issue in the hardware converter. It fails to convert YUV420 ER to YUV420 limited range. For this conversion, please use a software converter.
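A software converter for this direction can be a simple per-plane pass with 256-entry lookup tables, for example (illustrative only; on the device the planes would first be mapped to CPU memory, e.g. with NvBufferMemMap()):

```cpp
#include <cstddef>
#include <cstdint>

// CPU fallback for the failing hardware conversion: remap a full-range
// ("ER") I420 frame to limited range via BT.601 studio-swing LUTs.
struct RangeLUT {
    uint8_t luma[256];
    uint8_t chroma[256];
    RangeLUT() {
        for (int v = 0; v < 256; ++v) {
            luma[v]   = static_cast<uint8_t>(16 + (v * 219 + 127) / 255);
            chroma[v] = static_cast<uint8_t>(16 + (v * 224 + 127) / 255);
        }
    }
};

void full_to_limited_i420(uint8_t* y, size_t y_size,
                          uint8_t* u, uint8_t* v, size_t c_size) {
    static const RangeLUT lut;
    for (size_t i = 0; i < y_size; ++i) y[i] = lut.luma[y[i]];
    for (size_t i = 0; i < c_size; ++i) {
        u[i] = lut.chroma[u[i]];
        v[i] = lut.chroma[v[i]];
    }
}
```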
We have checked further and confirmed that NvJpegDecoder does not expect YUV420 ER as input, so please check if it is possible to switch your source to YUV420 limited range. If the source can only be YUV420 ER, please try
I responded earlier that the solution worked, but unfortunately I spoke too soon. I do need to transform the full-range data with NvBufferTransform as part of this, and NvBufferTransform actually seems to change the pixel format of the source buffer I allocate to a non-ER format when I give it a full-range buffer. So it is nominally “working”, but the range is still incorrect and the image is darkened.
Do you know if the bug you reference in the hardware converter is fixable by a driver change or something I can implement locally?
Hi,
Do you mean it fails to do YUV420 ER to YUV420 ER conversion through NvBufferTransform()? Not sure which conversion is required in the use-case. Please advise.
Yes, if I do a YUV420 ER to YUV420 ER conversion, from DMA buffer to DMA buffer, both allocated with NvBufferCreateEx, NvBufferTransform fails the conversion the first time and prints the same error message I posted above. Then if I call NvBufferTransform with the exact same buffer (exact same fd), it succeeds from that point on. I checked the source buffer, and after the first failure of NvBufferTransform its pixel format has been changed to YUV420 instead of YUV420 ER. It was quite a surprise.
Thank you for the response and the tests for me to run. I will try those, but wanted to report in the meantime that on L4T 28.3.2, I can allocate a temp YUV420_ER buffer, use decodeToFd to get the decoded frame marked with limited range, use Raw2NvBuffer to copy the decoded fd planes into the YUV420_ER-marked buffer, and get a correctly formatted image out of it. I don’t know why the same method wouldn’t work on L4T 32.4.4. I also see on 28.3.2 that using NvBufferTransform on YUV420_ER buffers, or using it to convert a non-ER buffer to an ER buffer, works fine.
So it looks like I will try the video convert sample and let you know if it fails on L4T 32.4.4. This test will be more difficult because I don’t normally have a way to run that tool on my device. I will package everything up and let you know what I find, probably next week.