How to convert a decoded JPEG NvBuffer into OpenCV Mat?

I apologize if this has been asked before, but I wasn’t able to find a suitable solution that worked for us. I’m trying to convert an NvBuffer decoded using NvJPEGDecoder into a cv::Mat with 3 channels (RGB). Here’s how I am doing it currently and it isn’t working.

std::array<unsigned char, MAX_IMAGE_WIDTH * MAX_IMAGE_HEIGHT * 3 / 2> encoded_jpeg_data_buffer;
std::array<unsigned char, MAX_IMAGE_WIDTH * MAX_IMAGE_HEIGHT * 3> decoded_jpeg_data_buffer;

int decoded_dma_buffer_fd = -1;

uint32_t width, height, pixel_format;
if (jpeg_decoder_->decodeToFd(decoded_dma_buffer_fd, encoded_jpeg_data_buffer.data(),
        image_size, pixel_format, width, height) != 0)
    return false;

NvBuffer2Raw(decoded_dma_buffer_fd, NvBufferColorFormat_YUV420, width, height,
             decoded_jpeg_data_buffer.data());

cv::Mat rgb_mat = cv::Mat(MAX_IMAGE_HEIGHT, MAX_IMAGE_WIDTH, CV_8UC3,
                          decoded_jpeg_data_buffer.data());
cv::cvtColor(rgb_mat, rgb_mat, cv::COLOR_YUV2RGB);

The encoded_jpeg_data_buffer contains an image captured using a libargus camera, which was then encoded into a JPEG using NvJPEGEncoder.

What am I doing wrong? Thanks in advance.

We assume you are using a JetPack 4 release. Please allocate an NvBuffer in RGBA pitch-linear layout and call NvBufferTransform() to convert the decoded YUV420 to RGBA. You can then map it to a cv::Mat as in this patch:
NVBuffer (FD) to opencv Mat - #6 by DaneLLL

Please give it a try.

Thanks. Are there examples of how to use NvBufferTransform to convert the pixel format? I wasn't able to find which fields to set in the parameters. Thanks.

This is what I have tried thus far, and I’m getting strange images

decoded_abgr_dma_buffer_fd_ = inative_buffer->createNvBuffer(
            iegl_output_stream_->getResolution(), NvBufferColorFormat_ABGR32, NvBufferLayout_Pitch);

Decoded the JPEG like so

int decoded_dma_buffer_fd = -1;

uint32_t width, height, pixel_format;
if (jpeg_decoder_->decodeToFd(decoded_dma_buffer_fd, encoded_jpeg_data_buffer.data(),
        image_size, pixel_format, width, height) != 0)
    return false;

Transformed to the ABGR buffer

void* decoded_jpeg_data_buffer = nullptr;
NvBufferTransformParams transform_params = {0};
NvBufferTransform(decoded_dma_buffer_fd, decoded_abgr_dma_buffer_fd_, &transform_params);

NvBufferMemMap(decoded_abgr_dma_buffer_fd_, 0, NvBufferMem_Read, &decoded_jpeg_data_buffer);
NvBufferMemSyncForCpu(decoded_abgr_dma_buffer_fd_, 0, &decoded_jpeg_data_buffer);

Created a cv::Mat and cvtColored it to RGB

cv::Mat tmp_mat;
cv::Mat rgb_mat;

tmp_mat = cv::Mat(cv::Size(MAX_IMAGE_WIDTH, MAX_IMAGE_HEIGHT), CV_8UC4, decoded_jpeg_data_buffer);
cv::cvtColor(tmp_mat, rgb_mat, cv::COLOR_RGBA2RGB);

And I’m getting something that looks like this. Either byte ordering or channel ordering seems to be wrong.


This may not fit the decoded YUV. Please make sure the width and height are correct in the cv::Mat. You can also save the RGBA data to a file and check whether it is good.

I am a bit confused. What should the values for this cv::Size be? I have them set equal to the dimensions returned by the iegl_output_stream_->getResolution() call above.

Also, how can I write the RGBA decoded buffer to a file? Could you provide an example?

I tried this but it didn’t work. Also tried with png extension which also didn’t work.

auto image_output_file = std::make_unique<std::ofstream>("decoded-rgba-image.jpg", std::ios::binary);
image_output_file->write((char*)decoded_jpeg_data_buffer, image_size);

I figured out the issue. I needed to pass the pitch of the decoded plane as the step when constructing the cv::Mat, like so

NvBufferParams buffer_params;
NvBufferGetParams(decoded_abgr_dma_buffer_fd_, &buffer_params);

cv::Mat tmp_mat = cv::Mat(cv::Size(MAX_IMAGE_WIDTH, MAX_IMAGE_HEIGHT), CV_8UC4, decoded_jpeg_data_buffer, buffer_params.pitch[0]);

Hopefully this is useful to others in the future! Thanks @DaneLLL for your help.
