Image formats when JPEG compressing with the Jetson Multimedia APIs

Hello,

I am currently trying to JPEG-compress live camera streams from a V4L2 source. I’d initially tried to use GStreamer pipelines for this, using the NVIDIA JPEG encoding elements. Unfortunately, due to a hardware bottleneck we’ve found on the particular AGX machine we’ve purchased, we’re fairly certain we can’t feed the frames directly into the encoder without limiting ourselves to 6 cameras (vs. the 7 cameras we need).

I’ve written a wrapper class around the NvJpegEncoder class, and have an image in packed RGB8 (i.e., an array where every 3 bytes form one pixel’s R, G, and B values, one byte each). Unfortunately, when I run the code I immediately get the following errors:

[ERROR] (src/jetson/NvBuffer.cpp:428) Unsupported pixel format 859981650
[ERROR] (src/jetson/NvBuffer.cpp:226) Buffer 0, Plane 4 already allocated
[ERROR] (src/jetson/NvJpegEncoder.cpp:177) <jpenenc> Buffer not in proper format

Relevant source code here:

JpegEncoder::JpegEncoder(uint32_t quality) : quality_(quality), jpegenc_(NvJPEGEncoder::createJPEGEncoder("jpenenc")) {
  if (!jpegenc_) {
    throw std::runtime_error("Could not create Jpeg Encoder");
  }
}

std::size_t JpegEncoder::encode(const Image& image, rust::Slice<uint8_t> buffer) const {
  std::size_t out_buf_size = buffer.length();

  // 859981650 is the FOURCC 'RGB3', i.e. V4L2_PIX_FMT_RGB24
  NvBuffer nvbuffer(V4L2_PIX_FMT_RGB24, image.width, image.height, 0);

  nvbuffer.allocateMemory();
  std::memcpy(
      nvbuffer.planes[0].data, image.data.data(), std::min((size_t)nvbuffer.planes[0].length, image.data.size()));

  auto data = buffer.data();

  if (jpegenc_->encodeFromBuffer(nvbuffer, JCS_YCbCr, &data, out_buf_size, quality_) < 0) {
    throw std::runtime_error("Error while encoding from buffer");
  }

  return out_buf_size;
}

My question is: am I simply unable to use the Jetson Multimedia APIs for JPEG encoding if I have the data in this format? Additionally, I know the sensor outputs V4L2_PIX_FMT_SGRBG12 when I fetch the data from V4L2. I thought about sending that in instead, but I don’t see it listed as a supported format in NvBuffer either. What are my options here?

Hi,
The hardware JPEG encoder supports NvBufferColorFormat_YUV420 and NvBufferColorFormat_NV12. If your source frame data is in a different format, please convert it to one of these formats first.
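
If it helps, here is a minimal CPU-side sketch of what that conversion could look like in your encode() (this reuses your Image and rust::Slice types, is untested, and a real multi-camera pipeline would do the conversion on the VIC or ISP rather than the CPU):

std::size_t JpegEncoder::encode(const Image& image, rust::Slice<uint8_t> buffer) const {
  std::size_t out_buf_size = buffer.length();

  // V4L2_PIX_FMT_YUV420M is one of the layouts the encoder accepts:
  // plane 0 = Y at full resolution, planes 1/2 = U/V at half resolution.
  NvBuffer nvbuffer(V4L2_PIX_FMT_YUV420M, image.width, image.height, 0);
  nvbuffer.allocateMemory();

  unsigned char* y = nvbuffer.planes[0].data;
  unsigned char* u = nvbuffer.planes[1].data;
  unsigned char* v = nvbuffer.planes[2].data;

  for (uint32_t row = 0; row < image.height; ++row) {
    for (uint32_t col = 0; col < image.width; ++col) {
      const uint8_t* px = image.data.data() + 3 * (row * image.width + col);
      const int r = px[0], g = px[1], b = px[2];
      // Integer BT.601 RGB -> limited-range YCbCr
      y[row * nvbuffer.planes[0].fmt.stride + col] =
          (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
      if ((row & 1) == 0 && (col & 1) == 0) {  // one chroma sample per 2x2 block
        u[(row / 2) * nvbuffer.planes[1].fmt.stride + col / 2] =
            (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
        v[(row / 2) * nvbuffer.planes[2].fmt.stride + col / 2] =
            (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
      }
    }
  }
  for (uint32_t i = 0; i < nvbuffer.n_planes; ++i) {
    nvbuffer.planes[i].bytesused =
        nvbuffer.planes[i].fmt.stride * nvbuffer.planes[i].fmt.height;
  }

  auto data = buffer.data();
  if (jpegenc_->encodeFromBuffer(nvbuffer, JCS_YCbCr, &data, out_buf_size, quality_) < 0) {
    throw std::runtime_error("Error while encoding from buffer");
  }
  return out_buf_size;
}

A per-pixel CPU loop like this will not scale to 7 cameras; it is only meant to show the plane layout the encoder expects.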

Thanks @DaneLLL for your response. Given that my sensor only seems to output mosaiced images in the V4L2_PIX_FMT_SGRBG12 format, what would be the best way to convert this to NV12 or YUV420 in a way that scales to a large number of cameras at a time? Which part of the Jetson Multimedia APIs would help in this regard?

Hi,
For a Bayer sensor, we would suggest using the ISP engine to do the debayering and get frame data in YUV420, so that the buffer can be encoded into JPEG directly. Please check
Sensor Driver Programming Guide

Once the driver is ready, you can run this sample for JPEG encoding:

/usr/src/jetson_multimedia_api/samples/09_camera_jpeg_capture
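
Condensed, the relevant flow in that sample looks roughly like this (identifiers follow the sample sources, but treat it as an untested sketch and see the sample itself for error handling and the capture loop):

#include <vector>
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <EGLStream/NV/ImageNativeBuffer.h>
#include "NvJpegEncoder.h"

using namespace Argus;

void capture_one_jpeg() {
  // Open the camera through Argus so the ISP does the debayering.
  UniqueObj<CameraProvider> provider(CameraProvider::create());
  ICameraProvider* iProvider = interface_cast<ICameraProvider>(provider);

  std::vector<CameraDevice*> devices;
  iProvider->getCameraDevices(&devices);

  UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices[0]));
  ICaptureSession* iSession = interface_cast<ICaptureSession>(session);

  // Request YUV 4:2:0 output, which is what the JPEG encoder accepts.
  UniqueObj<OutputStreamSettings> settings(
      iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
  IEGLOutputStreamSettings* iSettings =
      interface_cast<IEGLOutputStreamSettings>(settings);
  iSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
  iSettings->setResolution(Size2D<uint32_t>(1920, 1080));

  UniqueObj<OutputStream> stream(iSession->createOutputStream(settings.get()));
  UniqueObj<EGLStream::FrameConsumer> consumer(
      EGLStream::FrameConsumer::create(stream.get()));
  EGLStream::IFrameConsumer* iConsumer =
      interface_cast<EGLStream::IFrameConsumer>(consumer);

  UniqueObj<Request> request(iSession->createRequest());
  interface_cast<IRequest>(request)->enableOutputStream(stream.get());
  iSession->capture(request.get());

  UniqueObj<EGLStream::Frame> frame(iConsumer->acquireFrame());
  EGLStream::IFrame* iFrame = interface_cast<EGLStream::IFrame>(frame);

  // Export the ISP output as a YUV420 dmabuf and hand the fd straight to
  // the hardware JPEG encoder -- no CPU copy of the pixels.
  EGLStream::NV::IImageNativeBuffer* iNativeBuffer =
      interface_cast<EGLStream::NV::IImageNativeBuffer>(iFrame->getImage());
  int fd = iNativeBuffer->createNvBuffer(iSettings->getResolution(),
                                         NvBufferColorFormat_YUV420,
                                         NvBufferLayout_BlockLinear);

  NvJPEGEncoder* jpegenc = NvJPEGEncoder::createJPEGEncoder("jpegenc");
  unsigned long out_buf_size = 1920 * 1080 * 3 / 2;
  unsigned char* out_buf = new unsigned char[out_buf_size];
  jpegenc->encodeFromFd(fd, JCS_YCbCr, &out_buf, out_buf_size, 75);
  // ... write out_buf (out_buf_size bytes) to disk, then clean up.
}

The key point is that Argus hands you a YUV420 dmabuf fd, which encodeFromFd() can consume directly.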

Thanks @DaneLLL. I’ll take a look at that. When you say use the ISP engine to do the debayering, do you mean I’d have to go through NVArgus? Or is what the 07_video_convert sample does equivalent? I’m a bit wary of using the ISP through nvargus, as we’re experiencing bandwidth issues when we go beyond 6 cameras, or even fewer if we enable HDR (the previous thread covering the relevant issue is here).

Hi,
Yes, I mean Argus. If you experience bandwidth issues, please try running the VI/ISP/NVCSI engines at max clocks:
Jetson/l4t/Camera BringUp - eLinux.org

And also set VIC to its max clock:
Nvvideoconvert issue, nvvideoconvert in DS4 is better than Ds5? - #3 by DaneLLL

Then execute sudo nvpmodel -m 0 and sudo jetson_clocks.
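
Consolidated, the clock steps from those links look like this on an AGX Xavier (the debugfs paths are from the eLinux page and can vary between L4T releases, so please verify them on your system; the VIC steps are in the linked post):

sudo su
# Lock VI/ISP/NVCSI at their maximum rates
echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate

# Max power model and pin the remaining clocks
nvpmodel -m 0
jetson_clocks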

See if you can get target performance after executing the steps.
