What is the nvvidconv output?

• Jetson
• Deepstream 5.0
• JetPack 4.4
• TensorRT 7.0

I am currently using this command for GStreamer:

appsrc name=src is-live=true block=true caps=video/x-raw,width=640,height=480,format=RGB ! videoconvert ! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! videoconvert ! appsink name=sink

When I read the output buffer size from GStreamer, it is 1008 bytes. What data structure does the nvvidconv output map to?

For example, when I use nvvideoconvert on my x86_64 machine, the GStreamer output is 64 bytes, which maps to the NvBufSurface struct.

For your pipeline, the input format of the last videoconvert is ‘video/x-raw(memory:NVMM),format=NV12’. This is an NVIDIA-defined video format, specific to NVIDIA multimedia and GPU hardware. If you want to output frames in a normal video format, please use nvvidconv, which supports many video formats. https://docs.nvidia.com/jetson/archives/l4t-archived/l4t-3231/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide%2Faccelerated_gstreamer.html%23wwpID0E0UM0HA


Hey @Fiona.Chen,

I understand what you are saying. What I’m trying to understand is which data structure output by nvvidconv corresponds to those 1008 bytes. I’m building a GStreamer pipeline manually and am trying to get the pointer to the output data directly from the appsink.

How did you get the 1008 bytes in your appsink?

My code goes as this:

The incoming frame arrives as a (void*) called indata with size insize.

GstBuffer *buffer = gst_buffer_new_and_alloc(insize);
gst_buffer_fill(buffer, 0, indata, insize);
gst_app_src_push_buffer(GST_APP_SRC(_gStreamerData._appSrc), buffer);

// Wait up to 250 ms (250000000 ns) for a sample from the appsink
GstSample *gstSinkSample = gst_app_sink_try_pull_sample(
    GST_APP_SINK(_gStreamerData._appSink), 250000000);
GstBuffer *gstSinkBuffer = gst_sample_get_buffer(gstSinkSample);

GstMapInfo outBufferMap;
gst_buffer_map(gstSinkBuffer, &outBufferMap, GST_MAP_READ);
auto outsize = outBufferMap.size;
auto outdata = outBufferMap.data;

// ... use outdata/outsize, then release the mapping and the sample
gst_buffer_unmap(gstSinkBuffer, &outBufferMap);
gst_sample_unref(gstSinkSample);

outsize ends up being 1008 bytes, so it must map to some struct. If I change nvvidconv to nvvideoconvert in the pipeline string I posted above and run on x86, this same code outputs 64 bytes, which maps to NvBufSurface.

The NVIDIA private frame data needs the NVIDIA-provided interface to be mapped from a GstBuffer. Please refer to the deepstream-image-meta-test sample code for how to get an NvBufSurface from a GstBuffer.
NvBufSurface is defined in /opt/nvidia/deepstream/deepstream-5.0/sources/includes/nvbufsurface.h

@Fiona.Chen, my problem isn’t mapping the data; I have no issue doing that. What I was trying to say is that this same code works on x86_64 with the nvvideoconvert plugin on DeepStream 5.0. But on the Jetson with the nvvidconv plugin, I get an outsize of 1008 bytes, which does not map to NvBufSurface. I was wondering what struct the output maps to, or whether this is some sort of error.

  1. Does the nvvidconv output a NvBufSurface or does it output another struct?
  2. If it does map to NvBufSurface, why does the nvvidconv plugin output 1008 bytes?
  3. If it does NOT map to that struct, what does it map to?

Questions 1 and 3 are asking the same thing, but I’m trying to get my problem across.

Please update to DeepStream 5.0. The nvvideoconvert plugin works on both Jetson and T4.
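Following that suggestion, a Jetson variant of the original pipeline using nvvideoconvert in place of nvvidconv might look like the line below (an untested sketch; whether the caps negotiate depends on your DeepStream/JetPack install):

appsrc name=src is-live=true block=true caps=video/x-raw,width=640,height=480,format=RGB ! videoconvert ! nvvideoconvert ! video/x-raw(memory:NVMM),format=NV12 ! appsink name=sink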