We are using CSI cameras that natively output images in the UYVY color format. The ‘nvvideoconvert’ plugin does not support this color format. The DeepStream sample applications use the standard GStreamer ‘videoconvert’ plugin to perform conversions from UYVY, but this results in high CPU utilization since ‘videoconvert’ runs on the CPU. The ‘nvvidconv’ plugin does support hardware-accelerated colorspace conversion, but it does not appear to work with nvstreammux. When we attempt to use it, we encounter the following error:
ERROR: from element /GstPipeline:pipeline0/GstNvStreamMux:m: Input buffer number of surfaces (0) must be equal to mux->num_surfaces_per_frame (1)
Set nvstreammux property num-surfaces-per-frame appropriately
Is there a way to get hardware accelerated UYVY colorspace conversion to work on the Xavier when the data is being routed through ‘nvstreammux’? Will ‘nvvideoconvert’ be enhanced to support this format?
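For reference, a minimal pipeline of the kind that triggers this error might look like the following sketch. The device path, resolution, framerate, and inference config are assumptions for illustration, not details from the post above:

```shell
# Hypothetical sketch: UYVY capture, hardware conversion via nvvidconv,
# then feeding nvstreammux directly (the step that appears to fail).
# Device path, caps, and config file are illustrative assumptions.
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  'video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! \
  m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! \
  nvinfer config-file-path=config_infer_primary.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```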
Please let me know if you are still facing any problems using DeepStream.
You may be getting a lower framerate than the camera supports because of the CPU-based ‘videoconvert’ GStreamer element.
I’ve found the most performant workaround is to use ‘nvvidconv’ to do the conversion in hardware and then copy the result back to non-NVMM (system) memory before sending the data down the rest of the pipeline. A solution such as:
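A sketch of that workaround might look like the following (the device path, caps, and inference config are assumptions): nvvidconv performs the UYVY→NV12 conversion in hardware, a second nvvidconv copies the frames back to system memory, and nvvideoconvert then moves them into NVMM memory in a form nvstreammux accepts.

```shell
# Hypothetical sketch of the workaround described above.
# 1st nvvidconv: hardware UYVY -> NV12 into NVMM memory.
# 2nd nvvidconv: copy back to non-NVMM (system) memory.
# nvvideoconvert: back into NVMM for nvstreammux.
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  'video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! \
  nvvidconv ! 'video/x-raw, format=NV12' ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM), format=NV12' ! \
  m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! \
  nvinfer config-file-path=config_infer_primary.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```

The extra NVMM round trip is not free, but it keeps the colorspace conversion itself off the CPU, which is where ‘videoconvert’ was spending its time.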
Greetings Jon.
I was trying the pipeline since I am also working with an e-con camera that outputs UYVY, on the Nano. I was wondering if you encountered the following error when running the pipeline:
(gst-launch-1.0:8217): GLib-GObject-WARNING **: 14:08:14.118: cannot register existing type ‘GstInterpolationMethod’
I know that it happens because of nvvidconv and nvvideoconvert compatibility issues with DeepStream. Even with the warning, though, the pipeline seems to run just fine, so I was wondering if it is safe to simply ignore it?