Using both deepstream plugins and hardware overlays

Hi,

I would like to acquire video, run a neural network on it, and display the video with a custom OSD on top. Unfortunately, it does not work as intended.

So far I have managed to:

  • acquire video and display it: nvarguscamerasrc & nvdrmvideosink;
  • display a video through GStreamer with a custom OSD over it as a hardware overlay: nvdrmvideosink, Qt EGLFS KMS, win_mask;
  • acquire video, run the neural network and display the video: nvarguscamerasrc, nvinfer, nvoverlaysink.

The problems I ran into:

  • nvinfer is not compatible with nvdrmvideosink.
  • nvoverlaysink always displays on the topmost overlay, i.e. over my custom OSD :’( (even when setting display-id).
  • under X11, hardware overlays are unusable: win_mask is ignored, nvoverlaysink sits on top of the X11 app, and the X11 app sits over everything else.
  • nvdsosd is not compatible with nvdrmvideosink.
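For context, here is roughly how I drive the hardware-window stacking from EGLFS. Treat it as a sketch: the sysfs path and the mask value are assumptions from my board (tegra framebuffer driver), not a documented interface.

```shell
#!/bin/sh
# Hypothetical sketch: on my board the tegra framebuffer exposes a win_mask
# attribute in sysfs; restricting the fb (the Qt EGLFS UI) to hardware
# window 0 leaves the other windows free for a video sink underneath.
# The path and the 0x1 value are assumptions from my setup.
WIN_MASK=/sys/class/graphics/fb0/device/win_mask
if [ -w "$WIN_MASK" ]; then
  echo 0x1 > "$WIN_MASK"   # keep only window 0 for the framebuffer
  cat "$WIN_MASK"          # read back the active mask
else
  echo "win_mask not available (not a Tegra framebuffer?)"
fi
```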

I’m confused in many ways. For example:

DeepStream SDK FAQ - #15 by Fiona.Chen

It says this pipeline works, but it does not on my TX2:
gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder bufapi-version=1 ! nvvideoconvert ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e

Pipeline is PREROLLING …
nvbuf_utils: nvbuffer Payload Type not supported
NvBufferGetParamsEx Failed
Failed to get buffer parameters from fd

ERROR: from element /GstPipeline:pipeline0/GstNvInfer:nvinfer0: Internal data stream error.

Without nvdrmvideosink, it works:
gst-launch-1.0 videotestsrc ! nvvideoconvert ! 'video/x-raw(memory:NVMM), format=RGBA, width=1920, height=1080' ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! fakesink -e  # OK

It works with nvoverlaysink too, but again, nvoverlaysink always displays over my OSD…

Is it possible to run both nvinfer AND a hardware overlay (using win_mask) on the TX2?

Setup: L4T 32.4.3, TX2 8GB, EGLFS.

I just tested on my TX2 board with JetPack 4.5 and DeepStream 5.0.1. The following pipeline works:

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e

Hi,

Thank you for the reply.

For testing purposes, I am continuing with the TX2 devkit under Ubuntu (L4T 32.4.3).
On Ubuntu/X11 (again, not my target, but interesting):

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e  # (Fiona Chen's pipeline)
...
Pipeline is PREROLLING ...
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
nvbuf_utils: nvbuffer Payload Type not supported
NvBufferGetParamsEx Failed
Failed to get buffer parameters from fd 
0:01:13.884879872  9604   0x55b9cecd40 WARN                 nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop:<nvinfer0> error: Internal data stream error.
...

For testing purposes, I turned Ubuntu into EGLFS, following the commands from: Which videosink for Jetson TX2 in EGLFS?.
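In essence, the steps boil down to stopping the desktop so the display is free for EGLFS. A sketch (assumption: Ubuntu 18.04 on L4T 32.x ships gdm3 as the display manager; the commands are printed rather than executed so nothing changes by accident — on the Jetson, run each line with sudo):

```shell
# Sketch of the desktop-to-console steps (assumption: gdm3 is the display
# manager on Ubuntu 18.04 / L4T 32.x). Printed, not executed, so the sketch
# is safe to run anywhere; on the Jetson, run each line with sudo.
cat <<'EOF'
systemctl stop gdm3
systemctl isolate multi-user.target
EOF
```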

Fiona Chen’s pipeline gives me the same error as above. I then tried some other pipelines:

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e  # OK

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvideoconvert ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e  # NOK (I added nvvideoconvert to the working pipeline)
gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e  # NOK (I added nvstreammux/nvinfer to the working pipeline)

I upgraded this devkit to Ubuntu / L4T 32.4.4 (the latest version available through apt). The following command tells me I’m now on JetPack 4.4:

sudo apt-cache show nvidia-jetpack
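To pull just the version line out of that output, a small sketch (with a fallback message when the nvidia-jetpack package is not present, e.g. on a non-L4T machine):

```shell
# Sketch: extract only the Version line from the nvidia-jetpack package
# metadata; fall back to a message when the package is unknown.
apt-cache show nvidia-jetpack 2>/dev/null | grep -m1 '^Version:' \
  || echo "nvidia-jetpack package not found (not an L4T system?)"
```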

Fiona Chen’s command works now! I also tried some of the other pipelines above, and they work too.

My target OS/board runs L4T r32.4.3. I guess this was fixed in 32.4.4; I need to upgrade.

This pipeline checked only one part of the problem.
I still need to verify that plane_id works and stacks the UI as intended.
That’s in my TODO after upgrading…

If you work on L4T r32.4.3, the pipeline should be:
gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder bufapi-version=1 ! nvvideoconvert ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! nvdrmvideosink conn_id=0 plane_id=1 set_mode=0 -e
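To double-check that the decoder shipped with your L4T version actually exposes this property before relying on it, a quick gst-inspect sketch (only meaningful on a Jetson where nvv4l2decoder is installed; elsewhere it prints a fallback):

```shell
# Sketch: confirm the installed nvv4l2decoder exposes the bufapi-version
# property; print a fallback message when the element is not available.
if gst-inspect-1.0 nvv4l2decoder >/dev/null 2>&1; then
  gst-inspect-1.0 nvv4l2decoder | grep -i 'bufapi'
else
  echo "nvv4l2decoder not available (gst-inspect-1.0 missing or not a Jetson)"
fi
```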