Nvv4l2decoder drops all frames in an RTSP + nvstreammux pipeline

Hi,

I’m using a Jetson Nano to (as a first phase) read video from multiple cameras and publish it on a web site (I will probably scale down the resolution and frame rate).

The thing is, I’m trying to use nvstreammux because in the next phase I’ll add some inference and tracking steps.

Reading each camera and saving to an mp4 in separate pipelines works well:

gst-launch-1.0 -e nvarguscamerasrc ! nvvidconv ! x264enc ! mp4mux ! filesink location=csi_cam.mp4
gst-launch-1.0 -e rtspsrc location="rtsp://user:password@192.168.200.12:554" drop-on-latency=1 ! rtph264depay ! decodebin ! nvvidconv ! x264enc ! mp4mux ! filesink location=rtsp_cam.mp4

But when I add an nvstreammux element (followed by an nvmultistreamtiler to mix them), the RTSP input doesn’t produce anything and the result is just the video from the CSI camera. I also tested each input separately. For instance:

gst-launch-1.0 -e nvarguscamerasrc bufapi-version=1 maxperf=1 ! m.sink_0 nvstreammux name=m live-source=1 width=1920 height=1080 batch_size=1 ! nvmultistreamtiler width=2560 height=1920 ! nvvideoconvert ! x264enc ! mp4mux ! filesink location=csi_cam.mp4

does generate some video (the frame rate seems to drop, though), whereas:

gst-launch-1.0 -e rtspsrc location="rtsp://user:password@192.168.200.12:554" ! rtph264depay ! decodebin ! m.sink_0 nvstreammux name=m live-source=1 width=1920 height=1080 batch_size=1 batched-push-timeout=400000 ! nvmultistreamtiler width=2560 height=1920 ! nvvideoconvert ! x264enc ! mp4mux ! filesink location=rtsp_cam.mp4

does not.

When I add some debugging with GST_DEBUG=decodebin:7, it reports that frames are being dropped by nvv4l2decoder:

0:00:00.536314530 22108   0x7f50004370 DEBUG              decodebin gstdecodebin2.c:5588:gst_decode_bin_handle_message: Forwarding msg qos message: 0x7f40392910, time 99:99:99.999999999, seq-num 264, element 'nvv4l2decoder0', GstMessageQOS, live=(boolean)false, running-time=(guint64)317513471, stream-time=(guint64)233237794, timestamp=(guint64)317513471, duration=(guint64)18446744073709551615, jitter=(gint64)-317513472, proportion=(double)0.5, quality=(int)1000000, format=(GstFormat)GST_FORMAT_BUFFERS, processed=(guint64)1, dropped=(guint64)1;
0:00:00.556726784 22108   0x7f50004370 DEBUG              decodebin gstdecodebin2.c:5588:gst_decode_bin_handle_message: Forwarding msg qos message: 0x7f40392990, time 99:99:99.999999999, seq-num 265, element 'nvv4l2decoder0', GstMessageQOS, live=(boolean)false, running-time=(guint64)350792162, stream-time=(guint64)266516485, timestamp=(guint64)350792162, duration=(guint64)18446744073709551615, jitter=(gint64)-350792163, proportion=(double)0.5, quality=(int)1000000, format=(GstFormat)GST_FORMAT_BUFFERS, processed=(guint64)1, dropped=(guint64)2;
0:00:00.596653045 22108   0x7f50004370 DEBUG              decodebin gstdecodebin2.c:5588:gst_decode_bin_handle_message: Forwarding msg qos message: 0x7f40392a10, time 99:99:99.999999999, seq-num 268, element 'nvv4l2decoder0', GstMessageQOS, live=(boolean)false, running-time=(guint64)384053976, stream-time=(guint64)299778299, timestamp=(guint64)384053976, duration=(guint64)18446744073709551615, jitter=(gint64)-384053977, proportion=(double)0.5, quality=(int)1000000, format=(GstFormat)GST_FORMAT_BUFFERS, processed=(guint64)1, dropped=(guint64)3;
0:00:00.641135901 22108   0x7f50004370 DEBUG              decodebin gstdecodebin2.c:5588:gst_decode_bin_handle_message: Forwarding msg qos message: 0x7f40392a90, time 99:99:99.999999999, seq-num 271, element 'nvv4l2decoder0', GstMessageQOS, live=(boolean)false, running-time=(guint64)417307768, stream-time=(guint64)333032091, timestamp=(guint64)417307768, duration=(guint64)18446744073709551615, jitter=(gint64)-417307769, proportion=(double)0.5, quality=(int)1000000, format=(GstFormat)GST_FORMAT_BUFFERS, processed=(guint64)1, dropped=(guint64)4;

I also implemented a very similar pipeline in C, and capturing QoS messages on the bus shows very similar output:

Received message on bus: source nvv4l2decoder0, msg_type qos
live: 0, running time: 2360412960, stream time: 1994349901, timestamp: 2360412960, duration: -1
format: 4, processed: 1, dropped: 53
Received message on bus: source nvv4l2decoder0, msg_type qos
live: 0, running time: 2393746293, stream time: 2027683234, timestamp: 2393746293, duration: -1
format: 4, processed: 1, dropped: 54
Received message on bus: source nvv4l2decoder0, msg_type qos
live: 0, running time: 2427079626, stream time: 2061016567, timestamp: 2427079626, duration: -1
format: 4, processed: 1, dropped: 55
Received message on bus: source nvv4l2decoder0, msg_type qos
live: 0, running time: 2460412959, stream time: 2094349900, timestamp: 2460412959, duration: -1
format: 4, processed: 1, dropped: 56
Received message on bus: source nvv4l2decoder0, msg_type qos
live: 0, running time: 2493746293, stream time: 2127683234, timestamp: 2493746293, duration: -1
format: 4, processed: 1, dropped: 57
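
In case it helps, this is roughly how I build the C pipeline and read those QoS messages from the bus. It is a simplified sketch rather than my exact code; the pipeline string mirrors the failing gst-launch command above, and the RTSP credentials are the same placeholders:

#include <gst/gst.h>

/* Bus callback: print the fields of any QoS message
 * (roughly matches the output format above). */
static gboolean
bus_cb (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_QOS) {
    gboolean live;
    guint64 running_time, stream_time, timestamp, duration, processed, dropped;
    GstFormat format;

    gst_message_parse_qos (msg, &live, &running_time, &stream_time,
                           &timestamp, &duration);
    gst_message_parse_qos_stats (msg, &format, &processed, &dropped);

    g_print ("Received message on bus: source %s, msg_type qos\n",
             GST_OBJECT_NAME (GST_MESSAGE_SRC (msg)));
    g_print ("live: %d, running time: %" G_GUINT64_FORMAT
             ", stream time: %" G_GUINT64_FORMAT
             ", timestamp: %" G_GUINT64_FORMAT
             ", duration: %" G_GINT64_FORMAT "\n",
             live, running_time, stream_time, timestamp, (gint64) duration);
    g_print ("format: %d, processed: %" G_GUINT64_FORMAT
             ", dropped: %" G_GUINT64_FORMAT "\n",
             format, processed, dropped);
  }
  return TRUE; /* keep the watch installed */
}

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GstBus *bus;
  GMainLoop *loop;

  gst_init (&argc, &argv);

  /* Same elements as the failing gst-launch pipeline above. */
  pipeline = gst_parse_launch (
      "rtspsrc location=rtsp://user:password@192.168.200.12:554 ! "
      "rtph264depay ! decodebin ! m.sink_0 "
      "nvstreammux name=m live-source=1 width=1920 height=1080 "
      "batch-size=1 batched-push-timeout=400000 ! "
      "nvmultistreamtiler width=2560 height=1920 ! nvvideoconvert ! "
      "x264enc ! mp4mux ! filesink location=rtsp_cam.mp4", NULL);

  bus = gst_element_get_bus (pipeline);
  gst_bus_add_watch (bus, bus_cb, NULL);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  return 0;
}

Everything other than QoS messages is ignored in this sketch; in practice you would also handle ERROR and EOS on the same bus watch.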

I’m fairly new to GStreamer and GPUs and have not been able to find more information on this problem. Any advice is greatly appreciated.

It seems to be caused by some configuration on my Hikvision camera. I tested my C pipeline with an external (internet) RTSP stream, and it worked.

So far I’ve been able to trace the problem to something that happens inside nvstreammux when data is passed to its chain function:

0:00:17.480377332 30875   0x55ba96a000 DEBUG         GST_SCHEDULING gstpad.c:4320:gst_pad_chain_data_unchecked: calling chainfunction &gst_proxy_pad_chain_default with buffer buffer: 0x7f380a0500, pts 0:00:00.240194347, dts 99:99:99.999999999, dur 0:00:00.033333333, size 64, offset none, offset_end none, flags 0x40
0:00:17.480407801 30875   0x55ba96a000 DEBUG         GST_SCHEDULING gstpad.c:4320:gst_pad_chain_data_unchecked: calling chainfunction &gst_nvstreammux_chain with buffer buffer: 0x7f380a0500, pts 0:00:00.240194347, dts 99:99:99.999999999, dur 0:00:00.033333333, size 64, offset none, offset_end none, flags 0x40
0:00:17.480426760 30875   0x55ba96a000 LOG               GST_BUFFER gstbuffer.c:1721:gst_buffer_map_range: buffer 0x7f380a0500, idx 0, length -1, flags 0001
0:00:17.480441969 30875   0x55ba96a000 LOG               GST_BUFFER gstbuffer.c:213:_get_merged_memory: buffer 0x7f380a0500, idx 0, length 1
0:00:17.480473063 30875   0x55ba96a000 LOG               GST_BUFFER gstbuffer.c:1721:gst_buffer_map_range: buffer 0x7f380a0500, idx 0, length -1, flags 0001
0:00:17.480494730 30875   0x55ba96a000 LOG               GST_BUFFER gstbuffer.c:213:_get_merged_memory: buffer 0x7f380a0500, idx 0, length 1
0:00:17.480540408 30875   0x55ba96a000 LOG               GST_BUFFER gstbuffer.c:1721:gst_buffer_map_range: buffer 0x7f3816a9c0, idx 0, length -1, flags 0001
0:00:17.480580513 30875   0x55ba96a000 LOG               GST_BUFFER gstbuffer.c:213:_get_merged_memory: buffer 0x7f3816a9c0, idx 0, length 1
0:00:17.481582929 30875   0x55ba96a000 DEBUG         GST_SCHEDULING gstpad.c:4326:gst_pad_chain_data_unchecked: called chainfunction &gst_nvstreammux_chain with buffer 0x7f380a0500, returned error
0:00:17.481607461 30875   0x55ba96a000 DEBUG         GST_SCHEDULING gstpad.c:4326:gst_pad_chain_data_unchecked: called chainfunction &gst_proxy_pad_chain_default with buffer 0x7f380a0500, returned error
0:00:17.481636211 30875   0x55ba96a000 DEBUG               GST_PADS gstpad.c:6201:gst_pad_pause_task: pause task
0:00:17.481676420 30875   0x55ba96a000 DEBUG                   task gsttask.c:688:gst_task_set_state: Changing task 0x55ba82a710 to state 2
0:00:17.481699129 30875   0x55ba96a000 INFO                    task gsttask.c:316:gst_task_func: Task going to paused

I don’t know how to debug inside nvstreammux, so I’m not sure how to continue.
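
If anyone wants to dig deeper, the only other idea I had was to attach a buffer probe on the muxer sink pad, to confirm what is actually reaching it before the chain function runs. A sketch only, assuming the sink_0 pad has already been requested and linked to decodebin:

#include <gst/gst.h>

/* Probe callback: log every buffer arriving at the nvstreammux sink pad. */
static GstPadProbeReturn
mux_sink_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  g_print ("%s: pts %" GST_TIME_FORMAT ", size %" G_GSIZE_FORMAT "\n",
           GST_PAD_NAME (pad),
           GST_TIME_ARGS (GST_BUFFER_PTS (buf)),
           gst_buffer_get_size (buf));
  return GST_PAD_PROBE_OK;
}

/* Call after the pipeline is built; "mux" is the nvstreammux element. */
static void
add_mux_probe (GstElement *mux)
{
  GstPad *sinkpad = gst_element_get_static_pad (mux, "sink_0");
  gst_pad_add_probe (sinkpad, GST_PAD_PROBE_TYPE_BUFFER,
                     mux_sink_probe, NULL, NULL);
  gst_object_unref (sinkpad);
}

If buffers show up in the probe but the chain function still returns error, the problem is inside nvstreammux itself rather than upstream.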

Never mind. I found the solution in the post How to run RTP Camera in deepstream on Nano - #34 by neophyte1. I replaced the libraries with the prebuilt ones from that post and it worked.


Hi,
For more information: the patch is for DeepStream SDK 4.0. If you are on r32.2, you may also consider using DS4.0.1.

Or upgrade to r32.3.1 + DS4.0.2.

Hi DaneLLL. I’ll try to test it on the latest DS release.

Thanks.

Tested it on r32.3.1 + DS4.0.2 and it worked!