RTSP Stream Pulling Problem

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson
• DeepStream Version 6.4

I am using nvurisrcbin to pull 36 RTSP streams (18 Hikvision cameras, each with a main stream and a sub stream). After running for some time, the following errors occurred:

216:20:08.541309950     1 0xffff2c002320 WARN             nvurisrcbin gstdsnvurisrcbin.cpp:1253:watch_source_status:<nv-uri-src-bin> warning: No data from source rtsp://admin:zaq12wsx@192.168.100.220:554/Streaming/Channels/101 since last 45 sec. Trying reconnection
216:20:08.548055045     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:6585:gst_rtspsrc_send:<src> error: Unhandled error
216:20:08.548098087     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:6585:gst_rtspsrc_send:<src> error: Option not supported (551)
216:20:08.548145545     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:8669:gst_rtspsrc_pause:<src> error: Could not send message. (Generic error)
216:21:03.258961600     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:5624:gst_rtspsrc_loop_udp:<src> warning: The server closed the connection.
216:21:30.893560169     1 0xfffed00380c0 WARN                 rtspsrc gstrtspsrc.c:3458:on_timeout_common:<src> source 634e4b3e, stream 634e4b3e in session 0 timed out
216:23:06.611418176     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:5624:gst_rtspsrc_loop_udp:<src> warning: The server closed the connection.
216:25:09.007767611     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:5624:gst_rtspsrc_loop_udp:<src> warning: The server closed the connection.

Then the entire pipeline stopped running, and this log line is output continuously:

216:25:09.007767611     1 0xffff400045e0 WARN                 rtspsrc gstrtspsrc.c:5624:gst_rtspsrc_loop_udp:<src> warning: The server closed the connection.

In this situation I don’t know how to proceed, and I’m not sure whether it’s an issue with the RTSP stream or with my own code. Could you advise?

The RTSP server replied “GST_RTSP_STS_OPTION_NOT_SUPPORTED” to the client’s request, and the session has been disconnected by the server. You may need to remove the corresponding RTSP source and re-add it to the pipeline.

subprojects/gst-plugins-base/gst-libs/gst/rtsp/gstrtspdefs.h · 1.20 · GStreamer / gstreamer · GitLab

@Fiona.Chen

Are there any relevant examples in Deepstream?

There is a source remove/add sample in deepstream_reference_apps/runtime_source_add_delete at DS_6.4 · NVIDIA-AI-IOT/deepstream_reference_apps.

If a bus error is received, you need to handle it in the bus callback. These are standard GStreamer operations; please refer to the GStreamer documentation.
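For reference, the shape of such a bus callback is roughly as below. This is a schematic sketch in plain Python with a stand-in message object (the `BusMessage` class and field names here are hypothetical); in a real pipeline the callback receives a `Gst.Message` from `gi.repository.Gst` and inspects its type and posting element the same way.

```python
from dataclasses import dataclass

# Stand-in for Gst.Message: a real bus callback receives a Gst.Message
# whose type and source element you inspect in the same fashion.
@dataclass
class BusMessage:
    type: str          # e.g. "error", "warning", "eos"
    source_name: str   # name of the element that posted the message
    text: str          # human-readable description

def bus_callback(message, failed_sources):
    """Handle one bus message; return True to keep watching the bus."""
    if message.type == "error":
        # Record which source failed so the main loop can remove and
        # re-add just that source instead of tearing down all 36 streams.
        failed_sources.add(message.source_name)
    elif message.type == "eos":
        return False  # stop the main loop on end-of-stream
    return True

failed = set()
bus_callback(BusMessage("error", "nv-uri-src-bin-7", "server closed"), failed)
print(failed)  # {'nv-uri-src-bin-7'}
```

The point is that the callback should only mark the failing source and return, leaving the actual remove/re-add to the main loop, so one bad camera never blocks the bus.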

@Fiona.Chen

The message bus did not receive any related messages; my pipeline died directly after logging

0:21:42.327485347 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:5819:gst_rtspsrc_loop_udp: warning: The server closed the connection.

so I never got the opportunity to handle it on the message bus.

Are there any other logs?

@Fiona.Chen

0:13:04.234910739 1 0xfffe24018ec0 WARN nvurisrcbin gstdsnvurisrcbin.cpp:1286:watch_source_status: warning: No data from source rtsp://admin:zaq12wsx@192.168.100.224:554/Streaming/Channels/101 since last 45 sec. Trying reconnection
0:13:04.256671295 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:6789:gst_rtspsrc_send: got NOT IMPLEMENTED, disable method PAUSE
0:13:28.578238074 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:5819:gst_rtspsrc_loop_udp: warning: The server closed the connection.
0:15:31.122769941 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:5819:gst_rtspsrc_loop_udp: warning: The server closed the connection.
0:17:34.879871645 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:5819:gst_rtspsrc_loop_udp: warning: The server closed the connection.
0:19:38.635313904 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:5819:gst_rtspsrc_loop_udp: warning: The server closed the connection.
0:21:42.327485347 1 0xfffe48003a40 WARN rtspsrc gstrtspsrc.c:5819:gst_rtspsrc_loop_udp: warning: The server closed the connection.

@Fiona.Chen

I don’t know why the RTSP server disconnected as you said. I am using nvurisrcbin with the reconnection interval set to 45 seconds, so if the RTSP stream drops, it should reconnect. Why did it disconnect outright, and why was the disconnection never reported on the message bus?

The nvurisrcbin reconnection can only recover the stream while the session is alive. If the RTSP server has closed the session, the nvurisrcbin reconnection is of no use; you need to remove the source and re-add it manually.

The current problem is that no error is reported from rtspsrc, so you may need to find another way to detect that the RTSP server has closed the session.
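One common app-side detection approach, since rtspsrc posts no error here, is a watchdog that tracks when each source last produced data. A minimal sketch, assuming you call `tick(name)` from a buffer probe on each nvurisrcbin src pad and poll `stale_sources()` from a periodic GLib timeout (the class and names below are hypothetical, not part of DeepStream):

```python
import time

class SourceWatchdog:
    """Track the last time each RTSP source delivered a buffer.

    tick(name)       -- call from a buffer probe on the source's src pad
    stale_sources()  -- call periodically; returns sources silent for
                        longer than `timeout_sec`, i.e. candidates for a
                        manual remove/re-add of that source bin
    """
    def __init__(self, timeout_sec=45.0, now=time.monotonic):
        self.timeout = timeout_sec
        self.now = now          # injectable clock, eases testing
        self.last_seen = {}

    def tick(self, source_name):
        self.last_seen[source_name] = self.now()

    def stale_sources(self):
        t = self.now()
        return [name for name, seen in self.last_seen.items()
                if t - seen > self.timeout]
```

With this, a session silently closed by the server shows up as a stale source within one timeout period, even though no bus message ever arrives.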

@Fiona.Chen

The key point is that a single broken camera by itself would be acceptable. But I have 36 RTSP streams in total, and if even one of them breaks, the entire application dies, which is very troublesome. Thank you very much for your reply.

The DeepStream app is just an RTSP client. The session was closed by the RTSP server without any meaningful status being sent to the client, so the client does not know why the session was closed either. Please consult the RTSP server vendor for the reason and for how to detect such a case.

@Fiona.Chen

I am using DeepStream 6.2 with nvurisrcbin for streaming. I only see the rtsp-reconnect-interval property on nvurisrcbin; I do not see a property for the maximum number of reconnection attempts. What is the maximum number of reconnection attempts for nvurisrcbin when the RTSP stream drops?

With DeepStream 6.2, there is no limit on the number of reconnection attempts.

@Fiona.Chen

I have another question:
I want to run inference on 36 RTSP streams. My current approach connects all 36 streams to a single streammux. An alternative idea is to split the 36 streams into three groups of 12, each group feeding its own streammux, for three streammux instances in total, each followed by its own nvinfer. Is the second plan better than the first?

It depends on the details and your requirements. You only mentioned the streammux; will you do inference in each pipeline?

@Fiona.Chen

As shown in the figure above, all the nvinfer instances in the figure load the same model.

For the components you listed, it depends on your model. Sometimes one model engine with a bigger batch size is faster than multiple engines with smaller batch sizes, and sometimes not. This is not decided by the DeepStream framework; it is decided by all the components you use in your pipelines.

There has been no update from you for a period, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.