Multichannel RTSP streams in DeepStream 2.0

1. The deepstream-test1 sample in DeepStream_Release/sources/apps only supports one RTSP stream. If I want to support multichannel RTSP streams using deepstream-test1, how do I solve it? Can you give me source code for multichannel RTSP streams?
I can’t use the multichannel config file in /DeepStream_Release/samples/configs/deepstream-app directly, because my model is a TensorFlow model.

Thanks

Hi wwlaoxi,
For now we only support a single RTSP source. What is your use case that requires multiple RTSP sources?

1. I must support multichannel RTSP in the field of security monitoring.
2. My model is a TensorFlow model.
3. The GPU is a Tesla P4.
So I want to support multichannel RTSP sources.
Can you tell me the method or source code for a multichannel RTSP interface?

Hi wwlaoxi,
deepstream-test1 launches one GStreamer pipeline. Have you tried modifying the code to launch multiple GStreamer pipelines? It should not be a problem to launch multiple pipelines in a single process.
This is wrong. We don’t have support for this on DeepStream 2.0. Please run the prebuilt deepstream-app instead.

Can you provide reference source code for supporting multichannel RTSP using multiple pipelines in a single process? Because I am a rookie, I don’t know how to launch multiple pipelines in a single process for deepstream-test1.

Thanks

Good morning everyone,

I’m trying to do a similar project so I’m very interested.

Actually my idea was to launch 4 different RTSP streams in 4 different windows using DeepStream (the latest version), but I do not think this proposed solution is correct. I’m using a custom config file for the provided nvgstiva app.

For reference to the topic I am attaching the link of my discussion on the forum if you are interested:

https://devtalk.nvidia.com/default/topic/1036526/deepstream-sdk-on-jetson/viewing-4-rtsp-videostream-using-the-config-file-in-deepstream/post/5269435/#5269435

Thanks for your attention, and I’ll wait for new developments.

Hi, I am able to create multiple GStreamer pipelines; however, the display has an issue. An example of how to use the tiler for multiple RTSP streams would be great. Thanks.

Yes, I am hitting this problem now.
One rtspsrc (IP camera) works well, but two exit with an error.
Can anyone give any advice or code?

Thank you.

1. You can use this code for a single RTSP stream:
https://blog.csdn.net/quantum7/article/details/82151637

2. You can configure the txt file to use deepstream-app with multiple cameras:
https://devtalk.nvidia.com/default/topic/1038991/deepstream-for-tesla/how-can-use-multi-cameras-with-deepstream2-0-/
But there is no source code. Why doesn’t NVIDIA supply it?
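For the config-file route in point 2, the multi-camera setup boils down to defining one [sourceN] group per stream in the deepstream-app config. A rough sketch is below; the RTSP URIs are placeholders, and the exact keys (in particular whether type=4 means RTSP in your DeepStream 2.0 build) should be checked against the sample configs shipped under samples/configs/deepstream-app:

```
[source0]
enable=1
# type=4 selects an RTSP source in the deepstream-app reference configs
# (verify this value against your installed sample configs)
type=4
uri=rtsp://192.0.2.10/media
num-sources=1

[source1]
enable=1
type=4
uri=rtsp://192.0.2.11/media
num-sources=1
```

Each enabled [sourceN] group becomes one input to the app’s internal stream muxer, so adding a camera is just adding another group.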

Thanks @quantum6 for sharing the information.

On DeepStream 2.0, please run deepstream-app for the multi-input case.

I want to confirm what I’m reading here: multi-RTSP sources are only supported in deepstream-app (for which there is no source code). We cannot currently set up a GStreamer pipeline that opens multiple RTSP sources and processes them with a custom plugin (e.g. dsexample). I ask because I have been able to get multiple file sources to work, just not multiple rtspsrc elements. That is, the following works:
gst-launch-1.0 \
  filesrc location=/home/Videos/temp1.MP4 ! qtdemux ! h264parse ! nvdec_h264 ! m.sink_0 \
  nvstreammux name=m batch-size=1 ! nvvidconv ! dsexample full-frame=1 ! nvosd ! \
  nvmultistreamtiler rows=2 columns=1 width=1920 height=1080 ! nveglglessink \
  filesrc location=/home/Videos/temp2.MP4 ! qtdemux ! h264parse ! nvdec_h264 ! m.sink_1

however I get an error with the following:
gst-launch-1.0 \
  rtspsrc location=rtsp://1.1.0.0/media ! qtdemux ! h264parse ! nvdec_h264 ! m.sink_0 \
  nvstreammux name=m batch-size=1 ! nvvidconv ! dsexample full-frame=1 ! nvosd ! \
  nvmultistreamtiler rows=2 columns=1 width=1920 height=1080 ! nveglglessink \
  rtspsrc location=rtsp://1.1.0.0/media ! qtdemux ! h264parse ! nvdec_h264 ! m.sink_1

Hi virgiliovill,
rtspsrc should go with rtph264depay. Please replace qtdemux with rtph264depay and try again.
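Applying that suggestion to the failing command above gives something like the following untested sketch. The qtdemux elements (which expect an MP4 container, not RTP packets) are swapped for rtph264depay; the RTSP URLs are the placeholder ones from the earlier post, and raising batch-size to 2 for the two muxer inputs is my assumption, not something stated in the thread:

```shell
gst-launch-1.0 \
  rtspsrc location=rtsp://1.1.0.0/media ! rtph264depay ! h264parse ! nvdec_h264 ! m.sink_0 \
  nvstreammux name=m batch-size=2 ! nvvidconv ! dsexample full-frame=1 ! nvosd ! \
  nvmultistreamtiler rows=2 columns=1 width=1920 height=1080 ! nveglglessink \
  rtspsrc location=rtsp://1.1.0.0/media ! rtph264depay ! h264parse ! nvdec_h264 ! m.sink_1
```

The reason for the substitution: an RTSP session delivers H.264 as RTP payloads, so the depayloader (rtph264depay) must extract the elementary stream before h264parse, whereas a local MP4 file needs the container demuxer (qtdemux) instead.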

A reference post (Tegra platforms):
https://devtalk.nvidia.com/default/topic/1014789/jetson-tx1/-the-cpu-usage-cannot-down-use-cuda-decode-/post/5188538/#5188538