RTSP output working, but a lot of frame errors?

I use DeepStream to process 4 IP cameras. Everything is fine: I can display or record the live 2-row/2-column tiled video. But when I set up the RTSP output sink, VLC and my home automation system cannot decode the stream properly. The live RTSP video is incomplete, and I only get a full frame from time to time.

It is not a network problem, as the live video from all my IP cameras is clear.

I tried several bit rates.

[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=2000000

# set below properties in case of RTSPStreaming

rtsp-port=8554
#5400
udp-port=5400

Error example:

2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] decode_slice_header error
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] no frame!
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] non-existing PPS 0 referenced
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] non-existing PPS 0 referenced
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] decode_slice_header error
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] no frame!
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] non-existing PPS 0 referenced
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] non-existing PPS 0 referenced
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] decode_slice_header error
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] no frame!
2019-09-25 21:40:10 ERROR (stream_worker) [libav.h264] non-existing PPS 0 refere

Thank you for your help.

Hi,
Do you run a Jetson platform or a PC with an NVIDIA GPU card? We have verified it through VLC and don't see any issue.
A user also shared his result:
https://devtalk.nvidia.com/default/topic/1063634/

Hello,

Thank you for your fast reply.

I am using a Nano. I also have a TX2 and a Xavier, but I have not tested them yet.

I see he was able to stream without any problem; I will check with him.

I got this kind of stream:

http://wifibot.com/download/nvidiadp.png

Thank you,

Laurent

In fact, the RTSP output is fine when I use a file as multiple sources, or only one RTSP camera, but when I have more than one IP camera, the output stream contains a lot of frame errors.

It seems there is some timing problem when muxing 2 or more RTSP cameras in my case.

I saw the user's video, and it is working for him.

The cameras' resolution is 1280x720 at 10 fps.

My config file:

# Copyright (c) 2019 NVIDIA Corporation.  All rights reserved.
#
# NVIDIA Corporation and its licensors retain all intellectual property
# and proprietary rights in and to this software, related documentation
# and any modifications thereto.  Any use, reproduction, disclosure or
# distribution of this software and related documentation without an express
# license agreement from NVIDIA Corporation is strictly prohibited.

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=2
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
#uri=rtsp://admin:Gimxxxx@192.168.1.247/Streaming/Channels/1
uri=rtsp://admin:Gimxxxx@192.168.1.247/Streaming/Channels/3
#uri=file://../../streams/sample_1080p_h264.mp4
#num-sources=2
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[source1]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
uri=rtsp://admin:wifixxxx@192.168.1.248:554/1/h264major
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[sink0]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File
type=2
#5
sync=0
#1
source-id=0
gpu-id=0
qos=0
nvbuf-memory-type=0
overlay-id=1

[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
sync=0
#iframeinterval=10
bitrate=2000000
output-file=out.mp4
source-id=0

[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=2
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080

##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=../../models/Primary_Detector_Nano/resnet10.caffemodel_b8_fp16.engine
batch-size=2
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=4
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_nano.txt

[tracker]
enable=1
tracker-width=480
tracker-height=272
#ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_iou.so
ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_klt.so
#ll-config-file required for IOU only
#ll-config-file=iou_config.txt
gpu-id=0

[tests]
file-loop=0

Hi,
Do you use the same IP cameras running in the same mode (720p, 10 fps)? It looks like the framerates of the sources do not match. Also, if you have not upgraded to DS 4.0.1, please do so.
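If the source framerates really do differ, one mitigation suggested by the comments in the [streammux] group of the config above is to treat the sources as live and allow roughly one frame interval of the slowest source before pushing an incomplete batch. For 10 fps cameras that would be about 100000 µs instead of the 40000 µs currently set; the exact value here is an assumption to be tuned:

```
[streammux]
live-source=1
batch-size=2
# ~one frame interval at 10 fps, in microseconds
batched-push-timeout=100000
```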

Hello,

For my tests I only use different IP cameras from several brands (2 different TRENDnet models, 1 Wansview, 1 Reolink, and 1 no-brand PTZ camera); all of them are handled well by DS 4.0.1 for inference.

Yes, I feel it is because the streams are not exactly the same. Can this affect the RTSP sink?

Saving the output mosaic to a file works fine.

What I can test is setting up the same camera as 2 separate RTSP inputs.

Yes I use DS4.0.1.

Best Regards,

Laurent

I finally tested with the same camera providing the stream for both sources, and I have the same problem.

The RTSP output is still scrambled, but only in the bottom half of the image.

http://wifibot.com/download/2sources.png

Deepstream-app version 4.0.1
DeepStreamSDK 4.0.1

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
uri=rtsp://admin:Gimtf7am94@192.168.1.247/Streaming/Channels/3
#num-sources=2
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[source1]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
uri=rtsp://admin:Gimtf7am94@192.168.1.247/Streaming/Channels/3
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

**PERF: 10.03 (9.93) 10.03 (9.93)
**PERF: 9.97 (9.94) 9.97 (9.94)
**PERF: 10.01 (9.95) 10.01 (9.95)
**PERF: 10.13 (9.97) 10.13 (9.97)
**PERF: 10.01 (9.97) 10.01 (9.97)
**PERF: 9.87 (9.96) 9.87 (9.96)
**PERF: 10.13 (9.98) 10.13 (9.98)
**PERF: 10.01 (9.98) 10.01 (9.98)
**PERF: 9.88 (9.97) 9.88 (9.97)
**PERF: 10.10 (9.98) 10.10 (9.98)
**PERF: 10.00 (9.98) 10.00 (9.98)
**PERF: 10.03 (9.99) 10.03 (9.99)
**PERF: 10.00 (9.99) 10.00 (9.99)
**PERF: 9.97 (9.99) 9.97 (9.99)

Hi,
It might be an issue with network bandwidth. It looks like the h264 stream is not complete (maybe the bottom half is lost). The bandwidth is good for a single source, but insufficient for multiple sources.

Or do you run ‘sudo jetson_clocks’? Maybe the load is high and performance is insufficient while the CPU is doing DFS (dynamic frequency scaling).

Hello,

Yes, I run ‘sudo jetson_clocks’ beforehand.

But the strange thing is that the RTSP output is perfect when I use 8 sources from a file.

When we stream more than 1 RTSP source, could it be a problem with I-frames?

Laurent

hello,

It is working now; I had to decrease the I-frame rate on some cameras.

Laurent

Hi,
I-frames have a larger size. It is more likely an issue with bandwidth or buffering. You may try to modify deepstream-app to configure some properties of the rtspsrc or queue plugins.
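A rough back-of-the-envelope sketch of why I-frames matter here (all numbers are illustrative assumptions: the 4 Mbps bitrate comes from the sink2 config and the 10 fps from the cameras, but the GOP length and the I-frame's share of the bits are guesses, not measurements):

```python
# Rough estimate of the instantaneous burst a single I-frame can cause.
# Illustrative assumptions only; nothing here is measured.

bitrate_bps = 4_000_000   # sink2 encoder bitrate from the config
fps = 10                  # camera frame rate
gop_frames = 10           # assumed I-frame interval (one I-frame per second)
iframe_share = 0.5        # assumed fraction of the GOP's bits in the I-frame

gop_bits = bitrate_bps * gop_frames / fps   # bits per 1-second GOP
iframe_bits = iframe_share * gop_bits       # bits in a single I-frame
burst_bps = iframe_bits * fps               # instantaneous rate if that frame
                                            # must go out in one frame interval

print(int(iframe_bits))   # bits in one I-frame, sent in a single 100 ms slot
print(int(burst_bps))     # several times the average bitrate
```

Under these assumptions, a single I-frame carries about 2 Mbit (~250 KB) that must traverse the UDP send path in one 100 ms frame slot, an instantaneous rate of ~20 Mbps, i.e. 5x the average bitrate. Fewer I-frames per second means fewer of these bursts, which is consistent with the fix reported in the thread.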

Yes, you are right: fewer I-frames means less data, so I will try to modify deepstream-app.

Thank you for your help.

Laurent