Deepstream-app :: video quality on 2 endpoints is different

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson Xavier NX
• DeepStream Version DeepStream 6.0
• JetPack Version (valid for Jetson only) 4.6
• TensorRT Version 8.0.1-1
• Issue Type( questions, new requirements, bugs) questions

Hi,

I am currently running deepstream-app on a Jetson Xavier NX board.
On the Jetson device, I also run a Node.js application with node-rtsp-stream, which runs ffmpeg with the “rtsp_transport tcp” option. There is also an external webserver for a web dashboard, and this webserver likewise uses ffmpeg with the “rtsp_transport tcp” option to get the RTSP video stream from deepstream-app.
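For reference, the pull side on both endpoints looks roughly like this (a sketch only; the stream URL, port, and mount point below are placeholders, and the exact options each consumer passes may differ):

```shell
# Pull the DeepStream RTSP output over TCP instead of the default UDP.
# rtsp://192.168.17.50:8123/ds-test is a placeholder for the actual stream URL.
ffmpeg -rtsp_transport tcp -i rtsp://192.168.17.50:8123/ds-test \
       -c copy -f mpegts pipe:1
```

node-rtsp-stream spawns a similar ffmpeg process internally and forwards the MPEG-TS output to browser clients over WebSocket.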

When running this setup, we found that the quality of the video generated by the Node.js server on the Jetson device is good enough; however, the quality of the video on the webserver is noticeably worse.

So, the questions are:

  1. Since we use ffmpeg with the “rtsp_transport tcp” option, we are receiving the RTSP stream via TCP. However, it seems that deepstream-app only uses a UDP sink for RTSP streaming. How does this work? Does the RTSP sink (or RTSP server) internally use TCP for transmission?

  2. Do you think the reason for the bad video quality on the webserver is that two receivers are connected to one RTSP sink? Or are we missing something? Below is the config that we used for deepstream-app.

################################################################################
# Copyright (c) 2019-2020, NVIDIA CORPORATION. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
################################################################################

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
# gie-kitti-output-dir=./streamscl
# kitti-track-output-dir=./kitti_track

[tiled-display]
enable=1
rows=1
columns=2
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=4
uri=rtsp://192.168.17.100:554/profile2/media.smp
num-sources=1
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0
smart-record=2
smart-rec-dir-path=/opt/nvidia/deepstream/deepstream-6.0/sources/objectDetector_Yolo/sr
smart-rec-file-prefix=sr0
smart-rec-cache=60
#smart-rec-default-duration= 30

[source1]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=4
uri=rtsp://192.168.17.101:554/profile2/media.smp
num-sources=1
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0
smart-record=2
smart-rec-dir-path=/opt/nvidia/deepstream/deepstream-6.0/sources/objectDetector_Yolo/sr
smart-rec-file-prefix=sr1
smart-rec-cache=60

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
nvbuf-memory-type=0
overlay-id=1



[sink1]
enable=1
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
#iframeinterval=10
bitrate=2000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=2
source-id=0
#when run deepstream-app 
output-file=test.mp4
width=1280
height=720


[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=2560000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=4
# set below properties in case of RTSPStreaming
rtsp-port=8123
udp-port=5400
udp-buffer-size=2560000
width=1280
height=720


[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=2
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1280
height=720
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
#model-engine-file=model_b1_gpu0_int8.engine
labelfile-path=labels.txt
batch-size=2
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=2
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV3.txt

[tracker]
enable=1
# For NvDCF and DeepSORT tracker, tracker-width and tracker-height must be a multiple of 32, respectively
tracker-width=640
tracker-height=384
ll-lib-file=/opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_nvmultiobjecttracker.so
# ll-config-file required to set different tracker types
# ll-config-file=../../samples/configs/deepstream-app/config_tracker_IOU.yml
ll-config-file=../../samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml
# ll-config-file=../../samples/configs/deepstream-app/config_tracker_NvDCF_accuracy.yml
# ll-config-file=../../samples/configs/deepstream-app/config_tracker_DeepSORT.yml
gpu-id=0
enable-batch-process=1
enable-past-frame=1
display-tracking-id=1

[tests]
file-loop=0

Thanks.

1 About “However it seems like the deepstream-app only uses udp sink for rtsp streaming, How could this work?”: do you mean how deepstream-app receives the RTSP stream? For the deepstream-app source, “select-rtp-protocol=4” enables rtsp_transport tcp. deepstream-app is open source; you can find select_rtp_protocol in create_rtsp_src_bin. For the DeepStream sink, how data is sent depends on the client’s protocol request.
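For the receiving side, the property mentioned above goes in the source group of the deepstream-app config; a minimal sketch, reusing the [source0] group from the config above:

```ini
[source0]
enable=1
type=4
uri=rtsp://192.168.17.100:554/profile2/media.smp
# 4 = force TCP for the RTP transport
# (see select_rtp_protocol in create_rtsp_src_bin in the deepstream-app sources)
select-rtp-protocol=4
```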
2 We need to narrow down the issue. Does sink1’s output file play well? If you change sink2’s output to a file, does that play well?

Regarding “However it seems like the deepstream-app only uses udp sink for rtsp streaming, How could this work?”, I meant how deepstream-app sends the RTSP stream. From your answer, my understanding is that if we use ffmpeg with the rtsp_transport tcp option, ffmpeg will be able to receive deepstream-app’s RTSP stream over a TCP connection, right? Did I misunderstand something?

Also, regarding “does sink1’s file play well? If sink2’s output is changed to a file, does it play well?”, the answer is yes for both. The video saved by sink1 plays well, and sink2 outputs to a file properly.

1 Yes, there is an RTSP negotiation; the sending protocol depends on the client’s protocol request.
2 Regarding “quality of the video that is generated by Node.js server on the Jetson device is good enough”, please elaborate on your deployment. Do you mean ffmpeg pushes the RTSP source to Node.js, and the remote side then uses a player to play the RTSP stream from that Node.js server?
3 If the saved file plays well, the issue should be in the RTSP transfer path. Please try playing the video over UDP, and use ffmpeg or GStreamer to play the RTSP stream on the Jetson device to check whether the problem is related to the network.
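To compare the two transports directly, something like the following can be used (a sketch; the URL is a placeholder for the actual deepstream-app stream):

```shell
# Play over UDP (ffplay's default transport for RTSP):
ffplay -rtsp_transport udp rtsp://192.168.17.50:8123/ds-test

# Play over TCP for comparison:
ffplay -rtsp_transport tcp rtsp://192.168.17.50:8123/ds-test

# Or with GStreamer, forcing TCP via rtspsrc's protocols property:
gst-launch-1.0 rtspsrc location=rtsp://192.168.17.50:8123/ds-test protocols=tcp \
    ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```

If the UDP playback shows the same artifacts as the webserver while TCP playback is clean, that points at packet loss on the path rather than at the encoder.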

For 2: basically, the Node.js server runs on the Jetson (so DeepStream and Node.js run on the same device) for the local web service.
On the remote side, ffmpeg and rtsp-simple-server are used. The Jetson device and the remote server are connected over a VPN. The remote server first receives the RTSP stream using ffmpeg, then passes the RTSP packets to rtsp-simple-server. rtsp-simple-server converts the RTSP to HLS and displays the result on the web pages.
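The remote relay described above can be approximated as follows (a sketch; the VPN address, ports, and stream path names are placeholders):

```shell
# On the remote server: pull from deepstream-app over the VPN via TCP,
# then re-publish to the local rtsp-simple-server without re-encoding.
ffmpeg -rtsp_transport tcp -i rtsp://10.8.0.2:8123/ds-test \
       -c copy -f rtsp rtsp://127.0.0.1:8554/dashboard
```

Since `-c copy` only remuxes, this hop should not itself degrade quality; any artifacts would have to come from the hop between deepstream-app and ffmpeg.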

The video on the local side (the output of the Node.js server running on the Jetson Xavier NX board) looks great, but the video on the remote server is badly broken up.
I was assuming this is because the RTSP stream is sent via UDP rather than TCP, which causes packet loss.

Regarding “the quality of video in webserver is not that much good.”, there are some questions:
1 Does rtsp-simple-server get the RTSP video from DeepStream or from the Node.js server? If from the Node.js server, it does not seem to be a DeepStream issue; if from DeepStream, it should be a network issue, because the video on the local side (the output of the Node.js server running on the Jetson NX board) looks great.
2 Do the Node.js server and rtsp-simple-server transcode? Transcoding lowers video quality.

  1. The rtsp-simple-server gets the RTSP stream from DeepStream. And yes, I also think this might be related to the network.

  2. Neither the Node.js server nor rtsp-simple-server transcodes.

  1. You can use ffmpeg or a GStreamer tool to play the RTSP stream directly on the remote server. If it looks bad, check the network; if it looks good, check rtsp-simple-server.
  2. You can use a network tool to check. Regarding “I was assuming that this is because the rtsp is sent via UDP not TCP, which makes the losses.”: UDP is usually used to send video data because the video bitrate is high.
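One way to check the VPN link for UDP loss at roughly the stream’s bitrate (a sketch; the VPN address is a placeholder, and iperf3 must be installed on both ends):

```shell
# On the Jetson: start an iperf3 server.
iperf3 -s

# On the remote server: send ~2.5 Mbit/s of UDP traffic for 10 seconds
# over the VPN; the summary reports the datagram loss percentage.
iperf3 -c 10.8.0.2 -u -b 2.5M -t 10
```

A loss rate of even a few percent at this bitrate is enough to visibly corrupt an H.264 stream carried over RTP/UDP.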

Fair enough. Thanks for helping me!