Is it possible to use a Dahua IP camera with nvidia Deepstream?

Hi,
I have this camera:
brand : Dahua
model : IPC-HDW1230T1-S4

I have been trying for a few days to use it with DeepStream but I get a black window display. At the beginning I thought that the problem had to do with the string of characters that are in the RTSP URI, which are:

rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1&subtype=1

But I can run that URI with a gst-launch command if I replace the “&” character with “/”, “+” or “:”, and those variants do work, like this:

gst-launch-1.0 uridecodebin uri=rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1/subtype=1 ! autovideosink

gst-launch-1.0 uridecodebin uri=rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1+subtype=1 ! autovideosink

gst-launch-1.0 uridecodebin uri=rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1:subtype=1 ! autovideosink
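Regarding the substituted characters: in an interactive shell a bare `&` ends the command and runs it in the background, which truncates the URI at `channel=1`. Quoting the URI should let the original `&` through unchanged; a sketch, using the same camera address as above:

```shell
# Quote the URI so the shell does not treat '&' as its background operator;
# the camera then receives the query string exactly as written.
gst-launch-1.0 uridecodebin \
  uri="rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1&subtype=1" \
  ! autovideosink
```

In a config file no shell is involved, so the `&` can usually be written literally there.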

They display the video stream; I can see it on the Jetson Nano with GStreamer. But I cannot see it with DeepStream (it opens the DeepStreamTest5App window, but it is all black). I know that I am editing the correct config file, because I have tested it with the BigBuckBunny RTSP URI and with my iPhone acting as an IP camera, and those URIs do work with the DeepStream app.
I’m using this repository

Thank you

Hi,
Please check if you can run deepstream-test3:

/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3

Or modify one of the sample config files in

/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app

to use type=2 or type=4 in the source section.
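For reference, a minimal [source0] sketch for an RTSP source (type=4); the URI here is just the camera address from the first post, used as an example:

```
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP 5=CSI
type=4
uri=rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1/subtype=1
```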
For more information, please take a look at developer guide:
DeepStream Reference Application - deepstream-app — DeepStream 5.0 documentation (nvidia.com)

Hi,
I can run deepstream-test3-app in:

/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3

with:

$ ./deepstream-test3-app rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1/subtype=1

It works; it displays the video stream.
What should I try next?


It's done.

Hi,
So it should work in deepstream-app. You can modify the config file to launch the camera. Here is a reference:

Hi,
I can run deepstream-app and deepstream-test5-app, but only with sinks of type 2 and 3:

  • type=2 (EGL based windowed sink (nveglglessink))
  • type=3 (Encode + File Save (encoder + muxer + filesink))

But I want to use sink type=6, which, per the tutorial's config file, has this form:

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
sync=0
msg-conv-config=msgconv_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM   - Custom schema payload
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_azure_edge_proto.so
topic=mytopic
#Optional:
#msg-broker-config=../../../../libs/azure_protocol_adaptor/module_client/cfg_azure.txt

I run:

$ docker restart NVIDIADeepStreamSDK

but I get the black window display when using the URI of my IP camera, whereas with the BigBuckBunny URI it displays correctly.

What might be causing this problem?

Hi,
So you have [sink0] set to type=2 and [sink1] set to type=6, and it does not display correctly in nveglglessink. Is this correct?

Yes, that is correct; I have both enabled, like this:

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink
type=2
sync=0
source-id=0
gpu-id=0
qos=0
nvbuf-memory-type=0
overlay-id=1

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
sync=0
msg-conv-config=msgconv_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM   - Custom schema payload
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_azure_edge_proto.so
topic=mytopic
#Optional:
#msg-broker-config=../../../../libs/azure_protocol_adaptor/module_client/cfg_azure.txt

Hi,
Would like to get more information. Please help do the tests:

  1. Try type=5 in [sink0] + type=6 in [sink1]. We would like to know if nvoverlaysink works.
  2. Instead of the Dahua IP camera, use test-mp4 to launch an RTSP server:
    gst-rtsp-server/test-mp4.c at master · GStreamer/gst-rtsp-server · GitHub
$ ./test-mp4 sample_1080p_h264.mp4

Probably it is not specific to the Dahua IP camera.
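For the first test, a minimal [sink0] sketch with nvoverlaysink (type=5); the overlay-id value is an assumption and may need adjusting for your setup:

```
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink
type=5
sync=0
source-id=0
overlay-id=1
```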

Hi,
With the first test: it doesn't open any window.
With the second test: in the ~/gst-rtsp-server/examples directory I ran:

$ gcc test-mp4.c -o test-mp4 $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)

and then:

$ ./test-mp4 sample_1080p_h264.mp4

and it shows :

stream ready at rtsp://127.0.0.1:8554/test

Hi,
For the second test, if you run test-mp4 on the Jetson Nano, please set uri=rtsp://127.0.0.1:8554/test in the config file. This gets the stream from the local RTSP server instead of the Dahua IP camera.

And do you run JP4.4.1(r32.4.4+DS5.0.1)?

Hi,
You mean that, since I was able to run test-mp4 on the Jetson Nano (as shown in my previous answer), I should now change the URI in [source0] of my config file to uri=rtsp://127.0.0.1:8554/test, right?

Yes I’m running JP4.4.1(r32.4.4+DS5.0.1)

Hi,
You may run the RTSP server on the Jetson Nano or on another host PC in the LAN, and change the URI in the config file to get the video stream from the server instead of the IP camera.

Hi,
OK, I did, but I am still getting the black window display.

Hi,
So it is not specific to the IP camera. When setting up the server through test-mp4 and enabling two sinks (one type=2 and the other type=6), it cannot show the display. Is this correct?

Yes, that is correct. What should I do now?

Hi,
Please confirm you set live-source=1 in [streammux].

And please share full config file for reference.

Hi,
Yes, live-source=1 is set in [streammux].
This is my config file:

# Copyright (c) 2018 NVIDIA Corporation.  All rights reserved.
#
# NVIDIA Corporation and its licensors retain all intellectual property
# and proprietary rights in and to this software, related documentation
# and any modifications thereto.  Any use, reproduction, disclosure or
# distribution of this software and related documentation without an express
# license agreement from NVIDIA Corporation is strictly prohibited.

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=640
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP 5=CSI
type=4
#uri=rtsp://admin:12345678W@192.168.1.108:554/cam/realmonitor?channel=1/subtype=1
#uri=rtsp://192.168.1.7:8554/live
#uri=rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov
uri=rtsp://127.0.0.1:8554/test
rtsp-reconnect-interval-sec=60
#latency=1000
#type=5
#camera-width=3280
#camera-height=2464
#camera-fps-n=20
#camera-fps-d=1
#camera-csi-sensor-id=0
#gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink
type=2 
sync=0 
source-id=0 
gpu-id=0
qos=0
nvbuf-memory-type=0 
overlay-id=1

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
sync=0
msg-conv-config=msgconv_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM   - Custom schema payload
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_azure_edge_proto.so
topic=mytopic
#Optional:
#msg-broker-config=../../../../libs/azure_protocol_adaptor/module_client/cfg_azure.txt

[sink2]
enable=0
type=1
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
sync=0
bitrate=2000000
output-file=CustomVisionOut.mp4
source-id=0

[sink3]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=1500000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400

[osd]
enable=1
gpu-id=0
border-width=4
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Arial
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1280
height=720
## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
attach-sys-ts-as-ntp=1
# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.

[primary-gie]
enable=1
model-engine-file=../../CUSTOM_VISION_AI/model.onnx_b1_gpu0_fp32.engine
#Required to display the PGIE labels, should be added even when using config-file
#property
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
#Required by the app for SGIE, when used along with config-file property
gie-unique-id=1
config-file=config_infer_primary_CustomVisionAI.txt

[tracker]
enable=0
tracker-width=480
tracker-height=272
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_mot_klt.so
#ll-config-file required for DCF/IOU only
#ll-config-file=iou_config.txt
gpu-id=0

[tests]
file-loop=1

Hi,
Could you try deepstream-test5? We have verified this use case with the sample, so it is a bit strange that it fails for you.

Hi,
With deepstream-test5, in the apps/sample_apps/deepstream-test5 directory, I'm trying:

$ ./deepstream-test5-app -c configs/test5_config_file_src_infer.txt

and with

$ ./deepstream-test5-app -c configs/test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt

But I get this error in both cases:

** ERROR: <main:1451>: Failed to set pipeline to PAUSED
Quitting
ERROR from sink_sub_bin_sink2: Could not initialize supporting library.
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.c(359): legacy_gst_nvmsgbroker_start (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
unable to open shared library
ERROR from sink_sub_bin_sink2: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Debug info: gstbasesink.c(5265): gst_base_sink_change_state (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
Failed to start
App run failed
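The "unable to open shared library" message usually means the library named in msg-broker-proto-lib, or one of its dependencies, cannot be loaded. A quick check, assuming the Azure adaptor path from the config above:

```shell
# List the adaptor's shared-library dependencies; any line containing
# "not found" points at a missing dependency (grep prints nothing and
# exits non-zero when everything resolves).
ldd /opt/nvidia/deepstream/deepstream/lib/libnvds_azure_edge_proto.so | grep "not found"
```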

and with my config file:

$ ./deepstream-test5-app -c configs/DSConfig-CustomVisionAI.txt

I get this error:

Failed to load config file: No such file or directory
** ERROR: <gst_nvinfer_parse_config_file:1242>: failed

(deepstream-test5-app:21301): GLib-CRITICAL **: 11:34:24.358: g_strrstr: assertion 'haystack != NULL' failed
Error: Time:Fri Jan  8 11:34:24 2021 File:/home/nvidia/azure/azure-iot-sdk-c/iothub_client/src/iothub_client_core_ll.c Func:retrieve_edge_environment_variabes Line:177 Environment IOTEDGE_AUTHSCHEME not set
Error: Time:Fri Jan  8 11:34:24 2021 File:/home/nvidia/azure/azure-iot-sdk-c/iothub_client/src/iothub_client_core_ll.c Func:IoTHubClientCore_LL_CreateFromEnvironment Line:1186 retrieve_edge_environment_variabes failed
Error: Time:Fri Jan  8 11:34:24 2021 File:/home/nvidia/azure/azure-iot-sdk-c/iothub_client/src/iothub_client_core.c Func:create_iothub_instance Line:924 Failure creating iothub handle
ERROR: iotHubModuleClientHandle is NULL! connect failed
** ERROR: <main:1451>: Failed to set pipeline to PAUSED
Quitting
ERROR from sink_sub_bin_sink2: Could not configure supporting library.
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.c(388): legacy_gst_nvmsgbroker_start (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
unable to connect to broker library
ERROR from sink_sub_bin_sink2: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Debug info: gstbasesink.c(5265): gst_base_sink_change_state (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
Failed to start
App run failed
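For the last error: the Azure module-client adaptor expects to run inside the Azure IoT Edge runtime, which sets connection variables such as IOTEDGE_AUTHSCHEME (the one named in the log) in the module container's environment. Started outside that runtime, those variables are unset and the broker connect fails. A quick way to check before launching the app:

```shell
# IOTEDGE_AUTHSCHEME is provided by the IoT Edge runtime inside the module
# container; if it is empty here, the app is running outside that runtime.
if [ -z "${IOTEDGE_AUTHSCHEME}" ]; then
  echo "IOTEDGE_AUTHSCHEME not set: run inside the IoT Edge module container"
fi
```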