The following config file works, but it displays the streams in two separate windows:
# Copyright (c) 2019 NVIDIA Corporation. All rights reserved.
#
# NVIDIA Corporation and its licensors retain all intellectual property
# and proprietary rights in and to this software, related documentation
# and any modifications thereto. Any use, reproduction, disclosure or
# distribution of this software and related documentation without an express
# license agreement from NVIDIA Corporation is strictly prohibited.
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl
[tiled-display]
enable=0
rows=1
columns=2
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=3
uri=rtsp://db:db@192.168.3.33:554
num-sources=1
#drop-frame-interval=2
gpu-id=0
latency = 200
# (0): memtype_device - Memory type Device
# (1): memtype_pinned - Memory type Host Pinned
# (2): memtype_unified - Memory type Unified
cudadec-memtype=0
[source1]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=3
uri=rtsp://db:db@192.168.3.37:554
num-sources=1
#drop-frame-interval=2
gpu-id=0
latency = 200
# (0): memtype_device - Memory type Device
# (1): memtype_pinned - Memory type Host Pinned
# (2): memtype_unified - Memory type Unified
cudadec-memtype=0
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=0
source-id=0
gpu-id=0
qos=0
nvbuf-memory-type=0
overlay-id=1
[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
source-id=1
gpu-id=0
nvbuf-memory-type=0
[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0
[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=4
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0
# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b4_int8.engine
batch-size=4
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary.txt
[tracker]
enable=0
tracker-width=640
tracker-height=368
#ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_iou.so
#ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_nvdcf.so
ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_klt.so
#ll-config-file required for DCF/IOU only
#ll-config-file=tracker_config.yml
#ll-config-file=iou_config.txt
gpu-id=0
#enable-batch-process applicable to DCF only
enable-batch-process=1
[secondary-gie0]
enable=1
model-engine-file=../../models/Secondary_VehicleTypes/resnet18.caffemodel_b16_int8.engine
gpu-id=0
batch-size=16
gie-unique-id=4
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_vehicletypes.txt
[secondary-gie1]
enable=1
model-engine-file=../../models/Secondary_CarColor/resnet18.caffemodel_b16_int8.engine
batch-size=16
gpu-id=0
gie-unique-id=5
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_carcolor.txt
[secondary-gie2]
enable=1
model-engine-file=../../models/Secondary_CarMake/resnet18.caffemodel_b16_int8.engine
batch-size=16
gpu-id=0
gie-unique-id=6
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_carmake.txt
[tests]
file-loop=0
If I set “enable=1” in [tiled-display] and “enable=0” in [sink1], I get this error:
**PERF: FPS 0 (Avg) FPS 1 (Avg)
**PERF: 0.00 (0.00) 0.00 (0.00)
** INFO: <bus_callback:163>: Pipeline ready
** INFO: <bus_callback:149>: Pipeline running
Opening in BLOCKING MODE
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteBlockCreate : Block : BlockType = 261
Creating LL OSD context new
** INFO: <bus_callback:149>: Pipeline running
ERROR from tiled_display_tiler: GstNvTiler: FATAL ERROR; NvTiler::Composite failed
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvtiler/gstnvtiler.cpp(665): gst_nvmultistreamtiler_transform (): /GstPipeline:pipeline/GstBin:tiled_display_bin/GstNvMultiStreamTiler:tiled_display_tiler
Quitting
0:02:14.712615904 9267 0x27617540 WARN nvinfer gstnvinfer.cpp:1439:gst_nvinfer_process_objects:<secondary_gie_0> Untracked objects in metadata. Cannot infer on untracked objects in asynchronous mode.
0:02:14.712748192 9267 0x276174f0 WARN nvinfer gstnvinfer.cpp:1439:gst_nvinfer_process_objects:<secondary_gie_1> Untracked objects in metadata. Cannot infer on untracked objects in asynchronous mode.
0:02:14.712993440 9267 0x276174a0 WARN nvinfer gstnvinfer.cpp:1439:gst_nvinfer_process_objects:<secondary_gie_2> Untracked objects in metadata. Cannot infer on untracked objects in asynchronous mode.
0:02:14.751439968 9267 0x27616de0 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary_gie_classifier> error: Internal data stream error.
0:02:14.751494432 9267 0x27616de0 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary_gie_classifier> error: streaming stopped, reason error (-5)
ERROR from primary_gie_classifier: Internal data stream error.
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1830): gst_nvinfer_output_loop (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie_classifier:
streaming stopped, reason error (-5)
App run failed
I tried https://devtalk.nvidia.com/default/topic/1058226/deepstream-sdk/multi-rtsp-uri-on-jetson-nano/
but it doesn't work.
1. How can I change the config file so that the two IP cameras are displayed in one window?
2. The working config file above is very slow. How can I fix that?
Hi,
This requires tiled-display, but certain IP cameras cannot run with it. We are checking this.
The same error is reported in
https://devtalk.nvidia.com/default/topic/1058086/deepstream-sdk/how-to-run-rtp-camera-in-deepstream-on-nano/post/5367501/#5367501
You may adapt your settings to source12_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx2.txt. It uses a lightweight model:
model-engine-file=../../models/Primary_Detector_Nano/resnet10.caffemodel_b12_fp16.engine
Also, please set sync=0 in [sink*].
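For reference, a rough sketch of how the sections in the config at the top of this thread would change to show both cameras in one tiled window, combining that config with the suggestions here. This is only a sketch, not a verified configuration: the tiler issue mentioned above may still prevent it from working with these IP cameras, and config_infer_primary_nano.txt / the Nano engine path are taken from the shipped sample configs, so adjust them if yours differ. The other sections ([source0], [source1], [streammux], ...) stay as they are:
[tiled-display]
enable=1
rows=1
columns=2
width=1280
height=720
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=0
[sink1]
enable=0
[primary-gie]
enable=1
model-engine-file=../../models/Primary_Detector_Nano/resnet10.caffemodel_b12_fp16.engine
interval=4
config-file=config_infer_primary_nano.txt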
Why does the RTSP source work fine with deepstream-app but not with deepstream-test3?
I get this error in deepstream-test3:
Now playing: rtsp://admin:password@192.168.30.61,
Using winsys: x11
Creating LL OSD context new
0:00:00.575908178 9969 0x558f93ec40 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]:initialize(): Trying to create engine from model files
0:00:25.371918051 9969 0x558f93ec40 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]:generateTRTModel(): Storing the serialized cuda engine to file at /opt/nvidia/deepstream/deepstream-4.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_int8.engine
Decodebin child added: source
Running...
Decodebin child added: decodebin0
Decodebin child added: rtph264depay0
Decodebin child added: h264parse0
Decodebin child added: capsfilter0
Decodebin child added: nvv4l2decoder0
Seting bufapi_version
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
In cb_newpad
Creating LL OSD context new
Frame Number = 0 Number of objects = 8 Vehicle Count = 8 Person Count = 0
Timestamp: 159397413
ERROR from element nvtiler: GstNvTiler: FATAL ERROR; NvTiler::Composite failed
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvtiler/gstnvtiler.cpp(665): gst_nvmultistreamtiler_transform (): /GstPipeline:dstest3-pipeline/GstNvMultiStreamTiler:nvtiler
Returned, stopping playback
0:00:25.834905255 9969 0x558f3a8720 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary-nvinference-engine> error: Internal data stream error.
0:00:25.835119596 9969 0x558f3a8720 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary-nvinference-engine> error: streaming stopped, reason error (-5)
Frame Number = 1 Number of objects = 8 Vehicle Count = 8 Person Count = 0
Timestamp: 199397413
Deleting pipeline
Hi,
For clarity, please make a new post about deepstream-test3.
Hi,
My source12_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx2.txt is as follows:
[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=3
#uri=file://../../streams/sample_1080p_h264.mp4
uri=rtsp://****:****@169.254.5.64
num-sources=1
#drop-frame-interval=2
gpu-id=0
# (0): memtype_device - Memory type Device
# (1): memtype_pinned - Memory type Host Pinned
# (2): memtype_unified - Memory type Unified
cudadec-memtype=0
camera-width=640
camera-height=480
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=1
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=0
source-id=0
gpu-id=0
qos=0
nvbuf-memory-type=0
overlay-id=1
[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
sync=0
#iframeinterval=10
bitrate=2000000
output-file=out.mp4
source-id=0
[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400
[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0
[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0
# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=../../models/Primary_Detector_Nano/resnet10.caffemodel_b12_fp16.engine
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=4
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_nano.txt
[tracker]
enable=0
tracker-width=480
tracker-height=272
#ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_iou.so
ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_klt.so
#ll-config-file required for IOU only
#ll-config-file=iou_config.txt
gpu-id=0
[tests]
file-loop=0
When I run this command: deepstream-app -c source12_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx2.txt
I get the following output:
Using winsys: x11
Creating LL OSD context new
cb_sourcesetup set 100 latency
Runtime commands:
h: Print this help
q: Quit
p: Pause
r: Resume
NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.
**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
** INFO: <bus_callback:163>: Pipeline ready
** INFO: <bus_callback:149>: Pipeline running
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
** INFO: <bus_callback:149>: Pipeline running
Creating LL OSD context new
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
The display is black. Why?
Hi,
Please check if you are able to launch your rtsp source in a pipeline:
$ gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink
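The pipeline above assumes the camera sends H.264. If the camera streams H.265 instead (an assumption, since the codec has not been stated in this thread), the depay and parse elements would change accordingly:
$ gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph265depay ! h265parse ! nvv4l2decoder ! nvoverlaysink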
When I modify [sink2] as follows:
[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400
I got errors:
*** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***
** ERROR: <main:651>: Failed to set pipeline to PAUSED
Quitting
ERROR from sink_sub_bin_udpsink2: Could not get/set settings from/on resource.
Debug info: gstmultiudpsink.c(1278): gst_multiudpsink_configure_client (): /GstPipeline:pipeline/GstBin:processing_bin_0/GstBin:sink_bin/GstBin:sink_sub_bin2/GstUDPSink:sink_sub_bin_udpsink2:
Could not join multicast group: Error joining multicast group: No such device
ERROR from sink_sub_bin_udpsink2: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Debug info: gstbasesink.c(5265): gst_base_sink_change_state (): /GstPipeline:pipeline/GstBin:processing_bin_0/GstBin:sink_bin/GstBin:sink_sub_bin2/GstUDPSink:sink_sub_bin_udpsink2:
Failed to start
App run failed
If I didn't have the above errors, I'm sure I would be able to launch my RTSP source in a pipeline:
$ gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/ds-test ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink
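Side note on the errors above: the “Could not join multicast group: No such device” failure from sink_sub_bin_udpsink2 usually means no active network interface was available for the multicast route when the RTSP streaming sink started. A quick check, assuming the wired interface is named eth0 (an assumption, not from this thread):
$ ip addr show eth0
$ ip route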
Hi,
You should try your own rtspsrc:
gst-launch-1.0 rtspsrc location=rtsp://****:****@169.254.5.64 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink
not rtsp://127.0.0.1:8554/ds-test
I tried my rtspsrc:
gst-launch-1.0 rtspsrc location=rtsp://****:****@169.254.5.64 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink
It works well; it can open the IP camera.
Hi, has the issue of IP cameras not working with tiled-display been solved yet?