Hi
I’m trying to run object detection using the SSD sample. I followed the steps in the README in the sample folder, generated the UFF file for my model, compiled the custom library, and then ran deepstream-app -c deepstream_app_config_ssd.txt.
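For context, these are roughly the steps I followed from the README (the converter path, output-node name, CUDA version, and file names below are what I believe I used and may differ slightly on other setups):

$ python3 /usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py \
      frozen_inference_graph.pb -O NMS -p config.py -o sample_ssd_relu6.uff
$ export CUDA_VER=10.0
$ make -C nvdsinfer_custom_impl_ssd
$ deepstream-app -c deepstream_app_config_ssd.txt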
That run fails with the following error:
Using winsys: x11
Creating LL OSD context new
0:00:15.773741240 4337 0x20feef30 ERROR nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:log(): INVALID_ARGUMENT: Can not find binding of given name
0:00:15.773851033 4337 0x20feef30 WARN nvinfer gstnvinfer.cpp:515:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:checkEngineParams(): Could not find output layer 'MarkOutput_0' in engine
Runtime commands:
h: Print this help
q: Quit
p: Pause
r: Resume
**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
** INFO: <bus_callback:189>: Pipeline ready
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
Creating LL OSD context new
** INFO: <bus_callback:175>: Pipeline running
**PERF: 6.84 (6.84)
0:00:17.179839501 4337 0x20b7d720 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary_gie_classifier> error: Internal data stream error.
0:00:17.179947211 4337 0x20b7d720 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary_gie_classifier> error: streaming stopped, reason error (-5)
ERROR from primary_gie_classifier: Internal data stream error.
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1830): gst_nvinfer_output_loop (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie_classifier:
streaming stopped, reason error (-5)
Quitting
ERROR from osd_queue: Internal data stream error.
Debug info: gstqueue.c(988): gst_queue_handle_sink_event (): /GstPipeline:pipeline/GstBin:processing_bin_0/GstBin:osd_bin/GstQueue:osd_queue:
streaming stopped, reason error (-5)
App run failed
I’m trying to stream the network output over RTSP (similar to what’s shown in the sample notebooks), but I’m unable to do so. Can you please help me with this? I’m also attaching my deepstream_app_config_ssd.txt below, followed by a sketch of the config_infer_primary_ssd.txt it references.

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=1
gie-kitti-output-dir=streamscl

[tiled-display]
enable=0
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=2
num-sources=15
uri=file://../../samples/streams/sample_1080p_h264.mp4
gpu-id=0
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
sync=0
#iframeinterval=10
bitrate=2000000
output-file=out.mp4
source-id=0

[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
#udp-port=5400

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=0
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
batch-size=1
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
gie-unique-id=1
interval=0
labelfile-path=ssd_coco_labels.txt
model-engine-file=sample_ssd_relu6.uff_b1_fp32.engine
config-file=config_infer_primary_ssd.txt
nvbuf-memory-type=0
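
For reference, my config_infer_primary_ssd.txt is based on the one shipped with the SSD sample; I’m reproducing the relevant keys from memory, so some values may differ from my actual file:

[property]
gpu-id=0
net-scale-factor=0.0078431372
offsets=127.5;127.5;127.5
model-color-format=0
model-engine-file=sample_ssd_relu6.uff_b1_fp32.engine
labelfile-path=ssd_coco_labels.txt
uff-file=sample_ssd_relu6.uff
uff-input-dims=3;300;300;0
uff-input-blob-name=Input
batch-size=1
network-mode=0
num-detected-classes=91
interval=0
gie-unique-id=1
is-classifier=0
output-blob-names=MarkOutput_0
parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so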
Thanks a lot!