Deploy Deepstream 4.0 with Custom Dataset using DetectNet V2, FP32 mode, usb-camera, App run failed

I have successfully run tlt-converter and have my own ResNet engine. But when I tried to deploy it on DeepStream 4.0, it crashed with many errors.

(deepstream-app:19823): GStreamer-WARNING **: 18:22:29.754: Name 'src_cap_filter' is not unique in bin 'src_sub_bin0', not adding
Error: Could not parse labels file path
Failed to parse group property
** ERROR: <gst_nvinfer_parse_config_file:943>: failed
Creating LL OSD context new
0:00:00.240359016 19823     0x221eb6d0 WARN                 nvinfer gstnvinfer.cpp:658:gst_nvinfer_start:<primary_gie_classifier> error: Configuration file parsing failed
0:00:00.240422769 19823     0x221eb6d0 WARN                 nvinfer gstnvinfer.cpp:658:gst_nvinfer_start:<primary_gie_classifier> error: Config file path: /home/keeper/Desktop/Coba1/config_infer_primary1.txt
** ERROR: <main:651>: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie_classifier: Configuration file parsing failed
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(658): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie_classifier:
Config file path: /home/keeper/Desktop/Coba1/config_infer_primary1.txt
App run failed

Config file (config_infer_primary1.txt):

[property]
gpu-id=0
# preprocessing parameters.
net-scale-factor=0.0039215697906911373
model-color-format=0

# model paths.
labelfile-path=/home/keeper/Desktop/Coba1/labels.txt
tlt-encoded-model=/home/keeper/Desktop/Coba1/resnet18_detector.etlt
tlt-model-key=bjdtNHBlYXIwZ3Z2YW1scDg2ZHZzN3FkMXY6MTVhNDg1ZTYtNDUyNC00YTUwLTg0NWUtOTRhYWIzMDAzN2Vi
input-dims=3;720;1280;0 # where c = number of channels, h = height of the model input, w = width of model input, 0: implies CHW format.
uff-input-blob-name=input_1
batch-size=4 
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
num-detected-classes=3
interval=0
gie-unique-id=1
is-classifier=0
output-blob-names=output_cov/Sigmoid;output_bbox/BiasAdd
#enable_dbscan=0

[class-attrs-all]
threshold=0.2
group-threshold=1
## Set eps=0.7 and minBoxes for enable-dbscan=1
eps=0.2
#minBoxes=3
roi-top-offset=0
roi-bottom-offset=0
detected-min-w=0
detected-min-h=0
detected-max-w=0
detected-max-h=0
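One thing worth checking in this file: DeepStream reads nvinfer configs with a GLib key-file style parser, which does not accept a trailing comment after a value on the same line, so the inline comment on the input-dims line may be part of what trips the parser. A safer form (same values, comment moved to its own line) would be:

```ini
# c = number of channels, h = model input height, w = model input width,
# trailing 0 implies CHW format
input-dims=3;720;1280;0
```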

Config to run the usb-camera:

# Copyright (c) 2018 NVIDIA Corporation.  All rights reserved.
#
# NVIDIA Corporation and its licensors retain all intellectual property
# and proprietary rights in and to this software, related documentation
# and any modifications thereto.  Any use, reproduction, disclosure or
# distribution of this software and related documentation without an express
# license agreement from NVIDIA Corporation is strictly prohibited.

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=1
camera-width=1280
camera-height=720
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=Overlay
type=5
sync=0
display-id=0
offset-x=0
offset-y=0
width=0
height=0
overlay-id=1
source-id=0

[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265 3=mpeg4
codec=1
sync=0
bitrate=2000000
output-file=out.mp4
source-id=0

[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=Overlay
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400

[osd]
enable=1
border-width=2
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0

[streammux]
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1280
height=720

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
model-engine-file=/home/keeper/Desktop/Coba1/resnet18_detector.engine
#Required to display the PGIE labels, should be added even when using config-file
#property
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
#Required by the app for SGIE, when used along with config-file property
gie-unique-id=1
config-file=config_infer_primary1.txt

[tests]
file-loop=0
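Since the app reports "Configuration file parsing failed", a quick script like the following (a hypothetical helper, not part of DeepStream) can flag value lines that carry trailing inline comments, which the key-file parser used by DeepStream does not support:

```python
# Hypothetical sanity check: scan a DeepStream-style key-file config for
# values followed by an inline "#" comment on the same line, which the
# GLib key-file parser rejects.
def find_inline_comments(config_text):
    """Return (line_number, line) pairs where a value is followed by '#'."""
    offenders = []
    for n, line in enumerate(config_text.splitlines(), start=1):
        stripped = line.strip()
        # Skip blank lines, full-line comments, and [section] headers.
        if not stripped or stripped.startswith(("#", ";", "[")):
            continue
        if "=" in stripped and "#" in stripped.split("=", 1)[1]:
            offenders.append((n, stripped))
    return offenders

sample = """[property]
labelfile-path=/home/keeper/Desktop/Coba1/labels.txt
input-dims=3;720;1280;0 # c;h;w;0 means CHW
network-mode=0
"""
for n, line in find_inline_comments(sample):
    print(f"line {n}: {line}")
# → line 3: input-dims=3;720;1280;0 # c;h;w;0 means CHW
```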

Any idea how to solve it?

Hi yoshuakevin,
Could you please paste your running command? Thanks.
BTW, did you ever try deepstream-test1-app?

For how to deploy detectnet_v2’s etlt model with deepstream-test1-app, there are several related topics in the TLT forum.
https://devtalk.nvidia.com/default/topic/1065722/transfer-learning-toolkit/not-able-to-deploy-etlt-file-in-deepstream-test-app-1/post/5396850/#5396850
https://devtalk.nvidia.com/default/topic/1066887/transfer-learning-toolkit/tlt-converter-error-uffparser-and-nbsp-nvdsinfer-error-nvdsinfer_custom_lib_failed-deepstream/post/5403416/#5403416

Hello Morganh, here is my running command:

deepstream-app -c /home/keeper/Desktop/Coba1/source1_usb_dec_infer_resnet_fp32.txt

I have tried deepstream-test1-app; it runs perfectly.

Do I need calibration.bin for fp16/fp32 mode?

No, calibration.bin is not needed for fp16/fp32 mode.
More info in https://devtalk.nvidia.com/default/topic/1065558/transfer-learning-toolkit/trt-engine-deployment/

Okay, thank you Morganh. But I still can’t run my program; the error still bothers me. Do I have the right config_infer_primary1.txt?

I did not test the etlt model with deepstream-app. Can you try to figure out the difference in results between deepstream-app and deepstream-test1-app?
BTW, what’s the config file name for “program to run usb-camera” you mentioned?

Please refer to https://devtalk.nvidia.com/default/topic/1067600/deepstream-sdk/lack-of-fps-after-successfully-deploy-tlt-to-deepstream-/post/5407847/?offset=4#5407907 too.