I have a problem:
- Installed Jetson Inference with the SSD-Mobilenet-v2 and SSD-Inception-v2 networks in UFF format (conversion command shown below the log)
- Built objectDetector_SSD with make
- But after running deepstream-app I get this error:
/opt/nvidia/deepstream/deepstream-4.0/bin/deepstream-app -c deepstream_app_config_ssd.txt
Using winsys: x11
Creating LL OSD context new
0:00:01.199948324 18335 0x140b4d0 WARN nvinfer gstnvinfer.cpp:515:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:useEngineFile(): Failed to read from model engine file
0:00:01.200114890 18335 0x140b4d0 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:initialize(): Trying to create engine from model files
0:00:03.280244897 18335 0x140b4d0 ERROR nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:log(): Assertion failed: mConcatAxisID == 1 || mConcatAxisID == 2 || mConcatAxisID == 3
flattenConcat.cpp:29
Aborting...
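In case it matters, the UFF was generated roughly like this (a sketch of the usual command from the objectDetector_SSD README; I assume config.py is the one shipped with TensorRT's sampleUffSSD sample, and the exact paths on my box may differ):

convert-to-uff frozen_inference_graph.pb \
    -O NMS \
    -p /usr/src/tensorrt/samples/sampleUffSSD/config.py \
    -o sample_ssd_relu6.uff

(-p config.py is what maps the TensorFlow nodes to the TensorRT plugin ops such as GridAnchor, NMS and FlattenConcat, so I suspect it is related to the flattenConcat assertion.)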
My config is:
[property]
gpu-id=0
net-scale-factor=0.0078431372
offsets=127.5;127.5;127.5
model-color-format=0
model-engine-file=sample_ssd_relu6.engine
labelfile-path=ssd_coco_labels.txt
uff-file=sample_ssd_relu6.uff
uff-input-dims=3;300;300;0
uff-input-blob-name=Input
batch-size=1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
num-detected-classes=91
interval=0
gie-unique-id=1
is-classifier=0
output-blob-names=NMS
parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so
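For completeness, the custom bounding-box parser referenced by custom-lib-path was built as described in the sample README (a sketch; paths assume the default DeepStream 4.0 install on Jetson and CUDA_VER=10.0):

cd /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_SSD
export CUDA_VER=10.0
make -C nvdsinfer_custom_impl_ssd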