Hi, I have an SSD_Mobilenet_V2.pb. Kindly give me the easiest steps to port it to the DeepStream SDK so that I can test it with a sample pipeline using the nvinfer plugin.
**• Hardware Platform (Jetson / GPU)** T4
**• DeepStream Version** 5.0
**• JetPack Version (valid for Jetson only)**
**• TensorRT Version** 7.0
**• NVIDIA GPU Driver Version (valid for GPU only)** 450.36.06
@GalibaSashi
You can convert the TensorFlow model into ONNX format first, and then either deploy the ONNX model in DeepStream directly (DeepStream can build a TensorRT engine from the ONNX automatically), or convert the ONNX model into an engine separately via /usr/src/tensorrt/bin/trtexec and then deploy that engine in DeepStream.
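For example, assuming your frozen graph was exported with the TF Object Detection API's default tensor names (verify yours with a viewer such as Netron before converting; everything below is a sketch, not a verified command for your exact model), the two steps could look like this:

python -m tf2onnx.convert --graphdef SSD_Mobilenet_V2.pb --output ssd_mobilenet_v2.onnx --inputs image_tensor:0 --outputs detection_boxes:0,detection_scores:0,detection_classes:0,num_detections:0 --opset 10

/usr/src/tensorrt/bin/trtexec --onnx=ssd_mobilenet_v2.onnx --saveEngine=ssd_mobilenet_v2_b1_gpu0_fp32.engine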
To deploy the ONNX model or the TensorRT engine, specify it in the nvinfer config file like this:
onnx-file=/path/to/your/onnx_file.onnx
model-engine-file=/path/to/your/tensorrt_engine.engine
Here is a sample configuration (from the objectDetector_SSD sample, which uses a UFF model):
[property]
gpu-id=0
net-scale-factor=0.0078431372
offsets=127.5;127.5;127.5
model-color-format=0
model-engine-file=sample_ssd_relu6.uff_b1_gpu0_fp32.engine
labelfile-path=ssd_coco_labels.txt
uff-file=sample_ssd_relu6.uff
uff-input-dims=3;300;300;0
uff-input-blob-name=Input
batch-size=1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
num-detected-classes=91
interval=0
gie-unique-id=1
is-classifier=0
output-blob-names=MarkOutput_0
parse-bbox-func-name=NvDsInferParseCustomSSD
custom-lib-path=nvdsinfer_custom_impl_ssd/libnvdsinfer_custom_impl_ssd.so
#scaling-filter=0
#scaling-compute-hw=0
[class-attrs-all]
threshold=0.5
roi-top-offset=0
roi-bottom-offset=0
detected-min-w=0
detected-min-h=0
detected-max-w=0
detected-max-h=0
## Per class configuration
#[class-attrs-2]
#threshold=0.6
#roi-top-offset=20
#roi-bottom-offset=10
#detected-min-w=40
#detected-min-h=40
#detected-max-w=400
#detected-max-h=800
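For an ONNX model the UFF-specific keys are not needed; here is a minimal sketch of the [property] section (the file names are placeholders, and the bounding-box parser must match your converted model's output tensors, not the UFF sample's NMS output):

[property]
gpu-id=0
net-scale-factor=0.0078431372
offsets=127.5;127.5;127.5
model-color-format=0
onnx-file=ssd_mobilenet_v2.onnx
labelfile-path=ssd_coco_labels.txt
batch-size=1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
num-detected-classes=91
gie-unique-id=1
# parse-bbox-func-name / custom-lib-path must be adapted to the
# outputs of the converted ONNX model before detections will parse.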
I get an error when I run:
deepstream-app -c deepstream_app_config_ssd_frozen_onnx.txt
** ERROR: <parse_labels_file:242>: Failed to open label file '/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/ssd_labels.txt':No such file or directory
** ERROR: <parse_gie:1108>: Failed while parsing label file '/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/ssd_labels.txt'
** ERROR: <parse_gie:1125>: parse_gie failed
** ERROR: <parse_config_file:505>: parse_config_file failed
** ERROR: main:623: Failed to parse config file 'deepstream_app_config_ssd_frozen_onnx.txt'
Quitting
App run failed
bhmk@bhmk-PowerEdge-R740:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test$ mv ssd_coco_labels.txt ssd_labels.txt
bhmk@bhmk-PowerEdge-R740:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test$ deepstream-app -c deepstream_app_config_ssd_frozen_onnx.txt
Warn: 'threshold' parameter has been deprecated. Use 'pre-cluster-threshold' instead.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:1408 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/ssd_traffic.caffemodel_b30_gpu0_int8.engine open error
0:00:00.410308682 837 0x5623846fc780 WARN nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1566> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/ssd_traffic.caffemodel_b30_gpu0_int8.engine failed
0:00:00.410337573 837 0x5623846fc780 WARN nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1673> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/ssd_traffic.caffemodel_b30_gpu0_int8.engine failed, try rebuild
0:00:00.410347119 837 0x5623846fc780 INFO nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1591> [UID = 1]: Trying to create engine from model files
Input filename: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/ssd.frozen.onnx
ONNX IR version: 0.0.5
Opset version: 10
Producer name: tf2onnx
Producer version: 1.7.0
Domain:
Model version: 0
Doc string:
Unsupported ONNX data type: UINT8 (2)
ERROR: image_tensor:0:189 In function importInput:
[8] Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype)
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:390 Failed to parse onnx file
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:971 failed to build network since parsing model errors.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:872 failed to build network.
0:00:00.528346445 837 0x5623846fc780 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1611> [UID = 1]: build engine file failed
0:00:00.528381245 837 0x5623846fc780 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1697> [UID = 1]: build backend context failed
0:00:00.528394075 837 0x5623846fc780 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1024> [UID = 1]: generate backend failed, check config file settings
0:00:00.528442201 837 0x5623846fc780 WARN nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary_gie> error: Failed to create NvDsInferContext instance
0:00:00.528450128 837 0x5623846fc780 WARN nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary_gie> error: Config file path: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/config_ssd_frozen_onnx.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
** ERROR: main:651: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie: Failed to create NvDsInferContext instance
Debug info: gstnvinfer.cpp(781): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie:
Config file path: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_SSD/ONNX_SSD_Test/config_ssd_frozen_onnx.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
App run failed
Kindly help by giving a reference.
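For reference, the build fails at "Unsupported ONNX data type: UINT8 (2)": the exported image_tensor:0 input is uint8, which TensorRT's ONNX parser does not accept. A common workaround is to rewrite the graph input to float32 with the onnx Python package; this is a minimal sketch, assuming the usual TF Object Detection API layout where a Cast node directly follows the input (file names taken from the log above):

import onnx

model = onnx.load("ssd.frozen.onnx")

# TensorRT rejects UINT8 network inputs, so retype them as FLOAT.
# The Cast node the exporter places right after image_tensor:0
# accepts a float32 source, so the graph remains valid.
for graph_input in model.graph.input:
    tensor_type = graph_input.type.tensor_type
    if tensor_type.elem_type == onnx.TensorProto.UINT8:
        tensor_type.elem_type = onnx.TensorProto.FLOAT

onnx.save(model, "ssd.frozen.float32.onnx")

After that, point onnx-file= at the new file and remove the stale model-engine-file entry (the log above shows it still referencing ssd_traffic.caffemodel_b30_gpu0_int8.engine) so that DeepStream rebuilds the engine from the ONNX model.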