TensorFlow for DeepStream 5.0

Is there any way to use a TensorFlow model in DeepStream?
TensorFlow —> ONNX —> TensorRT —> DeepStream?

YES.
1. Please convert your model into ONNX format with tf2onnx or keras2onnx.
2. Please feed it into DeepStream with the onnx-file argument.
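For step 2, the onnx-file setting goes in the [property] group of the nvinfer config. A minimal sketch (the file names below are placeholders, not from this thread; the model is first exported with e.g. `python -m tf2onnx.convert --saved-model <dir> --opset 11 --output model.onnx`):

```ini
[property]
# Path to the exported ONNX model (placeholder path)
onnx-file=model.onnx
# Serialized engine that nvinfer writes on first run (placeholder name)
model-engine-file=model.onnx_b1_gpu0_fp16.engine
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
```

On the first run nvinfer builds a TensorRT engine from the ONNX file, which can take a while; subsequent runs load the cached engine file directly.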

You can find more details in our documentation here:
https://docs.nvidia.com/metropolis/deepstream/plugin-manual/index.html#page/DeepStream%20Plugins%20Development%20Guide/deepstream_plugin_details.3.01.html#wwpID0E0REB0HA

Thanks.


Thanks @AastaLLL

Hi,

We are following a similar TensorFlow —> ONNX —> DeepStream path but are unable to run the ONNX file in DeepStream 5.

We converted the TF2 pretrained model SSD_MobileNet_V2_320x320 using tf2onnx (opset 11).

The ONNX conversion completes fine, but when we run the file in DeepStream 5 with the onnx-file argument, we get the following error:

0:00:00.463757533 830896 0x7f21fc002300 INFO nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1591> [UID = 1]: Trying to create engine from model files

Input filename: /mnt/sda5/Checkpoints/onnx/model.onnx
ONNX IR version: 0.0.6
Opset version: 11
Producer name: tf2onnx
Producer version: 1.6.2
Domain:
Model version: 0
Doc string:

Unsupported ONNX data type: UINT8 (2)
ERROR: image_tensor:0:189 In function importInput:
[8] Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype)
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:390 Failed to parse onnx file
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:971 failed to build network since parsing model errors.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:872 failed to build network.
0:00:02.364467310 830896 0x7f21fc002300 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1611> [UID = 1]: build engine file failed
0:00:02.364486983 830896 0x7f21fc002300 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1697> [UID = 1]: build backend context failed
0:00:02.364520097 830896 0x7f21fc002300 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1024> [UID = 1]: generate backend failed, check config file settings
0:00:02.364549920 830896 0x7f21fc002300 WARN nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary_gie> error: Failed to create NvDsInferContext instance
0:00:02.364554399 830896 0x7f21fc002300 WARN nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary_gie> error: Config file path: /home/alexc/workspace/craddle/config/work/face-mask/infer_primary.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
** ERROR: <set_display:25>: Failed to set pipeline to PAUSED
** INFO: <bus_callback:163>: Pipeline ready

ERROR from primary_gie: Failed to create NvDsInferContext instance
Debug info: gstnvinfer.cpp(781): gst_nvinfer_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie:
Config file path: /home/alexc/workspace/craddle/config/work/face-mask/infer_primary.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED

Hi pushkar.chatterji,

Please open a new topic for your issue. Thanks.