IoT TensorFlow models inference

Hi,

We are trying to run inference with IoT TensorFlow models (MobileNetV1-SSD and DeepLabV3). We have generated ONNX models for both, but when we try to convert them to TensorRT engines we get an error.

The ONNX models and error logs are attached. Could you please look into this and suggest a solution?



This looks TensorRT-related. We are moving this post to the TensorRT forum, and we will get back to you.

Thank you.


When we try from our end, we get the following error:

[01/10/2023-05:45:17] [E] [TRT] parsers/onnx/ModelImporter.cpp:729: --- End node ---
[01/10/2023-05:45:17] [E] [TRT] parsers/onnx/ModelImporter.cpp:731: ERROR: parsers/onnx/ModelImporter.cpp:168 In function parseGraph:
[6] Invalid Node - Postprocessor/BatchMultiClassNonMaxSuppression/map/while/MultiClassNonMaxSuppression/SortByField/TopKV2
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy:
[01/10/2023-05:45:17] [E] Failed to parse onnx file

Currently, TopK with dynamic K is not supported by TensorRT.

Please refer to the post below for more information.

Thank you.