We are trying to run inference on IoT TensorFlow models (MobileNetV1 SSD and DeepLabV3). We have generated ONNX models for both of these, but when converting them to TensorRT engines we get an error.
The ONNX models and error logs are attached. Could you please look into it and suggest a solution?
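For reference, a conversion attempt with trtexec might look like the following (a minimal sketch; the file name model.onnx is a placeholder for the attached models):

```
# Attempt to build a TensorRT engine directly from the exported ONNX model.
# This is the step that fails with the parser error shown below.
trtexec --onnx=model.onnx --saveEngine=model.trt
```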
When we try from our end, we get the following error:
[01/10/2023-05:45:17] [E] [TRT] parsers/onnx/ModelImporter.cpp:729: --- End node ---
[01/10/2023-05:45:17] [E] [TRT] parsers/onnx/ModelImporter.cpp:731: ERROR: parsers/onnx/ModelImporter.cpp:168 In function parseGraph:
[6] Invalid Node - Postprocessor/BatchMultiClassNonMaxSuppression/map/while/MultiClassNonMaxSuppression/SortByField/TopKV2
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy: https://github.com/NVIDIA/TensorRT/tree/master/tools/Polygraphy/examples/cli/surgeon/02_folding_constants
[01/10/2023-05:45:17] [E] Failed to parse onnx file
Currently, TopK with dynamic K is not supported by TensorRT.
Please refer to the post below for more information.
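As the parser error suggests, one thing to try is constant folding with Polygraphy's surgeon tool, so that the K input of the TopKV2 node becomes an initializer. A minimal sketch, assuming the exported model is named model.onnx (adjust the file names to match your attachments):

```
# Fold constant subgraphs; if K is computable at build time,
# this turns the TopK K input into an initializer.
polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx

# Retry the TensorRT conversion on the folded model.
trtexec --onnx=model_folded.onnx --saveEngine=model.trt
```

If folding does not make K constant (i.e., K genuinely depends on runtime values), the graph itself would need to be modified, since dynamic K is unsupported as noted above.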