TensorRT does not support TopK: "This version of TensorRT only supports input K as an initializer"


When I try to convert an ONNX model to TensorRT, I get the following error:

[04/26/2023-11:27:14] [E] [TRT] ModelImporter.cpp:732: ERROR: ModelImporter.cpp:168 In function parseGraph:
[6] Invalid Node - TopK_573
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy: TensorRT/tools/Polygraphy/examples/cli/surgeon/02_folding_constants at master · NVIDIA/TensorRT · GitHub
[04/26/2023-11:27:14] [E] Failed to parse onnx file

I then used the Polygraphy tool to fold constants in the ONNX model, following the link above, and tried the conversion again, but the problem persists.
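For reference, the constant-folding step from the linked example is Polygraphy's `surgeon sanitize` subcommand; the file names below are placeholders for your own model:

```shell
# Fold constants so the K input to TopK becomes an initializer,
# then write the sanitized model to a new file.
polygraphy surgeon sanitize rtmdetnano.onnx --fold-constants -o rtmdetnano_folded.onnx
```

If folding succeeds, re-run the TensorRT conversion on the folded model instead of the original.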

Does anyone know how to fix this problem?


TensorRT Version:
GPU Type: 3090 24G
Operating System + Version: Windows

Please share the ONNX model and the script, if not shared already, so that we can assist you better.
Meanwhile, you can try a few things:

  1. Validate your model with the snippet below.


import onnx

filename = "yourONNXmodel.onnx"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)  # raises if the model is invalid
  2. Try running your model with the trtexec command.

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
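A minimal trtexec invocation for step 2, assuming the model file attached in this thread:

```shell
# Attempt an engine build and print detailed parser/builder logs.
trtexec --onnx=rtmdetnano.onnx --verbose
```

The --verbose output includes the ONNX parser's per-node messages, which is what the log above was captured from.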

Thank you for your help.

This is my onnx model:
rtmdetnano.onnx (3.8 MB)

Did you solve the parsing problem?

Use the latest TensorRT version, 8.6+.
