TensorRT 8.5.1.7 does not support TopK with a dynamic K; this version of TensorRT only supports input K as an initializer

Description

When I try to convert an ONNX model to TensorRT, I get an error such as:

[04/26/2023-11:27:14] [E] [TRT] ModelImporter.cpp:732: ERROR: ModelImporter.cpp:168 In function parseGraph:
[6] Invalid Node - TopK_573
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy: TensorRT/tools/Polygraphy/examples/cli/surgeon/02_folding_constants at master · NVIDIA/TensorRT · GitHub
[04/26/2023-11:27:14] [E] Failed to parse onnx file

I then used the Polygraphy tool to process the ONNX model as described in the link above and tried converting the model again, but the problem persists.
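For reference, the constant-folding step from the linked example is a single Polygraphy command (the filenames here are placeholders, not from the thread):

```shell
# Fold constants in the ONNX graph so that computed inputs such as
# TopK's K become initializers where possible.
# model.onnx / folded.onnx are placeholder filenames.
polygraphy surgeon sanitize model.onnx --fold-constants -o folded.onnx
```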

Does anyone know how to fix this problem?

Environment

TensorRT Version: 8.5.1.7
GPU Type: 3090 24G
Operating System + Version: Windows

Hi,
Please share the ONNX model and the script, if you have not already, so that we can assist you better.
In the meantime, you can try a few things:

  1. Validate your model with the snippet below:

check_model.py

import sys
import onnx

filename = sys.argv[1]  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
  2. Try running your model with the trtexec command.
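A minimal trtexec invocation for step 2 (model.onnx is a placeholder filename):

```shell
# Attempt to build a TensorRT engine from the ONNX model;
# --verbose prints parser details that are useful for debugging.
trtexec --onnx=model.onnx --verbose
```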

If you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

Thank you for your help.

This is my onnx model:
rtmdetnano.onnx (3.8 MB)

Did you solve the parsing problem?

Use the latest TensorRT version, 8.6 or newer.
