AGX Orin: ONNX → TensorRT does not support TopK

When I try to convert an ONNX model to TensorRT, I get an error like:
INVALID_NODE: Invalid Node - /rpn/TopK.
This version of TensorRT only supports input K as an initializer. Try applying constant folding on the model using Polygraphy: https://github.com/NVIDIA/TensorRT/tree/master/tools/Polygraphy/examples/cli/surgeon/02_folding_constants.

The error still exists after applying constant folding to the model using Polygraphy.
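For reference, this is the kind of folding command I ran (a sketch; `model.onnx` and `folded.onnx` are placeholder file names):

```shell
# Fold constant subgraphs so that inputs like K become initializers
polygraphy surgeon sanitize model.onnx --fold-constants -o folded.onnx
```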

Does anyone know how to fix it? Thanks.

The environment:
pytorch 1.13.0
tensorrt 8.5.3.1
JetPack 5.0.2

Hi,

There is a newer TensorRT 8.5 (JetPack 5.1) available.
Would you mind giving it a try?

Thanks.

After I updated JetPack 5.0.2 to 5.1, the error changed to:
Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)

Could you teach me how to fix this? Thanks.

The environment:
pytorch 1.13.0
tensorrt 8.5.3.1
JetPack 5.1-b147

Hi,

Sorry for the late update.

We want to reproduce this issue internally first.
Could you share the ONNX model (before/after constant folding) with us?

Thanks.

Since I have gone back to JetPack 5.0.2, I can't share an ONNX model produced on JetPack 5.1, but I'm happy to share the ONNX model produced on JetPack 5.0.2.
I also found that JetPack 5.0.2 + TensorRT 8.5 doesn't support TopK(), torchvision.ops.nms(), torchvision.ops.MultiScaleRoIAlign(), or torch.where() when converting the ONNX model to TensorRT.
These are operators I have to use in my model, and I don't know how to work around them. Could you help me? Thanks a lot.
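In case it helps others hitting the TopK error specifically: if constant folding does not turn the K input into an initializer, it can sometimes be pinned manually with ONNX GraphSurgeon. This is only a sketch under assumptions, the file names and the fixed K value of 1000 are placeholders, and it only applies when K is genuinely constant at runtime:

```python
import numpy as np
import onnx
import onnx_graphsurgeon as gs

# Load the (already constant-folded) model
graph = gs.import_onnx(onnx.load("folded.onnx"))

for node in graph.nodes:
    if node.op == "TopK":
        k_input = node.inputs[1]
        # If K is still a runtime tensor, replace it with a constant initializer.
        # 1000 is an assumed value; use whatever K your RPN actually needs.
        if not isinstance(k_input, gs.Constant):
            node.inputs[1] = gs.Constant(
                name=node.name + "_K",
                values=np.array([1000], dtype=np.int64),
            )

graph.cleanup().toposort()
onnx.save(gs.export_onnx(graph), "folded_fixed_k.onnx")
```

After this rewrite, TensorRT should see K as an initializer rather than a computed tensor, which is what the original error message asks for.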

Hi,

An ONNX model can be used across different JetPack versions.
We can test it on both JetPack 5.0.2 and JetPack 5.1.

Please let us know once the model is ready.
Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.