Assertion failed: node.output().size() == 1 && "TensorRT does not support the indices output in MaxPool!"

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson TX2
• DeepStream Version: DeepStream 5.1
• JetPack Version (valid for Jetson only): JetPack 4.5
• TensorRT Version: TensorRT 7.1.3

I have a PyTorch → ONNX converted model (.onnx file). When I try to run inference with nvinfer in DeepStream, the pipeline stops with the assertion failure shown in the title. Before this error, I also get the warning "Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32". Requesting help in resolving this issue.
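For context, from what I have read, this assertion is usually triggered when a MaxPool layer was exported with return_indices=True, which gives the ONNX MaxPool node a second (indices) output. A minimal sketch that reproduces such a two-output node (the module and input shape are illustrative, not my actual model):

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # return_indices=True makes the exported ONNX MaxPool node carry
        # a second (indices) output, which TensorRT's parser rejects
        self.pool = nn.MaxPool2d(2, return_indices=True)

    def forward(self, x):
        return self.pool(x)

torch.onnx.export(Net(), torch.randn(1, 3, 224, 224), "model.onnx")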

Hi,

First, it’s recommended to upgrade your environment to JetPack 4.6 (TensorRT 8.0) and DeepStream 6.0.
There are some improvements and newly supported layers.

For your error, please try your model with the trtexec binary as well.

$ /usr/src/tensorrt/bin/trtexec --onnx=[your/model]
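If the same assertion appears, adding the --verbose flag prints the parser log for every node, which helps locate the exact layer being rejected:

$ /usr/src/tensorrt/bin/trtexec --onnx=[your/model] --verbose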

If you find the same error from TensorRT, please check whether all the layers used in your model are supported by the ONNX parser (see the support matrix in the TensorRT documentation).

Thanks.

The same issue persists even after upgrading the JetPack version and running $ /usr/src/tensorrt/bin/trtexec --onnx=[your/model]. I get the error at a MaxPool2D layer, which appears to be supported in TensorRT. Is there any alternative way of debugging this, please?
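For reference, I located the failing node by inspecting the graph with the onnx Python package (a quick sketch; model.onnx is a placeholder for my file name):

import onnx

model = onnx.load("model.onnx")
for node in model.graph.node:
    # TensorRT's ONNX parser only accepts MaxPool nodes with a single
    # output; a second output carries the pooling indices
    if node.op_type == "MaxPool" and len(node.output) > 1:
        print(node.name, list(node.output))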

Also, does TensorRT 8.0 support MaxUnpool2D?

Hi,

Based on our document below, 2D pooling is supported but unpooling is not.

Does your model meet the following requirement?

Conditions And Limitations

The number of input kernel dimensions determines 2D or 3D. For 2D pooling, input and output tensors should have three or more dimensions. For 3D pooling, input and output tensors should have four or more dimensions.
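For example, a standard NCHW input such as (1, 3, 224, 224) has four dimensions, so it already meets the 2D pooling condition.

As for MaxUnpool2D: since unpooling is not supported, a workaround that is sometimes used (a suggestion, not an official recommendation) is to replace it with plain interpolation, which exports to an ONNX Resize node that TensorRT can parse. A minimal sketch, assuming nearest-neighbor upsampling is an acceptable substitute for index-based unpooling in your model:

import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 56, 56)
# instead of F.max_unpool2d(x, indices, kernel_size=2), approximate the
# upsampling with nearest-neighbor interpolation; this discards the
# indices but doubles the spatial size the same way
out = F.interpolate(x, scale_factor=2, mode="nearest")
print(out.shape)  # torch.Size([1, 3, 112, 112])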

Thanks.

Thank you. I was able to solve it through other means.
