Quick way to test whether TensorRT supports an ONNX model


Is there a quick way to test whether TensorRT supports a given ONNX model, e.g. with trtexec?

Hi bigcat,

Yes, you can use trtexec to quickly test an ONNX model in TensorRT - see the example command below:

$ cd /usr/src/tensorrt/bin
$ ./trtexec --onnx=<path to the ONNX model> --fp16

Optionally, you can specify the name of the desired output layer with the --output argument.
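If you want to script this check over several models, a small wrapper around trtexec can help. Below is a minimal sketch; the trtexec path and the --onnx/--fp16/--output flags come from the example above, while the helper names and the subprocess-based invocation are my own illustration, not an official API:

```python
import shlex
import subprocess

TRTEXEC = "/usr/src/tensorrt/bin/trtexec"  # path from the example above

def build_trtexec_cmd(onnx_path, fp16=True, output=None):
    """Build the trtexec argument list for a quick ONNX support check."""
    cmd = [TRTEXEC, f"--onnx={onnx_path}"]
    if fp16:
        cmd.append("--fp16")
    if output:  # optional output-layer name, per the note above
        cmd.append(f"--output={output}")
    return cmd

def try_model(onnx_path, **kwargs):
    """Return True if trtexec parses the model and builds an engine (exit code 0)."""
    result = subprocess.run(build_trtexec_cmd(onnx_path, **kwargs))
    return result.returncode == 0

# Show the command line this produces for the example model:
print(shlex.join(build_trtexec_cmd("resnet.onnx")))
```

A nonzero exit code from trtexec is a quick signal that parsing or engine building failed, as in the Flatten error discussed later in this thread.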


I tried a resnet.onnx, but it failed as shown below. It seems the Flatten layer is not supported, yet I checked the TensorRT support matrix and it says the ONNX Flatten layer is supported.

aiib@aiib:/usr/src/tensorrt/bin$ sudo ./trtexec --onnx=resnet.onnx --fp16
onnx: resnet.onnx

Input filename: resnet.onnx
ONNX IR version: 0.0.5
Opset version: 10
Producer name: PaddlePaddle
Producer version:
Model version: 0
Doc string:

WARNING: ONNX model has a newer ir_version (0.0.5) than this parser was built against (0.0.3).
While parsing node number 174 [Flatten -> "@HUB_resnet_v2_50_imagenet@fc_0.w_0@flatten_0"]:
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/builtin_op_importers.cpp:755 In function importFlatten:
[8] Assertion failed: inputs.at(0).is_tensor()
failed to parse onnx file
Engine could not be created
Engine could not be created
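For context, that assertion likely fires because the first input to the Flatten node is not an activation tensor but a graph initializer (a weight) - the node name containing fc_0.w_0 hints that Flatten is applied to an FC weight here. A minimal sketch of how one might scan a graph for this pattern, using plain dicts to stand in for the ONNX graph (with the real onnx Python package you would iterate model.graph.node and model.graph.initializer instead; this diagnosis of the assertion is my reading of the error, not a confirmed statement from NVIDIA):

```python
def find_weight_fed_flattens(nodes, initializer_names):
    """Flag Flatten nodes whose first input is an initializer (a weight)
    rather than an activation tensor - the case the parser rejects."""
    flagged = []
    for node in nodes:
        if node["op_type"] == "Flatten" and node["inputs"][0] in initializer_names:
            flagged.append(node["name"])
    return flagged

# Toy graph mimicking the failing pattern: one Flatten fed by an FC weight,
# one Flatten fed by a normal activation tensor.
nodes = [
    {"name": "flatten_0", "op_type": "Flatten", "inputs": ["fc_0.w_0"]},
    {"name": "flatten_1", "op_type": "Flatten", "inputs": ["pool5_out"]},
]
initializers = {"fc_0.w_0"}
print(find_weight_fed_flattens(nodes, initializers))  # only flatten_0 is flagged
```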


I tried the resnet50 ONNX model from the ONNX model zoo on a Jetson Nano.
Here is the link (using the master tar file):

Using the same command given by dusty:
./trtexec --onnx=/home/nvidia/models/resnet50.onnx --fp16

Also, I tried vgg19 from the following link (using the master tar file):

./trtexec --onnx=/home/nvidia/models/vgg19.onnx --fp16

Both networks worked.
Please let us know.


I have checked, and the two ResNet models are not the same: mine has a Flatten layer. I have sent you the model; please help check it.
resnet.onnx.zip (90.8 MB)


Do we have some updates now?


Do we have some updates now?


Do you have some updates now?

This issue is blocking us from moving forward. Please share your solution, thanks.


Flatten is not supported in TensorRT by default,
but you can find the related plugin in our samples.
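If a plugin is more than you need, note that ONNX Flatten with axis=a is defined to produce a 2-D output whose first dimension is the product of the dims before a and whose second is the product of the rest, so the layer can often be replaced with an equivalent Reshape before export. A minimal sketch of that shape computation, for illustration only (the function name is mine):

```python
from math import prod  # Python 3.8+

def flatten_shape(dims, axis=1):
    """Output shape of ONNX Flatten: collapse dims before `axis` into the
    first output dim and the remaining dims into the second."""
    return (prod(dims[:axis]), prod(dims[axis:]))

# A ResNet-style pooled feature map flattened before the FC layer:
print(flatten_shape((1, 2048, 1, 1)))  # (1, 2048)
```

Exporting the model with a Reshape to this 2-D shape in place of Flatten avoids the unsupported op entirely.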