TensorRT documentation broken link and ONNX compatibility question


I was trying to import an ONNX model into TensorRT, specifically the Faster R-CNN model, which has ONNX version 1.5 and opset version 10. Here are the links:



In the TensorRT documentation, section 2.2.5 states that opset versions are supported only up to version 7, and it gives a broken link to the "ONNX Model Opset Version Converter".

Could I import my Faster R-CNN model with my current versions?
ONNX version: 1.5
Opset version: 10
TensorRT version =
CUDA version: 10.1
cuDNN version: 7.6.3
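One quick way to see whether a given TensorRT build can parse the model is to feed it to trtexec, which ships with TensorRT; a minimal sketch (the model file name is a hypothetical placeholder, and the trtexec location varies by install):

```shell
# trtexec is typically found under /usr/src/tensorrt/bin on Jetson and
# standard TensorRT installs; the .onnx path below is a placeholder.
trtexec --onnx=faster_rcnn.onnx --verbose
```

If the parser rejects an operator, the verbose log names it, which tells you whether the problem is the opset version or an unsupported op.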

I also read comments saying that some people convert the model from ONNX to Caffe and load that instead; could this be an option?
The JetPack 4.4 release could potentially solve some of my issues; any hint on when it will be released?

Thank you for your time

Thanks for pointing out the issue in the documentation. You can refer to the link below:

Yes, I think you can import your Faster R-CNN model with those versions. Please let us know if you face any issues.

That is also an option; please refer to the sample below:

You can also use the ONNX parser and implement custom layers (plugins) for any unsupported operations.

Please stay tuned for NVIDIA announcements.
