Importing an ONNX model into TensorRT -- Jetson TX2

The ONNX page says it is supported by NVIDIA TensorRT, but I can't find any documentation or tutorial describing how to import an ONNX model into TensorRT.
The TensorRT 3 user guide has documentation on importing a UFF model into TensorRT, but nothing on importing an ONNX model.

I would really appreciate any help on this topic.

Thanks in advance

Hi yogesh111, support for ONNX will be coming in a future version of TensorRT.

Hi dusty_nv, can you estimate when the next version of TensorRT with ONNX support will be ready? Our team is trying to deploy a PyTorch model in TensorRT, and we are wondering whether we should wait for the next version (since PyTorch can export to ONNX) or write our own PyTorch parser now.

@hengcherkeng, at this time we are unable to say when the Jetson release will be; however, ONNX can be tested in the NVIDIA GPU Cloud container, see: