No importer registered for op: Gather ERROR: failed to parse onnx file

When I run tensorrt/samples/sampleOnnxMNIST/sampleOnnxMNIST.cpp, I get these errors:

Input filename: …/data/mnist/resnet18.onnx
ONNX IR version: 0.0.4
Opset version: 9
Producer name: pytorch
Producer version: 1.1
Domain:
Model version: 0
Doc string:

WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
While parsing node number 69 [Gather -> "192"]:
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/ModelImporter.cpp:142 In function importNode:
[8] No importer registered for op: Gather
ERROR: failed to parse onnx file

I was converting an ONNX model to TensorRT. The ONNX model comes from a PyTorch pretrained model: I tried the standard resnet50 and used the torch.onnx.export() function. Parsing this generated ONNX model always produces the above error, but parsing mnist.onnx, the official sample model, works fine.
Besides, I run this on a TX2, but the latest JetPack for the TX2 only supports TensorRT 5.0. I cannot install TensorRT 5.1 on the TX2 because those packages are compiled for the x86-64 architecture, not the arm64 architecture the TX2 uses. Please help me. Thanks in advance!

Hi,

The ONNX parser does not support Gather until TensorRT 5.1.
Please wait for our next JetPack release to get support for this layer.
You can find our future schedule on this topic:
https://devtalk.nvidia.com/default/topic/1055628/jetson-nano/new-jetson-software-modules-and-pricing/

Alternatively, you can try to compile the v5.1 ONNX parser for Jetson yourself.
However, there may be dependency issues between the parser (v5.1) and the inference (v5.0) libraries.
https://github.com/onnx/onnx-tensorrt/tree/8b52755beb781366f3bd2a34b3a0be0fcd4c0c33

Thanks.

Hi,

JetPack 4.2.1 has just been released.
It includes TensorRT 5.1 for the Jetson platform.

Thanks.