TensorRT 5.0.2.6 No importer registered for op: Gather

While parsing node number 110 [Gather -> "293"]:
--- Begin node ---
input: "292"
input: "291"
output: "293"
op_type: "Gather"
attribute {
  name: "axis"
  i: 0
  type: INT
}
doc_string: "/home/beiqi.qh/workplace/ssd.pytorch/ssd_resnet18.py(98): forward\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/nn/modules/module.py(465): _slow_forward\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/nn/modules/module.py(475): __call__\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/jit/__init__.py(146): forward\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/nn/modules/module.py(477): __call__\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/jit/__init__.py(115): get_trace_graph\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/onnx/utils.py(191): _trace_and_get_graph_from_model\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/onnx/utils.py(223): _model_to_graph\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/onnx/utils.py(280): _export\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/onnx/utils.py(104): export\n/home/beiqi.qh/workplace/opt/anaconda2/lib/python2.7/site-packages/torch/onnx/__init__.py(27): export\nPytorch2OnnxExport.py(18): <module>\n"

--- End node ---
ERROR: /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/release/5.0/parsers/onnxOpenSource/ModelImporter.cpp:142 In function importNode:
[8] No importer registered for op: Gather
ERROR: could not parse input engine.

Hello,

It appears you hit this:

NodeImportResult ModelImporter::importNode(::ONNX_NAMESPACE::NodeProto const& node,
                                           std::vector<TensorOrWeights>& inputs) {
  if( !_op_importers.count(node.op_type()) ) {
    return MAKE_ERROR("No importer registered for op: " + node.op_type(),
                      ErrorCode::kUNSUPPORTED_NODE);
  }

Looks like "Gather" is an unsupported node for the ONNX parser? Are you using the ONNX parser that came with TRT, or the open-source one (GitHub - onnx/onnx-tensorrt: ONNX-TensorRT: TensorRT backend for ONNX)?

I have the same problem and tried both the TRT 5.0.2 parser and the onnx-tensorrt parser, but no luck; both report No importer registered for op: Gather

Has there been any resolution to this? I get this with Slice as well.

Update: a quick fix for a PyTorch ResNet was to stop computing the batch size with tensor.size(), which seems to be what introduces these ops into the graph. Try hardcoding the batch size instead.
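For reference, here is a minimal sketch of that workaround (the module, shapes, and file name below are illustrative, not taken from the original model): replacing the traced tensor.size(0) call with a fixed batch size keeps the Shape/Gather ops out of the exported ONNX graph.

import torch
import torch.nn as nn

BATCH_SIZE = 1  # illustrative; set to the batch size you plan to run with

class Head(nn.Module):
    def __init__(self):
        super(Head, self).__init__()
        self.fc = nn.Linear(512, 10)

    def forward(self, x):
        # x = x.view(x.size(0), -1)   # traced size() call emits Shape/Gather nodes in ONNX
        x = x.view(BATCH_SIZE, -1)    # hardcoded batch size keeps those ops out of the graph
        return self.fc(x)

model = Head().eval()
dummy = torch.randn(BATCH_SIZE, 512, 1, 1)
torch.onnx.export(model, dummy, "head_fixed.onnx")

The trade-off is that the exported model is tied to that one batch size, which is usually acceptable for a TensorRT engine built with a fixed batch anyway.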

I ran into the same problem.

TensorRT 5.1 has support for the Gather op (https://docs.nvidia.com/deeplearning/sdk/tensorrt-release-notes/tensorrt-5.html#rel_5-1-2-RC)! So, upgrading to that may help you. Please report back if that works.

Hi guys, has this been solved? I am also hitting the same problem:
[8] No importer registered for op: Gather
ERROR: failed to parse onnx file

I run it on a TX2, but the latest release for the TX2 only supports TensorRT 5.0. I cannot install TensorRT 5.1 on the TX2 because those packages are compiled for the x86-64 architecture, not the arm64 architecture the TX2 uses. Please help me. Thanks in advance!

Hi @NVES, how can we tell whether we are using the ONNX parser that comes with TRT or not? In my case I am using TRT 7.0.0. I applied ONNX GraphSurgeon to replace the NMS nodes in the YOLOv3 model with the TRT plugin BatchedNMS_TRT, and then used trtexec to convert the updated ONNX model to a TRT engine, but I am getting the error [8] No importer registered for op: BatchedNMS_TRT even though the plugin is supported in TRT 7.0.0, see here:

[02/03/2021-22:41:50] [W] [TRT] /workspace/onnx-tensorrt/onnx2trt_utils.cpp:235: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[02/03/2021-22:41:50] [W] [TRT] /workspace/onnx-tensorrt/onnx2trt_utils.cpp:261: One or more weights outside the range of INT32 was clamped
[02/03/2021-22:41:50] [W] [TRT] /workspace/onnx-tensorrt/onnx2trt_utils.cpp:261: One or more weights outside the range of INT32 was clamped
While parsing node number 467 [BatchedNMS_TRT]:
ERROR: /workspace/onnx-tensorrt/ModelImporter.cpp:134 In function parseGraph:
[8] No importer registered for op: BatchedNMS_TRT
[02/03/2021-22:41:50] [E] Failed to parse onnx file
[02/03/2021-22:41:50] [E] Parsing model failed
[02/03/2021-22:41:50] [E] Engine creation failed
[02/03/2021-22:41:50] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=onnx-tensorrt/models/yolov3-10-with-plugin.onnx --saveEngine=/workspace/onnx-tensorrt/models/optimized_yolov3.trt --int8
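In case it helps others hitting this with plugin ops: whether BatchedNMS_TRT can be imported depends on the ONNX parser build, not just on the plugin existing in libnvinfer_plugin. Parser builds that support plugin fallback look unknown ops up in the plugin registry, so the registry has to be initialized before parsing. Below is a minimal sketch with the TensorRT Python API, assuming such a parser build; the model path is a placeholder.

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Register the built-in TensorRT plugins (BatchedNMS_TRT ships in libnvinfer_plugin)
# so a parser with plugin fallback can resolve the op by name.
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("yolov3-10-with-plugin.onnx", "rb") as f:  # placeholder path
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))

If the parser still reports "No importer registered for op: BatchedNMS_TRT", the parser build in use most likely predates plugin fallback and would need to be upgraded or rebuilt from a newer onnx-tensorrt.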