No importer registered for op: GatherND, while converting ONNX to Engine


I’m trying to convert Keras model into Tensorrt Engine (Keras → ONNX → Engine).
The model contains tf.gather with batch_dims set to a non-zero value.

# I want to take, per batch, the params selected by indices.

# params: batch_size x N
params = tf.constant([['b11', 'b12', 'b13'], ['b21', 'b22', 'b23']])

# batch_size x N
indices = tf.constant([[0, 1], [1, 2]])

# collect relevant param per batch
tf.gather(params, indices, batch_dims=1)

# => [['b11', 'b12'], ['b22', 'b23']]
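For reference, here is a minimal pure-Python sketch (no TensorFlow needed, helper name is my own) of what tf.gather with batch_dims=1 computes:

```python
# Pure-Python sketch of tf.gather(params, indices, batch_dims=1):
# for every batch row b, pick params[b][i] for each i in indices[b].
def batched_gather(params, indices):
    return [[row[i] for i in idx] for row, idx in zip(params, indices)]

params = [['b11', 'b12', 'b13'], ['b21', 'b22', 'b23']]
indices = [[0, 1], [1, 2]]
print(batched_gather(params, indices))
# [['b11', 'b12'], ['b22', 'b23']]
```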

I’m able to convert the model to ONNX format with opset version 12, as mentioned here: onnx/ at master · onnx/onnx · GitHub.

I’m also a bit confused by the naming. Why does tf.gather correspond to the GatherND op in the ONNX graph? For example, GatherElements satisfies my need, but how can I make the converter use it instead of GatherND?

But I can’t convert the ONNX model to an Engine; the conversion fails with the following error:
No importer registered for op: GatherND. Attempting to import as plugin

As I understand from here, it is not supported by onnx-tensorrt: onnx-tensorrt/ at 7.2.1 · onnx/onnx-tensorrt · GitHub. If it is not implemented, what is the best way to solve this issue? Maybe replace it with a similar node, if one exists? Or just cut this part of the graph and do post-processing?

I’d be grateful for any advice :)


TensorRT Version:
GPU Type: 1050
Nvidia Driver Version: 470.63.01
CUDA Version: 11.4
Operating System + Version: 18.04
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): 1.15.5
ONNX: 1.7.0

Baremetal or Container (if container which image + tag):

Request you to share the ONNX model and the script, if not shared already, so that we can assist you better.
Alongside, you can try a few things:

  1. Validate your model with the below snippet:

import onnx
filename = yourONNXmodel
model = onnx.load(filename)
onnx.checker.check_model(model)

  2. Try running your model with the trtexec command.

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.


Thanks for response :)

Here is an experiment code.

  1. This produces no error message; the result is None:
model = onnx.load(filename)
onnx.checker.check_model(model)
  2. Here is the log from trtexec: trtexec.log
[TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin GatherND version 1
While parsing node number 14 [GatherND -> "decode_layer_7"]:

simple_tf_gather_cnn.onnx (24.3 KB)


The GatherND operator is currently not supported in TensorRT; you may need to implement a custom plugin.
You can check the TRT-supported operators here: onnx-tensorrt/ at master · onnx/onnx-tensorrt · GitHub

Please refer to the below links related to custom plugin implementation and samples:


But here, onnx-tensorrt/ at master · onnx/onnx-tensorrt, it says GatherND is supported.

I face the same problem using the ONNX multipose MoveNet model version that you can find here.

This model is tested and working when using the onnxruntime_gpu Python API (Jetson Zoo).

I’m using a Jetson Xavier NX with the latest JetPack 4.6; here are my complete output and trtexec command line:

/usr/src/tensorrt/bin/trtexec --onnx=saved_model_192x256/model_float32.onnx --minShapes=256,192,3 --optShapes=256,192,3 --maxShapes=256,192,3 --dumpOutput --saveEngine=model_192x256.trt

[10/14/2021-13:23:30] [I] [TRT] ----------------------------------------------------------------
[10/14/2021-13:23:30] [W] [TRT] onnx2trt_utils.cpp:364: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[10/14/2021-13:23:30] [I] [TRT] No importer registered for op: GatherND. Attempting to import as plugin.
[10/14/2021-13:23:30] [I] [TRT] Searching for plugin: GatherND, plugin_version: 1, plugin_namespace:
[10/14/2021-13:23:30] [E] [TRT] 3: getPluginCreator could not find plugin: GatherND version: 1


It should be supported by TensorRT as well (see the support matrix). For TensorRT 8.0 it is not available, so it won’t work on JetPack 4.6. But for TensorRT 8.2 it is already implemented, so I hope the next version of JetPack will support it.

I’ve used Gather v1, and it works for me with batch size == 1 (which is enough for me for now). And waiting for new updates :)
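For anyone reading later, a sketch of why the rewrite is easy when batch_size == 1 (hypothetical helper, plain Python, no TensorFlow): dropping the batch dimension turns the batched gather into a plain gather, which exports to the supported Gather op.

```python
# When batch_size == 1, tf.gather(params, indices, batch_dims=1)
# reduces to a plain gather on the single row: squeeze the batch
# dimension, gather, then add the dimension back.
def gather_batch1(params, indices):
    row, idx = params[0], indices[0]   # drop the batch dim
    return [[row[i] for i in idx]]     # gather, re-add the batch dim

print(gather_batch1([['b11', 'b12', 'b13']], [[0, 2]]))
# [['b11', 'b13']]
```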

Hello, have you solved this problem?