I’m using the Polygraphy API to convert my ONNX model to a TensorRT engine, but I get the following error:
[02/02/2022-13:05:09] [TRT] [E] ModelImporter.cpp:773: While parsing node number 82 [NonZero -> "311"]:
[02/02/2022-13:05:09] [TRT] [E] ModelImporter.cpp:774: --- Begin node ---
[02/02/2022-13:05:09] [TRT] [E] ModelImporter.cpp:775: input: "310"
output: "311"
name: "NonZero_82"
op_type: "NonZero"
[02/02/2022-13:05:09] [TRT] [E] ModelImporter.cpp:776: --- End node ---
[02/02/2022-13:05:09] [TRT] [E] ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4870 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[E] In node 82 (importFallbackPluginImporter): UNSUPPORTED_NODE: Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
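For reference, this is roughly how I build the engine with the Polygraphy functional API (the paths and build settings below are simplified placeholders, not my exact script):

from polygraphy.backend.trt import (
    CreateConfig,
    engine_from_network,
    network_from_onnx_path,
    save_engine,
)

# Parsing the ONNX model into a TensorRT network is where the NonZero error above is raised.
engine = engine_from_network(
    network_from_onnx_path("model.onnx"),
    config=CreateConfig(),
)
save_engine(engine, "model.engine")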
If I generate the graph using polygraphy inspect (the command I used is sketched below) and look for the node from the error message, this is what I find:
Node 82 | NonZero_82 [Op: NonZero]
{310}
-> {311}
That is the first NonZero operation in the graph.
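(The inspect command was something like the following; depending on the Polygraphy version, an extra display flag may be needed to list the individual nodes.)

polygraphy inspect model model.onnx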
If I compare this with the ONNX graph, it corresponds to:
%310 : Bool(1, 2024, strides=[2024, 1], requires_grad=0, device=cuda:0) = onnx::Less(%309, %290) # C:\Users\arosasco\PycharmProjects\pcr\utils\misc.py:29:0
%311 : Long(2, *, device=cpu) = onnx::NonZero(%310)
%312 : Long(*, 2, device=cpu) = onnx::Transpose[perm=[1, 0]](%311)
%313 : Float(*, strides=[1], requires_grad=0, device=cuda:0) = onnx::GatherND(%309, %312) # C:\Users\arosasco\PycharmProjects\pcr\utils\misc.py:29:0
%314 : Bool(1, 2024, strides=[2024, 1], requires_grad=0, device=cuda:0) = onnx::Greater(%290, %309) # C:\Users\arosasco\PycharmProjects\pcr\utils\misc.py:29:0
That refers to the first line of this function:
def onnx_minimum(x1, x2):
    # Boolean-mask indexing: the ONNX exporter lowers this line into the
    # Less/Greater + NonZero + Transpose + GatherND pattern shown above.
    x1[x1 > x2] = x2[x2 < x1]
    return x1
Is that operation really not convertible to TensorRT?
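For what it's worth, the mask assignment is just an element-wise minimum, so I suspect a mask-free rewrite along these lines (my own sketch, not the original code) would export as a plain ONNX Min node and avoid NonZero entirely, but I'd still like to understand whether NonZero itself can be supported:

import torch

def onnx_minimum(x1, x2):
    # Out-of-place element-wise minimum; this should export to a single ONNX Min node
    # instead of the NonZero/Transpose/GatherND pattern above.
    return torch.minimum(x1, x2)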