Description
After debugging my network, I found that the line preventing it from being converted to a TensorRT engine is one that calls the ‘grid sample’ function:
torch.nn.functional.grid_sample()
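For context, a minimal sketch of the call (shapes and arguments are illustrative, not copied from my model):

import torch
import torch.nn.functional as F

# Illustrative shapes only:
x = torch.randn(1, 3, 32, 32)             # input feature map (N, C, H, W)
grid = torch.rand(1, 32, 32, 2) * 2 - 1   # sampling locations in [-1, 1], (N, H_out, W_out, 2)
out = F.grid_sample(x, grid, mode="bilinear",
                    padding_mode="zeros", align_corners=True)
print(out.shape)  # torch.Size([1, 3, 32, 32])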
After exporting the ONNX model and sanitising it, I run trtexec but get the error:
No importer registered for op: GridSample. Attempting to import as plugin.
What is the solution, please?
Link to model:
Environment
TensorRT Version: 8.6.1.6
GPU Type: RTX 3080
Nvidia Driver Version:
CUDA Version: 11.6
CUDNN Version:
Operating System + Version: Windows 10
Python Version (if applicable): 3.10.8
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 1.12.1
Baremetal or Container (if container which image + tag):
Relevant Files
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)
Steps To Reproduce
Please include:
- Exact steps/commands to build your repro
- Exact steps/commands to run your repro
- Full traceback of errors encountered
Hi,
Could you please check the opset version you’re using while exporting the ONNX model?
Please make sure you are using the latest opset version, 17, and try again.
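For reference, a minimal export sketch (the module and shapes below are placeholders, not the actual model):

import torch
import torch.nn as nn
import torch.nn.functional as F

class GridSampleNet(nn.Module):
    # Toy stand-in for the real network; only the grid_sample call matters here.
    def forward(self, x, grid):
        return F.grid_sample(x, grid, mode="bilinear",
                             padding_mode="zeros", align_corners=True)

x = torch.randn(1, 3, 32, 32)             # (N, C, H, W)
grid = torch.rand(1, 32, 32, 2) * 2 - 1   # sampling grid in [-1, 1]
torch.onnx.export(
    GridSampleNet(), (x, grid), "grid_sample.onnx",
    opset_version=17,  # GridSample became a native ONNX op in opset 16
    input_names=["x", "grid"], output_names=["out"],
)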
If you still face the issue, please share with us the ONNX model and complete verbose logs.
Thank you.
Hello,
I just exported the ONNX model again using opset 17. After sanitising it, attempting to convert it to a TRT engine gives this error:
[05/26/2023-11:24:09] [I] [TRT] No importer registered for op: GridSample. Attempting to import as plugin.
[05/26/2023-11:24:09] [I] [TRT] Searching for plugin: GridSample, plugin_version: 1, plugin_namespace:
[05/26/2023-11:24:09] [E] [TRT] ModelImporter.cpp:773: While parsing node number 875 [GridSample -> "/GridSample_output_0"]:
[05/26/2023-11:24:09] [E] [TRT] ModelImporter.cpp:774: --- Begin node ---
[05/26/2023-11:24:09] [E] [TRT] ModelImporter.cpp:775: input: "/Reshape_99_output_0"
input: "/Concat_98_output_0"
output: "/GridSample_output_0"
name: "/GridSample"
op_type: "GridSample"
attribute {
name: "align_corners"
i: 1
type: INT
}
attribute {
name: "mode"
s: "bilinear"
type: STRING
}
attribute {
name: "padding_mode"
s: "zeros"
type: STRING
}
[05/26/2023-11:24:09] [E] [TRT] ModelImporter.cpp:776: --- End node ---
[05/26/2023-11:24:09] [E] [TRT] ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4890 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[05/26/2023-11:24:09] [E] Failed to parse onnx file
[05/26/2023-11:24:09] [I] Finished parsing network model. Parse time: 8.25613
[05/26/2023-11:24:09] [E] Parsing model failed
[05/26/2023-11:24:09] [E] Failed to create engine from model or file.
[05/26/2023-11:24:09] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8601] # trtexec.exe --onnx=IGCV_san.onnx --saveEngine=IGCV_san.trt
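As a sanity check, the opset version and the GridSample node can be confirmed with the onnx Python package (a sketch; the filename is the one passed to trtexec above):

import onnx

m = onnx.load("IGCV_san.onnx")
# Which opset(s) does the model import?
print([(imp.domain, imp.version) for imp in m.opset_import])
# Locate the GridSample node(s) and list their attributes.
for node in m.graph.node:
    if node.op_type == "GridSample":
        print(node.name, [a.name for a in node.attribute])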
Here is the ONNX model exported with opset 17:
Hi,
We were able to build the engine successfully on TensorRT 8.6.
[05/30/2023-07:53:17] [I]
&&&& PASSED TensorRT.trtexec [TensorRT v8601] # trtexec --onnx=model.onnx --verbose
Are you facing this issue only on Windows?
Thank you.
Thanks. Yes, I’m using Windows.
Good Morning,
If this is a known issue on Windows, is there an upcoming fix, please?
Thanks.
Hi,
Could you please reinstall TensorRT to make sure the TensorRT libraries are correctly installed, and then try again?
Please share with us the complete verbose logs if you face the issue again.
Thank you.
Hi! I exported an opset 16 ONNX model and used onnx_graphsurgeon to directly modify the opset to 20, then ran trtexec --onnx=xx.onnx --saveEngine=xx.engine and hit the same problem:
Error Code 3: API Usage Error (Parameter check failed at: optimizer/api/network.cpp::addGridSample::1474, condition: input.getDimensions().nbDims == 4)
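That condition means TensorRT’s addGridSample only accepts a 4-D (N, C, H, W) input, while PyTorch’s grid_sample also allows 5-D volumetric tensors. A sketch (filename assumed) to check the rank feeding each GridSample node via ONNX shape inference:

import onnx
from onnx import shape_inference

# Infer intermediate tensor shapes, then report the rank of each
# GridSample data input; TensorRT requires it to be exactly 4-D.
m = shape_inference.infer_shapes(onnx.load("xx.onnx"))
shapes = {vi.name: vi.type.tensor_type.shape
          for vi in list(m.graph.value_info) + list(m.graph.input)}
for node in m.graph.node:
    if node.op_type == "GridSample":
        shp = shapes.get(node.input[0])
        rank = len(shp.dim) if shp is not None else "unknown"
        print(node.name, "data input rank:", rank)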