When I try to convert my model from ONNX to TensorRT format I get the same error; I even tried applying Polygraphy, but I did not get good results. For the PyTorch-to-ONNX conversion I used opset version 16:
[02/11/2023-15:24:59] [E] [TRT] ModelImporter.cpp:775: input: "/transform/Gather_output_0"
TensorRT Version: 188.8.131.52
GPU Type: T4
Nvidia Driver Version: 510.47.03
CUDA Version: 11.6
CUDNN Version: 8.4
Operating System + Version: Ubuntu 20.04.5
PyTorch Version: 1.13.1
Model in ONNX format:
After applying Polygraphy:
Steps To Reproduce
trtexec --onnx=model_san.onnx --saveEngine=model.plan --fp16 --workspace=7000 --buildOnly --verbose
We recommend that you try the most recent TensorRT version, 8.5.3.
I hope the following similar posts will help you.
I think a padding-related node is causing the error; we don't support 2-D shape tensors yet. As a workaround, we can constant-fold with Polygraphy, after which we are able to successfully generate an engine. Please try:
polygraphy surgeon sanitize --fold-constants grid_sample.onnx -o 2001833/folded.onnx
For more details,
I tried to convert a PyTorch model to TensorRT (PyTorch -> ONNX -> TensorRT).
I converted the model to ONNX successfully, but when I tried to convert ONNX to TensorRT using trtexec, I got:
 Invalid Node - Pad_14
[shuffleNode.cpp::symbolicExecute::387] Error code 4: Internal Error (Reshape_3: IShuffleLayer applied to shape tensor must have 0 or 1 reshape dimensions: dimensions were [-1,2])
I inspected the model with Netron:
It seems the PyTorch function F.pad gets the incompatib …
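As an illustrative sketch of the `F.pad` issue discussed above: when pad amounts are derived from tensor shapes, the ONNX exporter can emit shape-computation nodes (such as a Reshape over a shape tensor) that TensorRT rejects; forcing the amounts to plain Python ints lets them fold into constants. This toy module is my own assumption about the failing pattern, not the original model:

```python
import torch
import torch.nn.functional as F

class PadToEven(torch.nn.Module):
    # Pads the last dimension so its size becomes even. If the pad
    # amount stays a traced tensor, the exporter may produce the
    # shape-tensor ops TensorRT complains about; int(...) keeps it
    # a compile-time constant instead (the workaround idea).
    def forward(self, x):
        pad = int(x.shape[-1] % 2)   # plain Python int, folds to a constant
        return F.pad(x, (0, pad))    # pad right side of last dim

x = torch.randn(1, 3, 5, 5)
m = PadToEven()
print(m(x).shape)  # torch.Size([1, 3, 5, 6])
```

Constant-folding with `polygraphy surgeon sanitize --fold-constants`, as suggested earlier, aims at the same outcome after export rather than before it.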
I tried with TensorRT version 8.5.3, now the error is as follows:
Any ideas or workarounds you can suggest? This is with the model after applying Polygraphy. Thanks a lot!
We could reproduce a different error on version 8.5.3.
Please allow us some time to work on this issue.
I’ll be waiting, thanks a lot!
PyTorch Mask R-CNN has known issues when imported into TensorRT.
We are working on addressing this issue, and a fix will be available in a future release.
Please stay tuned.
I have seen that this problem has been going on for a while. Any possible release date?
@isaac21 , apologies for the inconvenience caused.
The team is already working on the fixes, and we are targeting an upcoming release.