Reflect padding in TensorRT

Hello, is there any alternative way to implement reflect padding in TensorRT? I can't replace the reflect padding with constant padding. Currently I am getting the error below while converting an ONNX model to TensorRT.

ERROR: builtin_op_importers.cpp:2191 In function importPad:
[8] Assertion failed: mode == "constant" && value == 0.f && "This version of TensorRT only supports constant 0 padding!"
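
For context, reflect padding mirrors the values next to the border instead of filling with a constant. A small NumPy illustration of the difference (NumPy is used here purely for demonstration; it is not part of the conversion pipeline):

```python
import numpy as np

x = np.array([1, 2, 3, 4])

# Constant zero padding -- the only Pad mode this TensorRT version accepts
print(np.pad(x, 2, mode="constant"))  # [0 0 1 2 3 4 0 0]

# Reflect padding -- mirrors values around the edge (edge itself not repeated)
print(np.pad(x, 2, mode="reflect"))   # [3 2 1 2 3 4 3 2]
```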

Hi,
Please share the ONNX model and the script, if not shared already, so that we can assist you better.
Alongside, you can try a few things:
https://docs.nvidia.com/deeplearning/tensorrt/quick-start-guide/index.html#onnx-export

  1. Validate your model with the snippet below.

check_model.py

import onnx
filename = "yourONNXmodel"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)

  2. Try running your model with the trtexec command.
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
In case you are still facing issues, please share the trtexec "--verbose" log for further debugging.
Thanks!

  1. I ran check_model.py and it worked fine with no errors.
  2. The log I get after running trtexec --verbose --onnx=model.onnx is below:

[08/24/2021-13:14:54] [V] [TRT] ModelImporter.cpp:125: Pad_155 [Pad] inputs: [398 -> (1, 128, 28, 28)], [419 -> (8)],
ERROR: builtin_op_importers.cpp:2191 In function importPad:
[8] Assertion failed: mode == "constant" && value == 0.f && "This version of TensorRT only supports constant 0 padding!"
[08/24/2021-13:14:54] [E] Failed to parse onnx file
[08/24/2021-13:14:54] [E] Parsing model failed
[08/24/2021-13:14:54] [E] Engine creation failed
[08/24/2021-13:14:54] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # /usr/src/tensorrt/bin/trtexec --onnx=/home/xavier/new.onnx --verbose

Hi @dharmil26shah ,
I'm afraid we don't currently support reflect padding, but it will be supported in future releases.

Thanks!

Hello @AakankshaS, is there any way I can run the reflect padding on the CPU and the rest of the network inside the GPU engine file?

Hi @dharmil26shah ,
I don’t think this will work, and you might have to wait for the upcoming TRT releases to have this working.
Thanks!
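
If re-exporting the model is an option, one possible workaround (not an official TensorRT feature) is to express reflect padding with slice, flip, and concat operations in the original network, so the exported ONNX avoids the unsupported Pad mode entirely. A NumPy sketch of the 1-D decomposition; the same idea applies per spatial axis with the framework's own flip/concat ops:

```python
import numpy as np

def reflect_pad_1d(x, pad):
    """Reflect-pad a 1-D array using only slice/flip/concat,
    matching np.pad(x, pad, mode='reflect')."""
    left = x[1:pad + 1][::-1]     # mirror of the values after the first element
    right = x[-pad - 1:-1][::-1]  # mirror of the values before the last element
    return np.concatenate([left, x, right])

x = np.array([1, 2, 3, 4])
print(reflect_pad_1d(x, 2))          # [3 2 1 2 3 4 3 2]
print(np.pad(x, 2, mode="reflect"))  # same result
```

Whether the resulting slice/flip/concat ops convert cleanly will depend on your TensorRT and ONNX opset versions, so treat this as a sketch to verify with trtexec rather than a guaranteed fix.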