Can't fuse pad and convolution with same pad mode


I previously converted a Caffe OpenPose model to TensorFlow via MMdnn.
I can successfully run inference with the saved model in TensorFlow and get correct output.

I then converted the TensorFlow model with trt.TrtGraphConverterV2 for FP16 with a minimum segment size of 3.
There were no errors during conversion or when loading the converted model.
When I run inference with the same input image as the non-TRT version, I receive an error:
“can’t fuse pad and convolution with caffe pad mode”.
If I change the segment size to 1 I get a similar error:
“W tensorflow/compiler/tf2tensorrt/utils/] DefaultLogger Can’t fuse pad and convolution with same pad mode”.
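For reference, the conversion step I used looks roughly like this (a sketch with placeholder paths; this is the TF 2.1-era TF-TRT API, so names such as `DEFAULT_TRT_CONVERSION_PARAMS` may differ in later TensorFlow versions):

```python
# Hedged sketch of the TF-TRT conversion described above (TF 2.1-era API).
# 'openpose_saved_model' and 'openpose_trt_fp16' are placeholder paths.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

def convert_to_trt_fp16(saved_model_dir, output_dir, min_segment_size=3):
    """Convert a SavedModel with TF-TRT at FP16 precision."""
    params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
        precision_mode=trt.TrtPrecisionMode.FP16,
        minimum_segment_size=min_segment_size)
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir=saved_model_dir,
        conversion_params=params)
    converter.convert()          # builds the TRT-augmented graph
    converter.save(output_dir)   # writes the converted SavedModel

# convert_to_trt_fp16('openpose_saved_model', 'openpose_trt_fp16')
```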

The input tensor I’m using is 1,368,368,3. The model itself has its input layer set up as dynamic (?,16,16).

Does TensorRT need the input/output sizes to be fixed?
I’m new to TensorFlow and TensorRT (I took the TensorRT course from NVIDIA).
My understanding is that padding=same means a layer uses the same input size as the previous layer’s output size.
From my searches it looks like this padding might not be supported. Does that mean I have to hard-code padding values
for each layer? If so, what’s the easiest way to go about doing that?
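For context, the arithmetic TensorFlow uses for padding='same' can be sketched in plain Python (this is the standard formula, not anything model-specific): 'same' pads just enough that the output size equals ceil(input / stride), and the split can come out asymmetric.

```python
import math

def same_pad_1d(in_size, kernel, stride):
    """Per-dimension padding TensorFlow applies for padding='same'.

    Returns (pad_before, pad_after). 'same' keeps
    out_size == ceil(in_size / stride); it does not reuse padding
    values from a previous layer.
    """
    out_size = math.ceil(in_size / stride)
    pad_total = max((out_size - 1) * stride + kernel - in_size, 0)
    pad_before = pad_total // 2
    pad_after = pad_total - pad_before  # extra pixel goes on the bottom/right
    return pad_before, pad_after

# 368x368 input, 3x3 kernel, stride 1 -> one pixel of padding on each side
print(same_pad_1d(368, 3, 1))   # (1, 1)
# stride 2 with an even input size -> asymmetric padding
print(same_pad_1d(368, 3, 2))   # (0, 1)
```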

Thanks for any help possible!


TensorRT Version:
GPU Type: Tesla V100
Nvidia Driver Version: 440.64.00
CUDA Version: 10.2.89
CUDNN Version: (I can’t find it in the Docker container; I’m using the TensorRT support compiled into TensorFlow)
Operating System + Version: Ubuntu 18.04.4 LTS
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): 2.1.0
Baremetal or Container (if container which image + tag): Docker (20.02-tf2-py3)

It seems to be a warning; could you please share the full error log that you are getting during inference?

Also, a Padding layer followed by a Convolution or Deconvolution can be fused into a single Convolution/Deconvolution layer if all of the padding sizes are non-negative.
Please refer to the link below:
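That non-negativity condition can be illustrated with a tiny sketch (a hypothetical helper for illustration, not a TensorRT API):

```python
def can_fuse_pad_into_conv(pads):
    """A Pad folded into a following Conv/Deconv just increases the conv's
    own pre/post padding, so every explicit pad amount must be >= 0;
    a negative amount means cropping, which a convolution cannot express.
    """
    return all(p >= 0 for p in pads)

print(can_fuse_pad_into_conv([1, 1, 1, 1]))    # True: symmetric padding, fusable
print(can_fuse_pad_into_conv([0, 1, 0, 1]))    # True: asymmetric but non-negative
print(can_fuse_pad_into_conv([-1, 0, 0, 0]))   # False: a crop, stays a separate layer
```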

You can also refer to the TRT Padding layer documentation for more details:


Thanks, Sunil! Your comment that this is a warning helped; it made me realize that I had set the working set too high.