Layer fusion issue


I run the model with this command:

./trtexec --onnx=./build.onnx --explicitBatch --verbose --workspace=2048

I am confused by the line “Merging layers: Conv_3 + Relu_4 || Conv_0 + Relu_1”.

Here’s part of the log:
[02/18/2021-11:57:18] [V] [TRT] Fusing Conv_3 with Relu_4
[02/18/2021-11:57:18] [V] [TRT] Fusing Conv_0 with Relu_1
[02/18/2021-11:57:18] [V] [TRT] Fusing Conv_8 with Relu_9
[02/18/2021-11:57:18] [V] [TRT] Fusing Conv_11 with Relu_12
[02/18/2021-11:57:18] [V] [TRT] Fusing Conv_17 with Relu_18
[02/18/2021-11:57:18] [V] [TRT] Fusing Conv_14 with Relu_15
[02/18/2021-11:57:18] [V] [TRT] Fusing (Unnamed Layer* 59) [ElementWise] with Relu_54
[02/18/2021-11:57:18] [V] [TRT] Fusing (Unnamed Layer* 65) [ElementWise] with Relu_56
[02/18/2021-11:57:18] [V] [TRT] After vertical fusions: 29 layers
[02/18/2021-11:57:18] [V] [TRT] After final dead-layer removal: 29 layers
[02/18/2021-11:57:18] [V] [TRT] Merging layers: Conv_3 + Relu_4 || Conv_0 + Relu_1
[02/18/2021-11:57:18] [V] [TRT] After tensor merging: 28 layers
[02/18/2021-11:57:18] [V] [TRT] After concat removal: 28 layers
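(For context on the log above: the "Fusing Conv_X with Relu_Y" lines are *vertical* fusions, where the ReLU is folded into the convolution's epilogue so the intermediate activation tensor is never written out. A minimal NumPy sketch of the arithmetic, with illustrative shapes that are not taken from the model above:)

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 3))   # input: 8 "pixels", 3 channels
w = rng.standard_normal((3, 4))   # 1x1 conv (3 -> 4 channels) expressed as a matmul

# Unfused: Conv and ReLU run as two separate layers,
# materializing the intermediate conv output.
conv_out = x @ w
relu_out = np.maximum(conv_out, 0.0)

# Vertically fused: one pass, ReLU applied as the conv's epilogue.
fused_out = np.maximum(x @ w, 0.0)

# The results are identical; only the number of passes differs.
assert np.allclose(relu_out, fused_out)
```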

The two conv layers at the top of the network (Conv_0 and Conv_3) are merged.
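(This "Merging layers: A || B" step is *horizontal* fusion: two convolutions that read the same input tensor can be computed as a single wider convolution whose filters are concatenated along the output-channel axis, and the result split afterwards. A minimal NumPy sketch with 1x1 convolutions written as matmuls; the names and shapes are illustrative, not from the model above:)

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3))    # shared input: 8 "pixels", 3 channels
w0 = rng.standard_normal((3, 4))   # Conv_0: 3 -> 4 output channels
w3 = rng.standard_normal((3, 5))   # Conv_3: 3 -> 5 output channels

# Separate: two convolutions, two passes over the same input.
y0, y3 = x @ w0, x @ w3

# Horizontally merged: one wider convolution, then split the outputs.
w_merged = np.concatenate([w0, w3], axis=1)   # 3 -> 9 output channels
y_merged = x @ w_merged
y0_m, y3_m = y_merged[:, :4], y_merged[:, 4:]

# Same results either way; the merged form reads the input only once.
assert np.allclose(y0, y0_m) and np.allclose(y3, y3_m)
```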

So I tried another model (below).
This time, the two convolution layers feeding the "Add" op are not merged.

Here are two questions:

  1. Does the type of fusion affect execution time?
  2. What are the conditions under which layers get fused?



TensorRT Version : 7.1.3
GPU Type : Xavier
Nvidia Driver Version : Package:nvidia-jetpack, Version: 4.4.1-b50
CUDA Version : 10.2.89
CUDNN Version : 8.0.0
Operating System + Version : Ubuntu 18.04
Python Version (if applicable) :
TensorFlow Version (if applicable) :
PyTorch Version (if applicable) :
Baremetal or Container (if container which image + tag) :

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi @disculus2012,

I hope the following link helps you.

Thank you.

Thanks for the reply.

I am still confused about the merge.
Why are Conv_3 and Conv_0 merged?
Which rule from the provided link applies here?

Also, in the model in the second graph, one of the Conv layers (512x6400x3x3) has been fused with the Add layer.
Does this fusion prevent the two Conv layers (512x6400x3x3) from being merged?


Hi @disculus2012,

I hope the following link helps you. Refer to the "Horizontal layer fusion" section.

Thank you.