Clarification about layer fusion in the trtexec binary

Hi all, I want to confirm whether the + (plus) symbol between layer names in the trtexec logs indicates that those layers are fused together.
Please see the log line below:

[GpuLayer] node_of_gpu_0/conv1_1 + node_of_gpu_0/res_conv1_bn_2 + node_of_gpu_0/pool1_1

Does it mean that layers node_of_gpu_0/conv1_1, node_of_gpu_0/res_conv1_bn_2 and node_of_gpu_0/pool1_1 are all fused together?
Please clarify.

Thanks and Regards

Nagaraj Trivedi

Hi,

Generally yes, but you can find more details with the --verbose flag.

For example:

$ /usr/src/tensorrt/bin/trtexec --onnx=/usr/src/tensorrt/data/mnist/mnist.onnx --verbose --buildOnly
...
[11/16/2023-03:13:43] [I] Finish parsing network model
[11/16/2023-03:13:43] [V] [TRT] Original: 18 layers
[11/16/2023-03:13:43] [V] [TRT] After dead-layer removal: 18 layers
[11/16/2023-03:13:43] [V] [TRT] Applying generic optimizations to the graph for inference.
[11/16/2023-03:13:43] [V] [TRT] Running: ConstShuffleFusion on Parameter193
[11/16/2023-03:13:43] [V] [TRT] ConstShuffleFusion: Fusing Parameter193 with Times212_reshape1
[11/16/2023-03:13:43] [V] [TRT] Running: ConstShuffleFusion on Parameter6
[11/16/2023-03:13:43] [V] [TRT] ConstShuffleFusion: Fusing Parameter6 with (Unnamed Layer* 4) [Shuffle]
[11/16/2023-03:13:43] [V] [TRT] Running: ConstShuffleFusion on Parameter88
[11/16/2023-03:13:43] [V] [TRT] ConstShuffleFusion: Fusing Parameter88 with (Unnamed Layer* 10) [Shuffle]
[11/16/2023-03:13:43] [V] [TRT] After Myelin optimization: 15 layers
[11/16/2023-03:13:43] [V] [TRT] Running: MatMulToConvTransform on Times212
[11/16/2023-03:13:43] [V] [TRT] Convert layer type of Times212 from MATRIX_MULTIPLY to CONVOLUTION
[11/16/2023-03:13:43] [V] [TRT] Running: ConstEltFusion on Parameter6 + (Unnamed Layer* 4) [Shuffle]
[11/16/2023-03:13:43] [V] [TRT] ConstEltFusion: Fusing Parameter6 + (Unnamed Layer* 4) [Shuffle] with Plus30
[11/16/2023-03:13:43] [V] [TRT] Running: ConstEltFusion on Parameter88 + (Unnamed Layer* 10) [Shuffle]
[11/16/2023-03:13:43] [V] [TRT] ConstEltFusion: Fusing Parameter88 + (Unnamed Layer* 10) [Shuffle] with Plus112
[11/16/2023-03:13:43] [V] [TRT] Running: ShuffleShuffleFusion on Times212_reshape0
[11/16/2023-03:13:43] [V] [TRT] ShuffleShuffleFusion: Fusing Times212_reshape0 with reshape_before_Times212
[11/16/2023-03:13:43] [V] [TRT] Running: ConvReshapeBiasAddFusion on Times212
[11/16/2023-03:13:43] [V] [TRT] Running: ConvScaleFusion on Convolution28
[11/16/2023-03:13:43] [V] [TRT] ConvScaleFusion: Fusing Convolution28 with Parameter6 + (Unnamed Layer* 4) [Shuffle] + Plus30
[11/16/2023-03:13:43] [V] [TRT] Running: ConvScaleFusion on Convolution110
[11/16/2023-03:13:43] [V] [TRT] ConvScaleFusion: Fusing Convolution110 with Parameter88 + (Unnamed Layer* 10) [Shuffle] + Plus112
[11/16/2023-03:13:43] [V] [TRT] Applying ScaleNodes fusions.
[11/16/2023-03:13:43] [V] [TRT] After scale fusion: 9 layers
[11/16/2023-03:13:43] [V] [TRT] Running: ConvReluFusion on Convolution28 + Parameter6 + (Unnamed Layer* 4) [Shuffle] + Plus30
[11/16/2023-03:13:43] [V] [TRT] ConvReluFusion: Fusing Convolution28 + Parameter6 + (Unnamed Layer* 4) [Shuffle] + Plus30 with ReLU32
[11/16/2023-03:13:43] [V] [TRT] Running: ConvReluFusion on Convolution110 + Parameter88 + (Unnamed Layer* 10) [Shuffle] + Plus112
[11/16/2023-03:13:43] [V] [TRT] ConvReluFusion: Fusing Convolution110 + Parameter88 + (Unnamed Layer* 10) [Shuffle] + Plus112 with ReLU114
[11/16/2023-03:13:43] [V] [TRT] After dupe layer removal: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After final dead-layer removal: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After tensor merging: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After vertical fusions: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After dupe layer removal: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After final dead-layer removal: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After tensor merging: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After slice removal: 7 layers
[11/16/2023-03:13:43] [V] [TRT] After concat removal: 7 layers
[11/16/2023-03:13:43] [V] [TRT] Trying to split Reshape and strided tensor
[11/16/2023-03:13:43] [V] [TRT] Graph construction and optimization completed in 0.00541636 seconds.
...
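If you also want to confirm the fusions recorded in the final engine (rather than reading the build log), the engine inspector can print per-layer information, where fused layers show up with names like "A + B + C". Below is a minimal Python sketch, assuming TensorRT 8.2+ and the same sample model path as above:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("/usr/src/tensorrt/data/mnist/mnist.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
# DETAILED verbosity keeps per-layer names in the engine metadata,
# so fused layers appear as "A + B + C" in the inspector output.
config.profiling_verbosity = trt.ProfilingVerbosity.DETAILED

plan = builder.build_serialized_network(network, config)
engine = trt.Runtime(logger).deserialize_cuda_engine(plan)

inspector = engine.create_engine_inspector()
print(inspector.get_engine_information(trt.LayerInformationFormat.JSON))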

Thanks.

Hi, thank you for your response. I have verified it, and in my case the layer fusion is happening as well.

I took a few words from the fusion-related log messages, for example "Fusing" and "fusion", and searched for them across the entire source directory /usr/src/tensorrt/,
but I could not find the code that logs these messages, in particular any string containing "fusion" or "Fusing".

Another clarification I need: if I don't want layer fusion to happen, how do I disable it?
Please clarify.
Below are the log messages from which I took the words "Fusing" and "fusion" for grepping (searching) the source code.

[11/15/2023-10:24:12] [V] [TRT] Fusing convolution weights from node_of_gpu_0/res5_2_branch2b_1 with scale node_of_gpu_0/res5_2_branch2b_bn_1
[11/15/2023-10:24:12] [V] [TRT] Fusing convolution weights from node_of_gpu_0/res5_2_branch2c_1 with scale node_of_gpu_0/res5_2_branch2c_bn_1
[11/15/2023-10:24:12] [V] [TRT] After scale fusion: 123 layers
[11/15/2023-10:24:12] [V] [TRT] ConvReluFusion: Fusing node_of_gpu_0/conv1_1 with node_of_gpu_0/res_conv1_bn_2
[11/15/2023-10:24:12] [V] [TRT] ConvActPoolFusion: Fusing node_of_gpu_0/conv1_1 + node_of_gpu_0/res_conv1_bn_2 with node_of_gpu_0/pool1_1
...

Thanks and Regards

Nagaraj Trivedi

Hi,

The optimization is done by default and cannot be turned off.

Regarding the grep results: the fusion messages are logged from inside the TensorRT library (libnvinfer) itself, which is closed-source, while /usr/src/tensorrt/ contains only the samples, data, and tools, so those strings will not appear there.
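If the goal is only to make the builder optimize less aggressively, newer TensorRT releases (8.6+, an assumption about your version) expose a builder optimization level, via --builderOptimizationLevel in trtexec or the builder config in the API. Note this lowers optimization effort; it does not switch fusion off. A minimal Python sketch:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
config = builder.create_builder_config()
# 0 = fastest build / fewest optimizations, 5 = maximum (default 3).
# Assumes TensorRT 8.6+; fusions may still occur even at level 0.
config.builder_optimization_level = 0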

Thanks.
