MaxPool layer falls back from DLA to GPU

Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.0
[y] DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version

Target Operating System
[y] Linux

Hardware Platform
[y] NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)

SDK Manager Version

Host Machine Version
[y] native Ubuntu 18.04

When I convert my model to run on the DLA, the MaxPool layer falls back to the GPU, and I cannot find the reason.
The log says: MaxPool_2: DLA does not support exclusive pooling.
So, what is exclusive pooling?

My MaxPool layer's properties in the ONNX model are as follows:
type: MaxPool
name: MaxPool_2
ceil_mode: 0
kernel_shape: 3,3

In the TensorRT documentation:
Pooling layer

  • Only two spatial dimension operations are supported.
  • Both FP16 and INT8 are supported.
  • Operations supported: kMAX, kAVERAGE.
  • Dimensions of the window must be in the range [1, 8].
  • Dimensions of padding must be in the range [0, 7].
  • Dimensions of stride must be in the range [1, 16].
  • Exclusive padding with kAVERAGE pooling is not supported.
  • With INT8 mode, input and output tensor scales must be the same.
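For context, "exclusive" pooling refers to how padded values are counted in average pooling: with exclusive padding, each window sum is divided only by the number of in-bounds elements, while with inclusive padding it is always divided by the full kernel size, counting the padded zeros. A minimal pure-Python sketch of the difference (1-D, stride 1, zero padding; `avg_pool_1d` is a made-up name for illustration, not a TensorRT API):

```python
def avg_pool_1d(x, kernel=3, pad=1, exclusive=True):
    """Average pooling with zero padding, stride 1.

    exclusive=True  -> divide by the number of in-bounds elements only
                       (padded positions excluded from the count)
    exclusive=False -> always divide by the kernel size
                       (padded zeros counted; "inclusive" padding)
    """
    out = []
    for i in range(len(x)):
        # Collect the in-bounds elements of the window centered at i.
        window = [x[j] for j in range(i - pad, i - pad + kernel)
                  if 0 <= j < len(x)]
        count = len(window) if exclusive else kernel
        out.append(sum(window) / count)
    return out

x = [6.0, 6.0, 6.0, 6.0]
print(avg_pool_1d(x, exclusive=True))   # [6.0, 6.0, 6.0, 6.0]
print(avg_pool_1d(x, exclusive=False))  # [4.0, 6.0, 6.0, 4.0]
```

At the edges, the window covers only two real elements: exclusive padding divides 12 by 2 and gets 6.0, while inclusive padding divides 12 by 3 and gets 4.0. Only the inclusive variant is supported on the DLA.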

Dear @wang_chen2,
Could you share your model?

Sorry, my company cannot share the model.

Dear @wang_chen2,
Could you share a simple model so I can reproduce the issue on my side?

Sorry, the network is isolated, so I cannot export anything from it.
As a workaround, I can delete the MaxPool layer.

Please try setting inclusive padding with setAverageCountExcludesPadding(false).
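For reference, here is roughly how that suggestion maps onto the TensorRT Python API when building the network. This is a hedged fragment, not a runnable script: it assumes a TensorRT installation, and `prev_tensor` stands in for the output tensor of whatever layer precedes the pooling layer in your network.

```python
import tensorrt as trt  # requires a TensorRT installation

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network()

# ... add inputs and earlier layers; `prev_tensor` is the previous output ...

# When adding the pooling layer, request inclusive padding so the
# "DLA does not support exclusive pooling" fallback does not trigger.
pool = network.add_pooling(prev_tensor, trt.PoolingType.AVERAGE, (3, 3))
pool.padding = (1, 1)
pool.average_count_excludes_padding = False  # inclusive padding
```

In the C++ API the equivalent call is `IPoolingLayer::setAverageCountExcludesPadding(false)`.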

Hi, @VickNV,
Thank you very much. The issue is solved.