Requirement for padding < kernelSize makes DLA unusable

Continuing the discussion from DLA bugs using deep-lab-v3 style network:

Any network that uses dilated convolutions while keeping the input/output spatial size the same is unable to run on DLA: with stride 1, preserving the size requires padding = dilation * (kernel_size - 1) / 2, which reaches or exceeds the kernel size once the dilation is large enough, violating the DLA requirement that padding < kernel size. This rules out anything like dilated ResNet, DeepLab, etc.
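As a minimal sketch of the constraint (assuming a PyTorch-style convolution; the channel counts, kernel size, and dilation here are only illustrative, not the actual network):

```python
import torch
import torch.nn as nn

# A single "same"-size dilated convolution: with stride 1, keeping the
# spatial size requires padding = dilation * (kernel_size - 1) / 2.
# Here kernel_size=3 and dilation=4 give padding=4, which is >= the
# kernel size, so it violates the DLA restriction padding < kernel size.
conv = nn.Conv2d(64, 64, kernel_size=3, stride=1, padding=4, dilation=4)

x = torch.randn(1, 64, 32, 32)
y = conv(x)
print(x.shape, y.shape)  # both (1, 64, 32, 32): input/output size preserved
```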

We were hoping this situation would improve with DLA2, but no such luck.

Do you have any solutions to running networks like this on the DLA?

There has been no update from you for a while, so we are assuming this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

Hi,

Could you share a simple reproducible network so we can check it internally?

Thanks.

It’s the one I DMed you a little over 2 years ago. I’ll bump the message.