d.nbDims >= 3 failed

I am running another model but encountered an issue, as the attachment shows. Please help to check and solve it.

Hi,

Could you enable the verbose option to get more logs and share them with us?

./trtexec --verbose ...

Thanks.

Hi,
Please see the attachment for the --verbose results.

Hi,

Sorry for the late reply.

This error is caused by an incorrect tensor dimension.
It looks like there are lots of plugin layers inside your model.
Could you double-check that the output dimensions match your expectation?

Thanks.

Hi,
I have used two other tools to deploy this model; both succeeded without any changes to the model.
The output dimensions in the model should not be the problem.

Hi,

May I know the details of the other tools?

We assume the plugin implementation follows our TensorRT API.
Is that correct?

Thanks.

Hi,
The tools are from other AI accelerator solutions. I mention this only to show that the model and prototxt have no problems.

Let's focus on solving this issue. Could you give a quick solution? Spending several days on an issue without finding the root cause is really inefficient.

Hi,

Could you share the prototxt with us so we can check it for you?
Thanks.

Hi,

After replacing torch.view with caffe.flatten, the model is now supported. We will check the final results of the model later. Thanks.

Hi,

We replaced the torch.view layer with a caffe.flatten layer, but we have now found that the flatten layer is not really supported and results in errors.

Please see the attached pictures for the errors and the relevant parts of the prototxt.
flatten.jpg

Hi,

It looks like your model architecture has changed.

There is a flatten plugin in our sample: /usr/src/tensorrt/samples/python/uff_ssd/plugin/FlattenConcat.cpp.
Would you mind checking whether it helps?

Thanks.

Hi,

Originally, this was a reshape layer with dims as below:
dim: 0
dim: -1
The purpose is to reshape the previous layer's output to 1x1024 for the BN layer input.
But TensorRT cannot support a reshape layer with (0, -1) attributes, so we changed the reshape to a flatten layer. However, TensorRT only supports a flatten layer whose output feeds a fully connected layer, while in our case the flatten layer feeds a BN layer.
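For reference, in Caffe's Reshape semantics, dim: 0 copies the corresponding input axis and dim: -1 is inferred from the remaining elements. A small NumPy sketch of the equivalent computation (the 1x32x8x4 input shape is a hypothetical example chosen so that 32*8*4 = 1024):

```python
import numpy as np

# Caffe Reshape semantics: dim 0 keeps the input axis, dim -1 is inferred.
x = np.zeros((1, 32, 8, 4))      # hypothetical NCHW input, 32 * 8 * 4 = 1024
y = x.reshape(x.shape[0], -1)    # equivalent of dim: 0, dim: -1
print(y.shape)                   # (1, 1024)
```

This is exactly the flattening the BN layer input expects; the question is only how to express it in a form the TensorRT Caffe parser accepts.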

So,

  1. I am trying to set the reshape layer attributes to explicit NCHW dims that output 1x1024; can you provide some advice?
  2. Does Nano have another supported op for this layer?
  3. I will check the custom Flatten layer.
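Method 1 above might look like the following prototxt sketch, which spells out every axis instead of relying on the inferred -1 (layer and blob names are hypothetical, and the explicit dims assume the previous layer's output really has 1024 elements per batch item):

```
layer {
  name: "reshape_to_bn"     # hypothetical name
  type: "Reshape"
  bottom: "prev_output"     # hypothetical bottom blob
  top: "reshape_to_bn"
  reshape_param {
    shape {
      dim: 0      # keep the batch axis
      dim: 1024   # channels
      dim: 1      # height
      dim: 1      # width
    }
  }
}
```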

Hi,

I have tried different combinations for method 1), and now the layer is supported. I will check the final results later.

Thanks