Problem deserializing TensorRT custom plugin on Jetson Nano

Hi, as I already discussed in this thread, I have a problem when trying to run inference with a TensorRT model on a Jetson Nano.

More specifically, the problem seems to be related to the custom plugin that I need to add to the model. The engine serializes with no problems, but when I try to use it for inference I get this assertion error:

#assertionbatchedNMSPlugin.cpp,116
Aborted (core dumped)
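As general context for this kind of failure: plugin assertions at deserialization time often come from the built-in plugins not being registered in the process that deserializes the engine. The sketch below shows the usual TensorRT 6/7 pattern; the engine path and logger class are placeholders, and this is a generic sanity check, not a confirmed fix for this specific assertion.

```cpp
#include <fstream>
#include <iostream>
#include <vector>
#include "NvInfer.h"
#include "NvInferPlugin.h"  // initLibNvInferPlugins

// Minimal logger implementation required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

int main() {
    // Register all built-in TensorRT plugins (including BatchedNMS_TRT)
    // BEFORE deserializing; skipping this call is a common cause of
    // plugin-related failures at deserialization time.
    initLibNvInferPlugins(&gLogger, "");

    // "model.engine" is a placeholder path for the serialized engine.
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);
    if (!engine) {
        std::cerr << "engine deserialization failed" << std::endl;
        return 1;
    }
    // ... create an execution context and run inference ...
    return 0;
}
```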

Also, the model deserializes correctly on all the platforms I had the opportunity to test (RTX 2080 Ti, T4, V100) except the Jetson Nano,
so it looks like a very platform-specific problem.

I discussed this in the TensorRT forum, but as far as they know there is no platform-specific setting that I have to use when running on a Nano;
further details on that discussion are at the link at the beginning of this post.

Do you have any idea of what could be happening?

Thanks in advance,

f

Hi,

Have you also tried running inference in a desktop environment?
It looks like this is an inference-time error, is that correct?

By the way, if you can run inference successfully on the host, could you share the library version with us?
Thanks.

Hello,
yes, as I mentioned above, I successfully ran inference on an RTX 2080 Ti, a T4 and a V100.

I used both TensorRT 6.0.1.5 (with CUDA 10.1) and TensorRT 7.0.0.11 (with CUDA 10.2) on the above platforms, and everything worked fine.

It definitely looks like an issue related to the custom plugin; I shared its code in my post on the TensorRT forum.

If you need further information, let me know.

thanks,

f

Hi,

Sorry for keeping you waiting.

Some serializer-related issues have been fixed in our new TensorRT package.
Would you mind giving our new JetPack 4.4 GA a try first?

Thanks.

Hi,
thanks for your reply.
I tried your suggestion and updated to the GA version of JetPack, but the problem persists.

Originally I thought the problem was in deserializing the custom plugin, but after some tests with a dummy custom plugin I found that it serializes/deserializes correctly;

the element that appears to be problematic is the NMSPlugin, since I receive this error at deserialization time:

#assertionbatchedNMSPlugin.cpp,116
Aborted (core dumped)

NMSPlugin is a native TensorRT plugin, so I don't know how to investigate further;

Do you have any idea of what could be happening?

Thanks for your support,

f

Hi,

You can find the TensorRT plugin source here:

Please help check where the assertion comes from.
Details on how to build and replace the plugin library can be found here:
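To dig into where the assertion fires, the plugin library can be rebuilt from the TensorRT OSS sources (with extra logging added around the failing check, if desired) and swapped in for the stock library. A rough sketch of that workflow follows; the branch name, library version, and paths are assumptions matching a JetPack 4.4 / TensorRT 7.1 install, so adjust them to your setup.

```shell
# Fetch the TensorRT OSS sources at the branch matching the installed
# TensorRT version (assumed here: 7.1, as shipped with JetPack 4.4).
git clone -b release/7.1 https://github.com/NVIDIA/TensorRT.git
cd TensorRT && git submodule update --init --recursive

# Build only the plugin library against the system TensorRT install.
mkdir -p build && cd build
cmake .. -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu -DTRT_OUT_DIR=$(pwd)/out
make -j$(nproc) nvinfer_plugin

# Back up the stock plugin library, then replace it with the rebuilt one
# (the exact .so version suffix depends on your TensorRT release).
sudo mv /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3 \
        /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3.bak
sudo cp out/libnvinfer_plugin.so.7.1.3 /usr/lib/aarch64-linux-gnu/
sudo ldconfig
```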

Thanks.