Tool version issues while converting and deploying a model to TensorRT on Jetson Nano

I have a pretrained UNet model in PyTorch. I convert it to ONNX using:
torch.onnx.export(model, dummy_test, "best_15_1D_model.onnx", verbose=True)
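
For context, here is a fuller sketch of the export; the stand-in model, dummy input shape, and opset_version below are placeholders rather than my exact values:

import torch
import torch.nn as nn

# Placeholder stand-in for the pretrained UNet (the real model is loaded from a checkpoint)
model = nn.Sequential(nn.Conv1d(1, 8, 3, padding=1), nn.ReLU(), nn.Conv1d(8, 1, 3, padding=1))
model.eval()

dummy_test = torch.randn(1, 1, 1024)  # placeholder 1D input shape

torch.onnx.export(
    model,
    dummy_test,
    "best_15_1D_model.onnx",
    verbose=True,
    opset_version=11,   # placeholder; pick an opset the target TensorRT version can parse
    input_names=["input"],
    output_names=["output"],
)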

Now I am converting this to a TensorRT engine. I tried doing the conversion on my local setup (not the Jetson Nano) and then using the converted model on the Jetson Nano, but I run into this issue:
[TensorRT] ERROR: coreReadArchive.cpp (38) - Serialization Error in verifyHeader: 0 (Version tag does not match)

I also tried the conversion on the Jetson Nano with trtexec, which comes preinstalled with JetPack, and I run into this error:
[8] Assertion failed: mode == "constant" && value == 0.f && "This version of TensorRT only supports constant 0 padding!"
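
From the error text, my guess is that the ONNX graph contains a Pad node whose mode is not "constant" or whose constant value is non-zero (for example reflection or replication padding in the PyTorch model). A quick sketch, assuming the onnx Python package is installed, to list the Pad nodes and their attributes:

import onnx
from onnx import helper

onnx_model = onnx.load("best_15_1D_model.onnx")
for node in onnx_model.graph.node:
    if node.op_type == "Pad":
        attrs = {a.name: helper.get_attribute_value(a) for a in node.attribute}
        print(node.name, attrs)  # the "mode" attribute is what the assertion checks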

I looked at other posts, and it seems I need a newer version of TensorRT (>8.2) for this to work. But all versions >8.2 need CUDA 11.x, and the Jetson Nano only supports CUDA 10.x?

How do I work around this? Do I need to retrain my model without padding for this to work?

Hi,

Please note that TensorRT engines are not portable.
You will need to generate and deploy with the same TensorRT software and GPU architecture.
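
As a reference, below is a minimal sketch of building and serializing the engine directly on the device with the TensorRT 8.x Python API; the file names and workspace size are placeholders:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path, engine_path):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse the ONNX file")
    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 28  # 256 MiB placeholder; keep this small on Nano
    engine = builder.build_engine(network, config)
    with open(engine_path, "wb") as f:
        f.write(engine.serialize())

build_engine("best_15_1D_model.onnx", "best_15_1D_model.trt")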

It seems that the conversion works in your desktop environment.
Could you share the environment setting (TensorRT, cuDNN, and CUDA) of your local setup with us first?

Thanks.

Thank you for your quick reply.

You will need to generate and deploy with the same TensorRT software and GPU architecture.

Which TensorRT version could I use on the Jetson Nano with JetPack 4.5 to convert ONNX to a TensorRT engine with padding support? It has CUDA 10.2.

Could you share the environment setting (TensorRT, cuDNN, and CUDA) of your local setup with us first?

Yes, the conversion worked on my local setup with TensorRT 8.2.4.2, CUDA toolkit 11.4, and driver version 470.103.01.
For cuDNN, sudo apt search cudnn | grep installed returned:

cudnn-local-repo-ubuntu1804-8.4.0.27/now 1.0-1 amd64 [installed,local]
libcudnn8/unknown,now 8.4.0.27-1+cuda11.6 amd64 [installed]
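
In case it is useful, this is a small sketch of how the versions can be checked programmatically on either machine (note that torch.version.cuda reports the CUDA version PyTorch was built against, which may differ from the system toolkit):

import torch
import tensorrt as trt

print("TensorRT:", trt.__version__)
print("CUDA (PyTorch build):", torch.version.cuda)
print("cuDNN (PyTorch build):", torch.backends.cudnn.version())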

Hello @AastaLLL, any update on this?

Hi,

We also have a JetPack release that has TensorRT 8.2.
Would you mind upgrading your Nano to the latest JetPack 4.6.2 to see if it works?
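
After upgrading, please rebuild the engine on the Nano itself and then confirm that it deserializes with the on-device TensorRT version. A minimal sketch of that check (the engine file name is a placeholder):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with open("best_15_1D_model.trt", "rb") as f:
    runtime = trt.Runtime(TRT_LOGGER)
    engine = runtime.deserialize_cuda_engine(f.read())

print("Engine deserialized:", engine is not None)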

Thanks.
