Is there a way to use TensorRT 8.4.3 on the Jetson Nano Dev Kit?


I’m wondering if there’s a way to install TensorRT 8.4.3 on the Jetson Nano Developer Kit. Currently I have JetPack 4.6.1 installed, which gives me access to TRT 8.2.1, but this throws an error when I try to convert a model containing an upsampling node to .trt format. On Google Colab, however, I was able to get the model to convert using TRT 8.4.3, but so far none of the installation methods shown on the NVIDIA documentation pages have worked to install this version on the Nano. Is there a way to do this?

Please refer to the installation steps in the link below in case you are missing anything.

Also, we suggest using the TRT NGC containers to avoid any system-dependency issues.


Let me change the question a little - what’s the best way to avoid this error:

In node 84 (importUpsample): UNSUPPORTED_NODE: Assertion failed: (scales_input.is_weights()) && "The scales input must be an initializer."

I get this when using TensorRT 8.2.1 to convert ONNX to TRT with tensorrt.OnnxParser, but it does not happen in TensorRT 8.4.3. Can I somehow modify my own 8.2.1 backend code to accommodate this operation?

SOLUTION FOUND: when exporting the .pth to .onnx via torch.onnx.export(model, dummy_input, "model.onnx", opset_version=14), set the opset_version to 14. This makes the conversion work.
