I’m wondering if there’s a way to install TensorRT 8.4.3 on the Jetson Nano Developer Kit. I currently have JetPack 4.6.1 installed, which gives me access to TRT 8.2.1, but that version throws an error when I try to convert a model containing an upsampling node to .trt format. On Google Colab I was able to convert the model using TRT 8.4.3, but so far none of the installation methods shown in the NVIDIA documentation have worked for installing that version on the Nano. Is there a way to do this?
Let me change the question a little — what’s the best way to avoid this error:
In node 84 (importUpsample): UNSUPPORTED_NODE: Assertion failed: (scales_input.is_weights()) && "The scales input must be an initializer."
I get this when converting ONNX to TRT with tensorrt.OnnxParser under TensorRT 8.2.1, but it does not happen under TensorRT 8.4.3. Can I somehow modify my own 8.2.1 backend code to accommodate this operation?
SOLUTION FOUND: when exporting the .pth model to .onnx, set opset_version to 14, i.e. torch.onnx.export(model, dummy_input, "model.onnx", opset_version=14). The conversion then succeeds.