SOLUTION FOUND: when exporting the .pth model to .onnx, set opset_version=14 in the export call, e.g. torch.onnx.export(model, dummy_input, "model.onnx", opset_version=14). With opset 14 the conversion works.
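For reference, a minimal sketch of the export step, assuming a torchvision ResNet-18 stands in for the actual .pth model and that the network takes a 1x3x224x224 input (neither is specified in the thread):

```python
import torch
import torchvision

# Stand-in architecture; replace with the actual model class used to train the .pth.
model = torchvision.models.resnet18(weights=None)
# model.load_state_dict(torch.load("model.pth", map_location="cpu"))  # load your checkpoint here
model.eval()

# Dummy input with the shape the network expects (assumed 1x3x224x224 here).
dummy_input = torch.randn(1, 3, 224, 224)

# opset_version=14 is the key detail: older opsets can produce ONNX graphs
# that the TensorRT parser on Jetson fails to build.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    opset_version=14,
    input_names=["input"],
    output_names=["output"],
)
```

The resulting model.onnx can then be built into a TensorRT engine on the Jetson, for example with `trtexec --onnx=model.onnx`.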