I don’t know how that article installs it, but I would recommend trying the l4t-pytorch container that matches the version of JetPack-L4T you are running, and seeing whether performance differs inside the container — those images come with PyTorch/torchvision already installed and tested using the wheels from this post.
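As a rough sketch, launching the l4t-pytorch container looks something like the following — the image tag shown is only an example and must be replaced with the one matching your installed JetPack-L4T release (check with `cat /etc/nv_tegra_release`):

```shell
# Pick the tag that matches your JetPack-L4T version (example tag below is illustrative)
sudo docker run -it --rm --runtime nvidia --network host \
    nvcr.io/nvidia/l4t-pytorch:r35.2.1-pth2.0-py3

# Inside the container, confirm that PyTorch sees the GPU before benchmarking:
#   python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```

Running the same inference script inside this container gives a known-good PyTorch/CUDA baseline to compare against your current install.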