Jetson Nano ONNX Export

I’m trying to convert my .pth model to ONNX format using onnx_export in jetson-inference. However, when I run the command below I get an error. How can I fix it?
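For context, a typical invocation looks something like the sketch below (this assumes the classification retraining workflow from jetson-inference; the model directory name is just an example, not the poster’s actual command):

```shell
# Run from the jetson-inference classification training directory (path is an assumption);
# converts the PyTorch checkpoint in --model-dir to an ONNX model in the same directory
python3 onnx_export.py --model-dir=models/my_model
```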



Have you installed PyTorch?
If yes, which version, and how did you install it?


Hi, yes, I installed PyTorch. It was v1.7, then I updated to v1.8 with the “pip3 install pytorch==1.8” command. Now it says “Torch not compiled with CUDA enabled”.

Hi @CostGazelle, if you install PyTorch from pip/pip3, it won’t have CUDA enabled. For that, you have to install one of the PyTorch wheels from here: (those were built with CUDA support)

Hi @dusty_nv, I followed the link and installed PyTorch. However, in the verification step it prints CUDA available: False.
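For anyone else checking their install, the verification can be sketched roughly like this (a minimal example; it only assumes the standard torch package layout):

```python
# Check whether PyTorch is installed and whether the build has CUDA support.
import importlib.util

has_torch = importlib.util.find_spec("torch") is not None
print("torch installed:", has_torch)

if has_torch:
    import torch
    print("PyTorch version:", torch.__version__)
    # False here usually means a CPU-only wheel (e.g. from plain pip),
    # not a missing GPU or driver problem.
    print("CUDA available:", torch.cuda.is_available())
```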

Can you try doing a pip3 uninstall torch first, and make sure that PyTorch is completely uninstalled, before re-installing the wheel from that Nano PyTorch topic?
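As a sketch, the clean reinstall might look like this (the wheel filename below is an assumption for illustration; use the actual wheel from the Nano PyTorch topic that matches your JetPack and Python version):

```shell
# Remove any CPU-only build first; repeat until pip reports torch is gone
pip3 uninstall -y torch

# Install the NVIDIA-built wheel downloaded from the forum topic
# (filename is hypothetical; match your JetPack and Python version)
pip3 install torch-1.8.0-cp36-cp36m-linux_aarch64.whl
```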

@dusty_nv, thank you so much. That solved the problem.
