Impossible to train/test/run inference on the Jetson Nano GPU

Hey everyone!

I am having trouble using the GPU on my Jetson Nano. I have tried both TensorFlow and PyTorch, installed following the NVIDIA docs. The code is a simple ‘Hello AI World’-type example, so in short there is no problem with the code itself. However, when I run it, it gets stuck during training and never progresses, and the Jetson stops responding entirely, i.e. I cannot close the terminal or reboot. Nevertheless, when I run in CPU mode, everything works just fine.

Things I tried:

  1. Latest Jetpack and Tensorflow installed from Nvidia Docs.
  2. Latest Jetpack and PyTorch installed from Nvidia Docs.
  3. Latest Jetpack and downgraded PyTorch (1.7.0, 1.8.0)
  4. Downgraded Jetpack and latest PyTorch…

In short, whatever I do, it is impossible to train on the GPU.

I don’t know where the problem is. Could this be due to the power supply? (The Jetson is powered over a USB cable from a travel adapter charger with 9 V / 1.67 A or 5 V / 2.0 A output.)
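For reference, a minimal check along these lines can confirm whether CUDA is visible at all before any training is attempted (a sketch, assuming one of the NVIDIA PyTorch wheels is installed):

```python
# Minimal CUDA visibility check: verifies that PyTorch can see the
# Jetson's GPU before any training code runs.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # Move a tiny tensor to the GPU and do one op; if even this hangs,
    # the problem is in the CUDA runtime/driver, not the training code.
    x = torch.ones(4, device="cuda")
    print("GPU tensor sum:", (x * 2).sum().item())
```

If `torch.cuda.is_available()` already returns `False` or the tiny GPU op hangs, the issue is with the install or the board itself rather than the training script.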

I would be grateful for any clues.

Hi @islomgr, did you install PyTorch from this topic?

Are you able to run the training using the jetson-inference container? That has a known-working install of PyTorch already in it.
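If you haven't used the container before, launching it typically looks like this (commands per the jetson-inference README; the run script mounts the project directories for you):

```shell
# Clone the project and start the prebuilt container, which ships with
# a known-working PyTorch build for JetPack.
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
docker/run.sh   # pulls the matching container image and drops into a shell
```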

It is possible that training would run slowly due to the limited power available from a USB power supply. Instead, I recommend using a 5V⎓4A DC barrel-jack power supply, like the one from this thread:

However, if your Nano isn’t turning itself off, it should still be possible to run with a lower-amperage power supply (albeit more slowly). Can you keep an eye on the memory usage with tegrastats or jtop? Do you have swap mounted, like shown here?
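For reference, mounting a 4 GB swap file usually looks like this on the Nano (a sketch following the swap setup recommended in the jetson-inference docs; the path `/mnt/4GB.swap` is just a convention):

```shell
# Create and enable a 4 GB swap file; training on the 4 GB Nano can
# otherwise exhaust memory and appear to hang.
sudo fallocate -l 4G /mnt/4GB.swap
sudo chmod 600 /mnt/4GB.swap
sudo mkswap /mnt/4GB.swap
sudo swapon /mnt/4GB.swap

# To make the swap persistent across reboots, add this line to /etc/fstab:
# /mnt/4GB.swap  none  swap  sw  0  0
```

You can then confirm the swap is active (and watch it during training) with `free -h`, `tegrastats`, or `jtop`.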

Also, which example from Hello AI World are you trying to train? Are you trying the cat/dog image classification example?
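In case it helps to compare, the cat/dog classification example is normally launched like this (paths per the Hello AI World tutorial; adjust them to wherever you downloaded the dataset):

```shell
# Run the cat/dog image classification training from Hello AI World.
cd jetson-inference/python/training/classification
python3 train.py --model-dir=models/cat_dog data/cat_dog
```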

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.