Can I use Jetson Nano to train a neural network?

Hi,
I went through the tutorials and some other GitHub repositories for the Jetson Nano, and it seems to me that the Jetson Nano can only be used for inference: the neural network is either trained using DIGITS in the cloud or pre-trained on a PC with a GPU. I am wondering if it’s possible to use the Jetson Nano for training with PyTorch (which I have installed, but don’t quite know how to use on the Jetson Nano).
Thanks for your help!

Hi leey10, yes, since you can install the full versions of PyTorch, TensorFlow, etc. on Nano, you are technically able to run training as well. It won’t be particularly fast, which is why you mostly see inferencing being discussed, but you could run transfer learning on a pre-trained network overnight.

I included a couple of results from the PyTorch ImageNet example in this blog post:

https://devblogs.nvidia.com/jetson-nano-ai-computing/

You may also want to mount a swap file in case you need extra memory for training.
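For reference, here is a minimal sketch of what transfer learning with PyTorch and torchvision could look like on the Nano. The dataset path, batch size, and hyperparameters below are placeholders you would adjust for your own data and memory budget:

```python
# Minimal transfer-learning sketch (dataset path and hyperparameters are placeholders).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet-style preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# "data/train" is a placeholder ImageFolder-style dataset (one subfolder per class).
train_set = datasets.ImageFolder("data/train", transform=preprocess)
# Keep batch size and worker count small to fit in the Nano's 4 GB of shared memory.
loader = torch.utils.data.DataLoader(train_set, batch_size=8, shuffle=True, num_workers=2)

# Start from a pre-trained backbone and retrain only the final classifier layer.
model = models.resnet18(pretrained=True)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.fc.parameters(), lr=0.001, momentum=0.9)

for epoch in range(5):  # a handful of epochs; an overnight run could do more
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

Freezing the backbone and training only the final layer keeps memory use and per-epoch time down, which matters given the Nano’s 4 GB of memory shared between CPU and GPU.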

Hi leey10,
I wrote a blog post on how to perform transfer-learning training on the Jetson Nano with PyTorch. Here is the link:
https://www.zaferarican.com/post/transfer-learning-training-on-jetson-nano-with-pytorch

I hope it helps

Nice! Thanks for sharing!

I am glad to hear this *can" be done locally (even if it is slow compared to using a beefy desktop or server for the training). There is something fundamentally important (to me at least) about having edge nodes be capable of that kind of self-sufficiency (like being able to train the networks they will later run, as well as being able to function as fully self-hosting Linux systems with the ability to rebuild their own kernel, applications, and so on from source).

Even if in practice people do, most of the time, use their larger and more powerful desktops or one or more remote systems “in the cloud”, I feel the self-hosting capabilities are vital, having seen far too many devices meant to tie in to cloud services become useless bricks when the network gets spotty, or when their original manufacturer folds or decides to deprecate the cloud service they require.

Maybe I’m just a pessimist, but when I evaluate a device of this nature I evaluate it with the assumption that it is only as useful and capable as it would be if it (and, for tiny embedded systems, a desktop for development and programming as well) were on a network island air-gapped from everything else. For some applications this is actually a reasonable first-order requirement, but even when it isn’t, thinking it through sheds light on how independent such a node can ever be and what level of resilience a system built atop it would have.

In this respect the Jetson Nano is a gem, in that its software stack is not built to lock users helplessly into any kind of service/subscription/etc. the way so many other device manufacturers do. The Jetson Nano as a truly self-hosting system is worthy of praise on those grounds, as well as the more obvious grounds of price, performance, and power consumption. Here’s to self-hosting systems!

Cheers,
-lars


Hi Lars, glad you find this topic interesting. FYI, there is an updated tutorial on training object detection models onboard Jetson available here:

https://github.com/dusty-nv/jetson-inference#training
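Before kicking off a long onboard training run, it can also be worth confirming that your PyTorch build actually sees the Nano’s GPU. A quick sanity check along these lines:

```python
import torch

# Confirm that the Jetson's GPU is visible to PyTorch before starting a long training run.
print(torch.__version__)
print(torch.cuda.is_available())           # should print True on a correctly installed build
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # e.g. the Nano's integrated Maxwell GPU
```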