How to confirm jetson-inference train.py is using the GPU for processing

I am following the Hello AI World tutorial (being new to the NVIDIA Jetson world and to AI generally).

(jetson-inference/pytorch-cat-dog.md at master · dusty-nv/jetson-inference · GitHub)

When retraining my first model (cat_dog), everything works, but jtop shows that the CPU, not the GPU, appears to be doing all the work.

I am running JetPack 4.4.1 and using the Docker containers from dusty-nv's GitHub.

Firstly, should the GPU be under load when retraining?

Secondly, if yes, what is the best way to check the GPU is set up correctly?

Thanks for reading.

Quick update: it appears that jtop had merely frozen on me and was not displaying the GPU usage and temperature rise whilst training.

I have restarted jtop and run another epoch of training, and it shows that the GPU is indeed working hard.
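
For anyone else who wants to double-check, here is a minimal sketch (my own, not from the tutorial) that confirms the PyTorch build inside the container can actually see the Jetson's GPU:

```python
# Sanity check (illustrative sketch, not part of train.py):
# confirms that PyTorch inside the container can see the Jetson's GPU.
import torch

print("CUDA available:", torch.cuda.is_available())   # should print True
print("Device count:", torch.cuda.device_count())     # 1 on a Jetson
if torch.cuda.is_available():
    print("Device name:", torch.cuda.get_device_name(0))
```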

OK, gotcha. And yes, the PyTorch training from Hello AI World will automatically use the GPU. For example, the code transfers the model to the GPU so that CUDA/cuDNN is used:
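
Something along these lines (a paraphrased sketch of the relevant step, not the literal train.py code; variable names and the default ResNet-18 architecture are assumptions based on the tutorial):

```python
# Sketch of moving the model to the GPU so CUDA/cuDNN does the heavy lifting.
import torch
import torchvision.models as models

gpu = 0                                   # the Jetson's single GPU
model = models.resnet18(pretrained=True)  # the tutorial retrains ResNet-18 by default
model = model.cuda(gpu)                   # transfer the model's weights to GPU memory
criterion = torch.nn.CrossEntropyLoss().cuda(gpu)  # loss is computed on the GPU as well
```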

The input tensors also get put on the GPU. For these examples, I have the default GPU hardcoded to 0 (since the Jetson has one GPU).
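
Continuing the sketch above, inside the training loop each batch of inputs is also moved to GPU 0 before the forward pass (again illustrative, not the literal code from train.py):

```python
# Sketch of how each batch ends up on the GPU inside the training loop
# (variable names are illustrative; train_loader is a standard PyTorch DataLoader).
for images, target in train_loader:
    images = images.cuda(gpu, non_blocking=True)   # input tensors moved to GPU 0
    target = target.cuda(gpu, non_blocking=True)   # labels moved to GPU 0

    output = model(images)            # forward pass runs on the GPU
    loss = criterion(output, target)  # loss also computed on the GPU
```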