Let’s say you are training a model or doing some other GPU work.
How can you check the remaining GPU memory on a Jetson Nano from Python?
Ideally there would be a ready-made function in numba, TensorFlow, PyTorch, etc., something like:
from something import something_showing_Jetson_Nano_GPU_Memory
total, used, remaining = something_showing_Jetson_Nano_GPU_Memory()
print(f"total {total} GB used {used} GB remaining {remaining} GB")
total 4.00 GB used 3.51 GB remaining 0.47 GB
It looks like the memory information is available in /proc/meminfo; for details see https://github.com/ItsSiddharth/Py_Monitor_JetsonTX2/blob/master/Py_Monitor_JetsonTX2/__init__.py
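Since the Jetson Nano uses unified memory (the GPU shares system RAM rather than having dedicated VRAM), parsing /proc/meminfo is a reasonable approximation of what is left for CUDA allocations. A minimal sketch, assuming a standard Linux /proc/meminfo layout (the function name jetson_nano_memory_gb is my own, not from any library):

```python
def jetson_nano_memory_gb():
    # The Nano's GPU shares system RAM, so /proc/meminfo
    # approximates what remains available for GPU allocations.
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            info[key] = int(value.split()[0])  # values are in kB
    total = info["MemTotal"] / 1024**2        # kB -> GB
    remaining = info["MemAvailable"] / 1024**2
    used = total - remaining
    return total, used, remaining

total, used, remaining = jetson_nano_memory_gb()
print(f"total {total:.2f} GB used {used:.2f} GB remaining {remaining:.2f} GB")
```

If you are already using numba with CUDA initialized, `numba.cuda.current_context().get_memory_info()` also returns a (free, total) pair in bytes, though on the Nano both views ultimately describe the same shared memory pool.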