Jetson Nano: most GPU memory is not available

Hello! It seems like something is using the video memory on my Jetson Nano and I can’t figure out what.

Since there is no nvidia-smi on the Jetson Nano, I check video memory using TensorFlow. When a tf.Session is initialized, it logs:

2019-09-03 04:21:19.118556: I tensorflow/compiler/xla/service/service.cc:168]   StreamExecutor device (0): NVIDIA Tegra X1, Compute Capability 5.3
2019-09-03 04:21:19.118912: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1433] Found device 0 with properties:
name: NVIDIA Tegra X1 major: 5 minor: 3 memoryClockRate(GHz): 0.9216
pciBusID: 0000:00:00.0
totalMemory: 3.86GiB freeMemory: 2.86GiB
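
For reference, the same figures can also be read programmatically via TF’s device listing (a minimal sketch, assuming TF 1.x as in the log above; list_devices.py is just a throwaway name, and listing devices itself initializes the GPU):

list_devices.py:

# Minimal sketch, assuming TensorFlow 1.x (as in the log above).
# Listing devices initializes the GPU, so run it on an otherwise idle system.
from tensorflow.python.client import device_lib

for dev in device_lib.list_local_devices():
    # memory_limit is in bytes; for the GPU entry it is roughly the free memory TF saw at startup
    print(dev.name, dev.device_type, "%.2f MiB" % (dev.memory_limit / 1048576.0))

$ python3 list_devices.py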

This is a clean system right after a reboot, with the GUI disabled as described here: https://devtalk.nvidia.com/default/topic/1050739/how-to-boot-jetson-nano-in-text-mode-/

I tried rebooting and switching Xorg off in case it was taking the memory, but nothing helps.
I also tried https://github.com/rbonghi/jetson_stats/wiki/jtop to monitor GPU usage, but it doesn’t show the memory.

Can you please tell me what the problem might be, or suggest a program that shows which processes are using the GPU?


Hi gribabas, the memory could be used by Linux for the system, or TensorFlow itself may be allocating it, since TensorFlow’s default policy is to allocate GPU memory up-front. See this post for the TensorFlow settings that change its memory allocation behavior:

Out of memory error from TensorFlow: any workaround for this, or do I just need a bigger boat? - Jetson Nano - NVIDIA Developer Forums
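
For example, with TF 1.x (matching the log above) the allocator can be told to grow on demand instead of grabbing the memory up-front; a minimal sketch, not taken from that thread:

import tensorflow as tf

config = tf.ConfigProto()
config.gpu_options.allow_growth = True          # grow GPU allocations on demand
# config.gpu_options.per_process_gpu_memory_fraction = 0.5  # or cap the fraction TF may use

sess = tf.Session(config=config)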

To test, you could run tegrastats in the background after a fresh reboot, or you could use the cudaMemGetInfo() function.

I compiled the following code, taken from here: https://devtalk.nvidia.com/default/topic/491518/cuda-programming-and-performance/cudamemgetinfo-how-does-it-work-33-/post/3522842/#3522842

mem.c:

#include <stdio.h>
#include <cuda_runtime.h>

/* Print free/total/used device memory as reported by the CUDA runtime. */
void checkGpuMem() {
  size_t free_t, total_t;

  cudaError_t err = cudaMemGetInfo(&free_t, &total_t);
  if (err != cudaSuccess) {
    fprintf(stderr, "cudaMemGetInfo failed: %s\n", cudaGetErrorString(err));
    return;
  }

  /* convert bytes to MiB */
  double free_m  = free_t  / 1048576.0;
  double total_m = total_t / 1048576.0;
  double used_m  = total_m - free_m;

  printf("  mem free %f MB mem total %f MB mem used %f MB\n", free_m, total_m, used_m);
}

int main() {
  checkGpuMem();
  return 0;
}
$ nvcc mem.c -o mem
$ ./mem

After a fresh reboot without the GUI, it prints:

mem free 2968.000000 MB mem total 3964.359375 MB mem used 996.359375 MB

So about 1 GB is still occupied.

I also found this message: https://devtalk.nvidia.com/default/topic/1032815/jetson-tx2/how-can-i-see-gpu-memory-used-/post/5259244/#5259244
so I tried to occupy all the RAM and got the following result:

mem free 120.417969 MB mem total 3964.359375 MB mem used 3843.941406 MB
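
For reference, one simple way to hold host RAM while re-running ./mem in another terminal (a rough sketch; the 2 GiB amount and the name fill_ram.py are arbitrary):

fill_ram.py:

# Hold ~2 GiB of host RAM; on the Nano the GPU-side "free" figure from ./mem
# should drop by roughly the same amount, since CPU and GPU share physical memory.
block = bytearray(2 * 1024 ** 3)
for i in range(0, len(block), 4096):  # touch every page so the memory is actually committed
    block[i] = 1
input("Holding 2 GiB; run ./mem in another terminal, then press Enter to release...")

$ python3 fill_ram.py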

Do I understand correctly that RAM and video memory are the same on the Jetson?

Yes, that is correct; the memory is shared between the CPU and GPU. Since this is right after a fresh reboot, that memory is likely being used by Linux for the OS and Ubuntu. You could cross-reference the amount of used/free memory with tegrastats or /proc/meminfo.
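
For a quick cross-check, something like this prints the kernel’s view (a minimal sketch; meminfo.py is just a placeholder name):

meminfo.py:

# Print the kernel's view of system memory so it can be compared with
# the totals reported by cudaMemGetInfo() / tegrastats.
with open("/proc/meminfo") as f:
    info = dict(line.split(":", 1) for line in f)

for key in ("MemTotal", "MemFree", "MemAvailable"):
    kb = int(info[key].strip().split()[0])  # values are listed in kB
    print("%-12s %.1f MB" % (key, kb / 1024.0))

$ python3 meminfo.py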
