On my GPU #0, 11341MiB of GPU RAM is in use, yet nvidia-smi lists no running processes. How is that possible, and how can I get the memory back? Here is the nvidia-smi output:
Thu Aug 18 14:27:58 2016
+------------------------------------------------------+
| NVIDIA-SMI 352.63     Driver Version: 352.63         |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX TIT...  Off  | 0000:02:00.0     Off |                  N/A |
| 29%   61C    P2    71W / 250W |  11341MiB / 12287MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX TIT...  Off  | 0000:03:00.0     Off |                  N/A |
| 22%   42C    P0    71W / 250W |     23MiB / 12287MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
|   2  GeForce GTX TIT...  Off  | 0000:82:00.0     Off |                  N/A |
| 22%   35C    P0    69W / 250W |     23MiB / 12287MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
|   3  GeForce GTX TIT...  Off  | 0000:83:00.0     Off |                  N/A |
|  0%   33C    P0    60W / 250W |     23MiB / 12287MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
I had launched a Theano Python script with the lib.cnmem=0.9 flag, which explains why it used 11341MiB of GPU memory (CNMeM is a "simple library to help the Deep Learning frameworks manage CUDA memory"). However, I killed the script and expected the GPU memory to be released. pkill -9 python did not help.
I am using GeForce GTX Titan GPUs (Maxwell architecture) on Ubuntu 14.04.4 LTS x64.
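For reference, one way to check for a hidden process that still holds the GPU, even when nvidia-smi shows nothing, is to list open handles on the NVIDIA device files. This is a diagnostic sketch, not a confirmed fix; fuser comes from the psmisc package on Ubuntu and needs sudo to see other users' processes:

```shell
# List any processes still holding the NVIDIA device files.  A process
# that escaped "pkill -9 python" (for example one owned by another user,
# or a stale child process) keeps its CUDA context -- and therefore the
# GPU memory -- even though nvidia-smi shows "No running processes found".
fuser -v /dev/nvidia* 2>/dev/null \
  || echo "no open handles found (or no /dev/nvidia* nodes on this machine)"
# If a PID shows up, "kill -9 <PID>" should release the memory; if none
# does, reloading the kernel module (rmmod/modprobe nvidia) is the usual
# last resort, assuming no display server is using the card.
```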