Hello all, I just set up CUDA on my laptop (Quadro M1200) and I'm running a VGG deep-learning program to train on 23,000 images with batch size = 32.
Below is the output of nvidia-smi. First, I don't understand why it shows 4096 MiB as the available memory, since dxdiag reports the M1200 as having 20375 MB of available memory.
Also, why is it using only 2914 MiB?
Apologies for the rookie questions.
C:\>nvidia-smi
Sun Oct 29 11:52:23 2017
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 385.54                 Driver Version: 385.54                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Quadro M1200        WDDM | 00000000:01:00.0 Off |                  N/A |
| N/A   65C    P0    N/A /  N/A |   2914MiB /  4096MiB |    100%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0      5936    C   ...ta\Local\Continuum\anaconda3\python.exe        N/A |
+-----------------------------------------------------------------------------+
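
For context on the 2914 MiB figure, here is a rough back-of-envelope estimate of what the network itself needs. This is a sketch, assuming the model is VGG-16 with 224x224x3 inputs and float32 storage (the post doesn't say which VGG variant is used); it counts only parameters and forward activations, while a real framework also allocates gradients, optimizer state, and cuDNN workspace, so actual usage will be higher than the activation figure alone:

```python
# Back-of-envelope memory estimate for VGG-16 at batch size 32.
# Assumptions (not from the original post): VGG-16 architecture,
# 224x224x3 inputs, float32 (4 bytes per element).

# Conv layer widths; 'M' marks a 2x2 max-pool that halves H and W.
cfg = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M',
       512, 512, 512, 'M', 512, 512, 512, 'M']

batch = 32
h = w = 224
in_ch = 3
params = 0                      # float32 parameter count
acts = batch * h * w * in_ch    # activation elements, input included

for v in cfg:
    if v == 'M':
        h //= 2
        w //= 2
        acts += batch * h * w * in_ch       # pooled feature map
    else:
        params += in_ch * v * 3 * 3 + v     # 3x3 conv weights + bias
        in_ch = v
        acts += batch * h * w * v           # conv output feature map

# Fully connected layers: 512*7*7 = 25088 -> 4096 -> 4096 -> 1000
for n_in, n_out in [(512 * 7 * 7, 4096), (4096, 4096), (4096, 1000)]:
    params += n_in * n_out + n_out
    acts += batch * n_out

print(f"parameters: {params / 1e6:.1f} M  (~{params * 4 / 2**20:.0f} MiB)")
print(f"forward activations: ~{acts * 4 / 2**20:.0f} MiB for batch {batch}")
# -> parameters: 138.4 M  (~528 MiB)
# -> forward activations: ~1860 MiB for batch 32
```

Parameters plus forward activations alone come to roughly 2.4 GiB here, so once gradients and framework workspace are added, a usage figure near 2914 MiB on a 4 GiB card is plausible rather than underutilization.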