GPU usage

hello all, just set up CUDA on my laptop (Quadro M1200) and I'm running a VGG deep learning program to train on 23000 images with batch size = 32.
Below is the output of nvidia-smi. Firstly, I don't understand why it's showing 4096 MiB as the available memory, as I checked in dxdiag and it reports the M1200 as having available memory of 20375 MB.
Also, why is it utilizing only 2914 MiB?

Apologies for the rookie questions.

Sun Oct 29 11:52:23 2017
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 385.54                 Driver Version: 385.54                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  Quadro M1200       WDDM  | 00000000:01:00.0 Off |                  N/A |
| N/A   65C    P0    N/A /  N/A |   2914MiB /  4096MiB |    100%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0      5936    C   …ta\Local\Continuum\anaconda3\python.exe       N/A    |
+-----------------------------------------------------------------------------+

The Quadro M1200 does indeed have 4GB of memory. See here, for example:

nvidia-smi is a utility provided by NVIDIA itself; it would be exceedingly unlikely to report an incorrect amount of GPU memory. As for why your application uses only 2914 MiB out of the 4 GB available, you would have to ask the application vendor. It is entirely possible that whatever problem you gave the app to work on simply does not require any more memory to handle.
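If you want to check the numbers programmatically rather than eyeballing the table, nvidia-smi has a machine-readable mode, e.g. `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader`. Below is a minimal Python sketch that parses one line of that CSV output; the sample line is hard-coded with the numbers from the post above so the sketch runs without a GPU.

```python
# Parse one line of nvidia-smi's CSV query output, e.g. produced by:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader
# Sample line hard-coded so this runs without a GPU present.
sample = "2914 MiB, 4096 MiB"

def parse_mem(line):
    """Return (used_mib, total_mib) from a 'used, total' CSV line."""
    used_s, total_s = (field.strip() for field in line.split(","))
    used = int(used_s.split()[0])    # drop the 'MiB' unit suffix
    total = int(total_s.split()[0])
    return used, total

used, total = parse_mem(sample)
print(f"Used {used} MiB of {total} MiB ({100 * used / total:.0f}%)")
# → Used 2914 MiB of 4096 MiB (71%)
```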

Thanks njuffa. The code ran for 1 hr 45 min, so I was thinking that if it had used more memory then maybe it would have been faster.
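A note on that intuition: the memory footprint is determined by the model and the batch size, not tuned to fill the card, and the 100% GPU-Util in the output above suggests the GPU is already compute-bound rather than starved. As a back-of-the-envelope sketch (assuming the standard 224x224 RGB float32 VGG input; actual usage is dominated by activations and weights on top of this), here is what one input batch alone costs:

```python
# Rough memory cost of one batch of input images for VGG-style training.
# Assumes 224x224 RGB float32 inputs (the standard VGG input size);
# activations and weights allocated by the framework add far more.
batch, h, w, c = 32, 224, 224, 3
bytes_per_float = 4

batch_bytes = batch * h * w * c * bytes_per_float
print(f"Input batch: {batch_bytes / 2**20:.1f} MiB")
# Doubling the batch size roughly doubles the activation memory,
# which is one way to trade memory for (possibly) throughput.
```

So increasing batch size is the usual knob that trades more memory for potentially higher throughput, but with utilization already at 100% the gain may be small.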