Why does cudaMemGetInfo also occupy a lot of GPU memory?

When I use cudaMemGetInfo to query the amount of free and used GPU memory, it immediately occupies about 213 MB on my GPU (checked using nvidia-smi). I don't know why, and it's not documented. Someone else seems to have encountered the same problem: cuda - Free memory occupied by cudaMemGetInfo - Stack Overflow

The answer is given at the link you already provided.

When you use CUDA for any purpose, the driver allocates a significant amount of GPU memory for overhead. This is covered in many places, including the article you linked.

The memory isn't associated with cudaMemGetInfo itself. It is associated with the CUDA context that must be created to support CUDA running on that GPU.
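You can see this behavior with a minimal sketch like the one below (requires a CUDA-capable GPU and the CUDA toolkit to build with nvcc; the ~213 MB figure varies by GPU, driver, and CUDA version). Any runtime API call that touches the device, including cudaMemGetInfo, lazily creates the context, which is what nvidia-smi then reports as used memory:

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeBytes = 0, totalBytes = 0;

    // This first runtime API call triggers lazy creation of the CUDA
    // context on device 0. From this point on, nvidia-smi will show
    // this process holding GPU memory, even though no cudaMalloc has
    // been issued: the memory belongs to the context (driver state,
    // kernel images, heaps), not to cudaMemGetInfo itself.
    cudaError_t err = cudaMemGetInfo(&freeBytes, &totalBytes);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaMemGetInfo failed: %s\n",
                cudaGetErrorString(err));
        return 1;
    }

    printf("free: %zu MiB, total: %zu MiB\n",
           freeBytes >> 20, totalBytes >> 20);

    // Pause here (e.g. getchar()) and inspect nvidia-smi in another
    // terminal to observe the per-process context overhead.
    return 0;
}
```

The same overhead appears with any other first runtime call (cudaFree(0) is a common idiom to force context creation explicitly), so there is nothing special about cudaMemGetInfo.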

