How to find out how much global memory is being used?

How do I find out how much global memory I have allocated on the device, without manually adding up all of my allocations?

It would be useful to see the total memory allocated and the percentage of device memory in use after running an application on the GPU. This information doesn't appear to be available through the CUDA profiler.

The driver API function cuMemGetInfo() should give you what you need. It reports the free and total memory in bytes for the device of the current context, so the amount in use is total minus free.
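Here is a minimal sketch of how it could be called from the driver API. Device 0 and the file/compile command are assumptions, and error checking is omitted for brevity; compile with nvcc (or gcc with the CUDA include path) and link against -lcuda.

```c
/* Minimal sketch: query free/total device memory via the driver API.
   Assumes device 0; error checking omitted. */
#include <cuda.h>
#include <stdio.h>

int main(void)
{
    CUdevice dev;
    CUcontext ctx;
    size_t free_bytes, total_bytes;

    cuInit(0);
    cuDeviceGet(&dev, 0);        /* device 0 is an assumption */
    cuCtxCreate(&ctx, 0, dev);

    /* Free and total memory (in bytes) on the device of the current
       context; "used" is everything that is not free, which includes
       context overhead and allocations made by other processes. */
    cuMemGetInfo(&free_bytes, &total_bytes);

    size_t used_bytes = total_bytes - free_bytes;
    printf("Used : %zu bytes (%.1f%% of device memory)\n",
           used_bytes, 100.0 * (double)used_bytes / (double)total_bytes);
    printf("Free : %zu bytes\n", free_bytes);
    printf("Total: %zu bytes\n", total_bytes);

    cuCtxDestroy(ctx);
    return 0;
}
```

Note that the numbers are device-wide, not per-allocation: they include CUDA context overhead and memory held by other processes, so this won't give you an exact sum of your own cuMemAlloc()/cudaMalloc() calls. If your application uses the runtime API, cudaMemGetInfo() returns the same free/total values without having to create a context explicitly.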