Maximum allocation size? Workspace issues

Hello!

I am trying to create a workspace of GPU memory. The purpose of this workspace is to help head off memory errors before they happen (then, hopefully, my code can die more gracefully than it currently does).

I am attempting to allocate the largest possible amount of global memory on the GPU in a single allocation, but my allocations seem to fail at a size that is much, much smaller than what is returned by cudaGetDeviceProperties->totalGlobalMem. Does anyone know the largest single block I can allocate?

I am using a Tesla C2070, which reports totalGlobalMem = 5636292608 bytes. I cannot allocate more than about 40% of that.
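For reference, here is roughly the kind of allocation loop I have in mind (a simplified sketch, not my actual code). It uses the CUDA runtime calls cudaMemGetInfo and cudaMalloc to query free/total device memory and then tries progressively smaller requests until one succeeds; the 10% back-off step is just for illustration.

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    size_t free_bytes = 0, total_bytes = 0;

    /* cudaMemGetInfo reports free/total memory on the current device;
       free_bytes is usually a better starting point than totalGlobalMem. */
    cudaError_t err = cudaMemGetInfo(&free_bytes, &total_bytes);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaMemGetInfo failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("free = %zu bytes, total = %zu bytes\n", free_bytes, total_bytes);

    /* Try to grab the largest single block we can, backing off on failure. */
    size_t request = free_bytes;
    void  *workspace = NULL;
    while (request > 0) {
        err = cudaMalloc(&workspace, request);
        if (err == cudaSuccess)
            break;
        cudaGetLastError();          /* clear the error before retrying */
        request = request / 10 * 9;  /* shrink the request by ~10% (arbitrary step) */
    }

    if (workspace != NULL) {
        printf("largest successful single allocation: %zu bytes\n", request);
        cudaFree(workspace);
    } else {
        printf("could not allocate any workspace\n");
    }
    return 0;
}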

Thanks,
Jeremiah

Never mind. I found a bug in my code.