GPU soft-reset / freeing all resources possible?

Hi everybody,

My application repeatedly allocates and deallocates large chunks of GPU memory (up to 200 MB; unfortunately there is no easy way to avoid that). Part of the application also uses OpenGL. The problem: cudaMalloc calls fail seemingly at random even though cuMemGetInfo reports enough free memory (e.g., a 200 MB allocation fails while 300 MB are reported free). The failure rate depends on the delay between application launches (waiting longer between launches reduces it) and on how often I repeat the allocation/deallocation cycle (later cycles within one launch fail more often).

I suspect GPU memory fragmentation is the culprit. Hence my question: is there any programmatic way (a CUDA call, OpenGL, or some other interface) to force a GPU soft reset or to otherwise free all allocated GPU memory? The application will be the only program running, so I could afford to lose all allocated GPU resources, as long as the OS is not affected. I am running Windows XP.

Any ideas? Let me know if you need further details.

Cheers, and thanks,
ehan6

cudaThreadExit() ?

What driver are you using?

Sorry for the late reply, somehow my email alerts for new posts didn’t quite work …

cudaThreadExit() does free all resources that were allocated through CUDA. Apparently, though, it does not free resources that were previously allocated through OpenGL. I can clearly see that running GPU-memory-intensive tasks before my CUDA algorithms makes the CUDA part fail randomly with ‘out of memory’ errors; when I skip the OpenGL part, the CUDA algorithms run fine. As far as I can tell, though, the OpenGL code deallocates all of its resources on exit, so that’s a bit puzzling. Also, when I run the OpenGL part, wait a few minutes, and then run the CUDA part, it usually works fine, too. A vague guess would be that the driver performs some clean-up at regular intervals, so some resources are not freed on the spot.

I am running Microsoft Windows XP SP2 with driver version 197.59.

Thanks,
ehan6

bump …

ehan6, did you get to solve this? I’m having the same problem…

No, I haven’t solved this yet. Sorry.

On Linux, rmmod + insmod of nvidia.ko will do the trick.
On Vista/Win7, you can reset the driver using devcon (http://support.microsoft.com/kb/311272).
On WinXP… I don’t know. Ideally, when you quit a graphics app, the driver’s video-memory heap should return to its previous state and there should be no fragmentation. Maybe you should open Task Manager and kill all clients that might be using the heap.

Thanks! That’s the first real answer I’ve gotten to this question. However, I would like to do the ‘reset’ without killing the application itself. Maybe that’s asking a bit too much.

With Fermi you may use a custom memory manager.

Hi Lev,

could you please explain that in more detail?

Thanks,

Tuan