I decided to follow one of the suggestions in this forum thread [url]http://www.nvnews.net/vbulletin/showthread.php?t=177732&page=7[/url] and it seems to be working. After disabling the thermal monitor in nvidia-settings, the simulation got past the point where the memory leak used to occur without any problems. I will run longer simulations and post the results back here…
After running a few tests with MSI enabled and the thermal monitor disabled in nvidia-settings, I can say that these changes do not solve the memory leak problem, but they clearly provide a tremendous improvement. The leak no longer shows up every run, and when it does happen it takes much longer to appear than before…
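In case it helps anyone wanting to try the MSI part: the NVreg_EnableMSI module parameter is a real option of the nvidia kernel module, but the config file path below is distro-dependent, so adjust it for your system. A minimal sketch of what I mean:

[code]
# /etc/modprobe.d/nvidia.conf (path may differ on your distro)
# Ask the nvidia kernel module to use message-signaled interrupts
# instead of legacy line-based IRQs
options nvidia NVreg_EnableMSI=1
[/code]

After reloading the module (or rebooting), you can check whether MSI is actually in use: the nvidia line in /proc/interrupts should show an interrupt type like PCI-MSI instead of a shared legacy IRQ.

[code]
grep -i nvidia /proc/interrupts
[/code]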
It seems that there is something strange in the way the nvidia driver is handling interrupts…