Mode Switching and CUDA memory

I was looking at the new GTX 295 cards today (they have a ton of memory!), and it got me thinking…

It has been covered extensively on this forum that if the video mode switches, the card may overwrite memory currently in use by a CUDA program and cause it to crash. Since some of the new cards have far more memory than a regular display driver (i.e., normal Windows usage) will ever need, why not have the driver detect the maximum amount of memory the Windows display driver could possibly use, as determined by the maximum color bit depth and resolution? Then just disallow CUDA programs from allocating that memory, and the problem should go away.

The reason I ask is that it seems so simple that I'm sure it must have been brought up before… so what is the problem that would make it not work? Or, if it would work, why hasn't NVIDIA included this workaround yet?

That sounds like one approach. However, I think that on Vista it isn't even necessary, since VRAM is virtualized there and, in a conflict, can simply be swapped out to system RAM. NVIDIA has several options for fixing this, and I hope they intend to!