Display on Intel / compute on NVIDIA -> why does the screen flash?

I am successfully displaying Windows 7 using the integrated Intel GPU. I made sure that in the BIOS of my ASROCK H67M board the integrated graphics is set as enabled/primary with multiple-display support. Meanwhile I have a GT710 card installed and am developing CUDA code on it.

The problem is that when I compute (run CUDA on the GeForce card), Windows 7 flashes, goes dark, and gives me a “Graphics Not Responding” warning.

Does anybody understand the innards of how Windows graphics works, and why it cares what’s happening on the NVIDIA chip? It seems like there is still a shared frame buffer or bus or something, but I’ve read through many posts and nobody really explains why we can’t cleanly isolate multiple GPUs in a Windows computing system.

A VGA device (i.e. something that publishes a VGA class code in PCI config space) that has a proper WDDM driver loaded will be included by Windows in the WDDM system, whether it is hosting a display or not. This is by Microsoft’s design, and GPU manufacturers have no control over it.

Windows WDDM has a watchdog timer (the TDR mechanism) which monitors all WDDM GPUs in a system. If any of them stops responding for longer than the timeout (two seconds by default), you will observe an error message and the system will reset the GPU driver. A long-running CUDA kernel on your GT710 trips this watchdog even though that GPU is not driving a display, which is why the screen flashes and goes dark.
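For reference, TDR behavior is controlled by registry values documented by Microsoft under the GraphicsDrivers key. Below is a sketch of a .reg file that raises the timeout; the exact values (here 60 seconds) are an assumption for illustration, a reboot is required, and weakening TDR can leave the desktop unresponsive during long kernels, so treat this as a debugging aid only:

```
Windows Registry Editor Version 5.00

; Timeout Detection and Recovery (TDR) settings.
; TdrDelay: seconds a GPU may be unresponsive before recovery (default 2).
; TdrLevel: 0 = detection disabled, 3 = default (timeout detected, driver recovered).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000003c
"TdrLevel"=dword:00000003
```

Raising TdrDelay (rather than disabling detection with TdrLevel=0) keeps the recovery safety net while giving longer kernels room to finish.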

If you want to cleanly isolate a GPU in a Windows system, it is necessary to remove it from the WDDM subsystem. NVIDIA’s TCC driver model does exactly this: the GPU is treated as a pure compute device and is no longer subject to the WDDM watchdog. However, TCC mode is only supported on certain GPUs (broadly, Tesla and most Quadro products); it is not supported on your GT710.
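On GPUs that do support it, the driver model can be switched with nvidia-smi. A sketch of the usage, from the nvidia-smi documentation (run in an elevated command prompt; a reboot is required for the change to take effect):

```
:: Show the current driver model for GPU 0
nvidia-smi -i 0 --query-gpu=name,driver_model.current --format=csv

:: Switch GPU 0 to TCC (-dm: 0 = WDDM, 1 = TCC); requires admin rights
nvidia-smi -i 0 -dm 1
```

On a GPU without TCC support (such as the GT710), the second command will simply report that the operation is not supported.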

If you want to avoid the WDDM watchdog on a Windows system with a GeForce GPU, the only option is to modify the TDR (watchdog) settings. One possible method to do this is covered in Nsight VSE, see here: