Program behaves differently when run under the Nsight graphics debugger

I have a program that frequently renders corrupt frames: a large portion of the frames are very obviously wrong, e.g., half the screen renders black, or portions render skewed. However, when the program runs under Nsight's graphics debugger, all of the graphical errors go away. So my question is: how does the debugger change the rendering environment? I'm hoping that knowing what changes will narrow down where the app is failing to use the DX11 API correctly and thus trashing rendering.

CPU debuggers do not make the problem go away, but attaching any GPU debugger does.

If this information is covered somewhere in the user documentation, I would love a link to the relevant section. I wasn't able to find it myself, obviously.