I have an application which uses DXGI Output Duplication for screen capture. I see memory leaks only with NVidia hardware, but also only in a very specific situation.
When capturing a full-screen application, alt-tabbing away from it causes the call to AcquireNextFrame() to return DXGI_ERROR_ACCESS_LOST (an error that can occur in other situations too). Currently, after seeing this error a dozen or so times, my application releases and re-initializes the DXGI subsystem.
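For reference, this is roughly the shape of my acquire loop and error handling. Names like AcquireOneFrame and the threshold of 12 are illustrative, not the exact production code:

```cpp
#include <dxgi1_2.h>
#include <d3d11.h>

// Returns false when the caller should tear down and re-create the DXGI stack.
bool AcquireOneFrame(IDXGIOutputDuplication* duplication, int& accessLostCount)
{
    DXGI_OUTDUPL_FRAME_INFO frameInfo = {};
    IDXGIResource* desktopResource = nullptr;

    HRESULT hr = duplication->AcquireNextFrame(500, &frameInfo, &desktopResource);
    if (hr == DXGI_ERROR_WAIT_TIMEOUT)
        return true; // no new frame within the timeout; not an error

    if (hr == DXGI_ERROR_ACCESS_LOST) {
        // Seen on alt-tab away from a full-screen app, mode switches, etc.
        // After a dozen or so of these in a row we give up and re-init.
        return ++accessLostCount <= 12;
    }
    if (FAILED(hr))
        return false;

    accessLostCount = 0;
    // ... copy/process desktopResource here ...
    desktopResource->Release();
    duplication->ReleaseFrame();
    return true;
}
```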
I've spent a while looking for an unreleased resource or something similar that could be responsible for the leak; however, it doesn't occur on AMD machines, so I'm wondering if it's a corner case and possibly a leak in the supporting drivers.
Today I release all the buffers and shaders, call ClearState() and Flush() on the device context, then Release() the context, and finally the device itself along with the output duplication object. Re-init starts all the way back at D3D11CreateDevice()…
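The teardown described above looks roughly like this (a sketch; the function name and exact release order of app-owned resources are illustrative):

```cpp
#include <dxgi1_2.h>
#include <d3d11.h>

void TeardownDxgi(ID3D11Device*& device, ID3D11DeviceContext*& context,
                  IDXGIOutputDuplication*& duplication)
{
    // App-owned buffers, textures, and shaders are released first (not shown).
    if (duplication) { duplication->Release(); duplication = nullptr; }
    if (context) {
        context->ClearState();  // unbind everything from the pipeline
        context->Flush();       // submit any queued commands before release
        context->Release();
        context = nullptr;
    }
    if (device) { device->Release(); device = nullptr; }
    // Re-initialization then starts over at D3D11CreateDevice().
}
```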
So when I trigger these re-inits, after doing this 100 times or so, gigabytes of memory are leaked. Again, nothing leaks on AMD hardware. We have tested this on both older NVidia cards and an NVidia 2080; we have a lot of different test machines for our application with different specs and software versions.