Hi NVIDIA Team,
A couple of months ago we enhanced our D3D11.3-based engine to use tiled resources. During testing we observed strange GPU memory consumption after resizing those tile pools: when mapping tiles from regions beyond a tile pool's initial size, our GPU memory footprint explodes.
Here is our repro case:
bool forceLeak = true;
// -> Sysinternals Process Explorer: dedicated GPU MEM: 4.8 MB ( GeForce RTX 2080 Ti )
// -> Sysinternals Process Explorer: dedicated GPU MEM: 3.6 MB ( GeForce GTX Titan )
TiledResourceLeakRepoCase( device, context, forceLeak );
// -> Sysinternals Process Explorer: dedicated GPU MEM: 708.8 MB ( GeForce RTX 2080 Ti )
// -> Sysinternals Process Explorer: dedicated GPU MEM: 387.6 MB ( GeForce GTX Titan )
TiledResourceLeakRepoCase.inc.txt (4.0 KB)
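For context, the attached repro essentially follows the pattern below. This is only a minimal sketch of the steps involved (create a tiled buffer and a small tile pool, grow the pool with ResizeTilePool, then map a tile to a pool offset beyond the pool's initial size); the sizes, names, and omitted error handling are our assumptions here, not the exact contents of the attached file:

```cpp
#include <d3d11_2.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch of the repro pattern (see the attached .inc.txt for the real code):
// mapping tiles at pool offsets past the pool's *initial* size is where the
// dedicated GPU memory consumption explodes.
void TiledResourceLeakSketch(ID3D11Device2* device, ID3D11DeviceContext2* context)
{
    const UINT tileSize = 65536; // D3D11 tiles are always 64 KiB

    // Tiled (reserved) buffer: virtual address space only, no memory committed yet.
    D3D11_BUFFER_DESC bufDesc = {};
    bufDesc.ByteWidth = 1024 * tileSize; // 64 MiB of virtual tiles
    bufDesc.Usage     = D3D11_USAGE_DEFAULT;
    bufDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    bufDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
    ComPtr<ID3D11Buffer> tiledBuffer;
    device->CreateBuffer(&bufDesc, nullptr, &tiledBuffer);

    // Small initial tile pool: 16 tiles = 1 MiB of physical memory.
    D3D11_BUFFER_DESC poolDesc = {};
    poolDesc.ByteWidth = 16 * tileSize;
    poolDesc.Usage     = D3D11_USAGE_DEFAULT;
    poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
    ComPtr<ID3D11Buffer> tilePool;
    device->CreateBuffer(&poolDesc, nullptr, &tilePool);

    // Grow the pool well beyond its initial size.
    context->ResizeTilePool(tilePool.Get(), 1024ull * tileSize);

    // Map one tile of the buffer to a pool offset that lies past the
    // pool's initial 16 tiles -- this is the case that triggers the blow-up.
    D3D11_TILED_RESOURCE_COORDINATE coord = {}; // tile 0 of the buffer
    D3D11_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;
    UINT rangeFlags = 0;
    UINT poolStartOffset = 512; // tile offset beyond the initial pool size
    UINT rangeTileCount = 1;
    context->UpdateTileMappings(tiledBuffer.Get(), 1, &coord, &region,
                                tilePool.Get(), 1, &rangeFlags,
                                &poolStartOffset, &rangeTileCount,
                                D3D11_TILE_MAPPING_NO_OVERWRITE);
}
```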
We tested it only on GTX Titan and RTX 2080 Ti platforms with various drivers. In all cases the memory behaviour was the same; only the magnitude of the memory consumption differed.
Any ideas what is happening there?
PS: I've already posted a related issue here: Strange Behaviour of DX11 TiledResources at RTX driver ( RTX 2080 Ti )