Running OpenSceneGraph / OpenGL previews on NVIDIA GTX 1070 with Max-Q Design - GPU memory leak

Hello everyone,

Thank you for your time and attention!

I am a software developer working on an application that includes multiple built-in, real-time OpenSceneGraph / OpenGL previews. One of our customers reported severe problems when using our software on the following system:

  • ASUSTeK GX501VSK Notebook
  • Intel Core i7-7700HQ CPU @ 2.80GHz (8 CPUs), ~2.8GHz
  • NVIDIA GeForce GTX 1070 with Max-Q Design
  • 16 GB RAM
  • Windows 10 Pro 64 bit
  • NVIDIA driver: 390.77
  • Monitor resolution: 1920 x 1080 @ 120 Hz, 120 DPI
  • Color depth: 32 bit

Besides this system, the customer has a similar notebook with an NVIDIA GeForce GTX 1080 that shows the same problematic behavior (please see details below). At the moment, no other computer seems to be affected, neither at our customers' sites nor within our company.

Question 1: Are there any differences relevant to application programming between the NVIDIA GeForce GTX 1070 with Max-Q Design and earlier NVIDIA GeForce cards / NVIDIA cards without these special optimizations?

According to the customer’s report and further tests I did via TeamViewer on his computer, the symptoms of the problem are the following:

  • When running our software with previews enabled, the Dedicated GPU Memory (according to the Windows Task Manager) continuously increases up to 100 %, i.e. 8 GB (a sketch for querying this value from within the application follows below this list).
  • Our software behaves unexpectedly or crashes during operations that are normally unproblematic, even before the memory usage reaches its maximum.
  • In some tests, OpenSceneGraph signalled the warning "detected OpenGL error 'out of memory' at after RenderBin::draw(..)".
  • In the default use case of our software, the memory usage grows at roughly 0.1 GB every two to three seconds.
  • The increase is slower when fewer previews are enabled or the monitor refresh rate is reduced, but the memory still fills up eventually.
  • When all previews are deactivated at run-time, the memory usage stops increasing.
  • When our software is closed, the memory usage immediately jumps back to a minimum value.
  • Apart from the Dedicated GPU Memory, the other parameters in the Windows Task Manager do not indicate any abnormalities.
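To cross-check the Task Manager readings from inside the application, I am thinking about querying the driver's own numbers via the GL_NVX_gpu_memory_info extension. This is just a sketch under the assumption that the extension is exposed by the customer's driver; the enum values are taken from the extension specification and are not part of the standard GL headers:

```cpp
// Sketch: query the driver's view of GPU memory from within a valid GL context
// (e.g. from a camera draw callback). Requires GL_NVX_gpu_memory_info.
#include <osg/GL>     // pulls in the platform GL headers (incl. windows.h on Windows)
#include <iostream>

#ifndef GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX
#define GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX   0x9048
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#define GL_GPU_MEMORY_INFO_EVICTION_COUNT_NVX           0x904A
#endif

void logGpuMemory()
{
    // All values are reported in kibibytes by the extension.
    GLint totalKb = 0, availableKb = 0, evictions = 0;
    glGetIntegerv(GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX, &totalKb);
    glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &availableKb);
    glGetIntegerv(GL_GPU_MEMORY_INFO_EVICTION_COUNT_NVX, &evictions);

    std::cout << "GPU memory: " << (totalKb - availableKb) / 1024 << " MB used of "
              << totalKb / 1024 << " MB, evictions: " << evictions << std::endl;
}
```

Calling this e.g. once per second should show whether the growth really happens in dedicated video memory as seen by the driver, and whether evictions start before the crashes occur.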

I started an OpenSceneGraph example program as well as other graphically demanding applications on the customer’s system, and none of them showed the same issue. Therefore, here is a short overview of how our software realizes the previews:

  • Our software consists of multiple tiers, finally linking against OpenSceneGraph 3.0.1 and Qt 4.8.7.
  • The previews are basically osgQt::GLWidgets driven by osgQt::GraphicsWindowQt. After creation, each GLWidget is embedded in a common QWidget.
  • In our previews, the texture contents are updated continuously (e.g. at 50 FPS). In contrast, the texture sizes and the geometries are changed on demand only, which rarely happens (a simplified sketch of this update follows below the list).
  • Furthermore, we use osgViewer::CompositeViewer in single-threaded mode (a minimal sketch of this viewer/widget setup also follows below the list).
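For context, here is a minimal sketch of the setup pattern, roughly along the lines of the osgviewerQt example; widget sizes and the Qt integration details are simplified placeholders, not our actual code:

```cpp
// Sketch: one preview = one osgQt::GraphicsWindowQt whose GLWidget is embedded
// in a plain QWidget; all views are driven by one single-threaded CompositeViewer.
#include <osgQt/GraphicsWindowQt>
#include <osgViewer/CompositeViewer>
#include <osgViewer/View>
#include <QtGui/QVBoxLayout>
#include <QtGui/QWidget>

QWidget* createPreview(osgViewer::CompositeViewer& viewer, osg::Node* scene)
{
    osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
    traits->width = 640;          // placeholder size
    traits->height = 480;
    traits->doubleBuffer = true;

    osgQt::GraphicsWindowQt* gw = new osgQt::GraphicsWindowQt(traits.get());

    osgViewer::View* view = new osgViewer::View;
    view->getCamera()->setGraphicsContext(gw);
    view->getCamera()->setViewport(new osg::Viewport(0, 0, traits->width, traits->height));
    view->setSceneData(scene);
    viewer.addView(view);

    // Embed the GL widget in a common parent widget.
    QWidget* container = new QWidget;
    QVBoxLayout* layout = new QVBoxLayout(container);
    layout->addWidget(gw->getGLWidget());
    return container;
}

// The viewer itself is configured as
//   viewer.setThreadingModel(osgViewer::ViewerBase::SingleThreaded);
// and viewer.frame() is triggered by a Qt timer at the preview update rate.
```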
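And here, in similarly simplified form, is the per-frame texture update pattern; the helper type and names are illustrative only. The intention is that the osg::Image and osg::Texture2D are created once and only the pixel data is overwritten and dirtied, so the GL texture object should be reused rather than re-created each frame:

```cpp
// Sketch: texture and backing image are created once per preview; each frame
// only the pixel data is overwritten and the image is marked dirty.
#include <osg/Image>
#include <osg/Texture2D>
#include <cstring>

struct PreviewTexture
{
    osg::ref_ptr<osg::Image>     image;
    osg::ref_ptr<osg::Texture2D> texture;

    PreviewTexture(int width, int height)
    {
        image = new osg::Image;
        image->allocateImage(width, height, 1, GL_RGBA, GL_UNSIGNED_BYTE);

        texture = new osg::Texture2D(image.get());
        texture->setResizeNonPowerOfTwoHint(false);
    }

    // Called e.g. 50 times per second with new pixel data of the same size.
    void update(const unsigned char* pixels)
    {
        std::memcpy(image->data(), pixels, image->getTotalSizeInBytes());
        image->dirty();  // schedules a re-upload on the next draw traversal
    }
};
```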

When searching for ‘GTX1070 memory leak’ on Google, many articles pop up suggesting that there really might be a hardware/driver issue (http://hexus.net/tech/news/graphics/98335-nvidia-geforce-gtx-1070-bios-updates-fix-memory-issues/).

Question 2: Am I missing any known issues referring to NVIDIA cards or drivers in this use case?

I have tried changing various 3D settings in the NVIDIA Control Panel, especially G-Sync, but none of them had any meaningful impact.

Question 3: Which settings could also be worth experimenting with?

Any thoughts on the above-mentioned observations, suggestions for further test cases to single out specific factors, or ideally a fix for the problem would be highly appreciated.

Thank you very much!

Kind regards,
Matthias

Hello again,

Has anybody experienced a similar problem?
Do you have any hints to help me go further?

Thank you very much!

Kind regards,
Matthias