Xorg VRAM leak because of Qt/OpenGL Application

Hello board,

I am working on a complex Qt/OpenGL Application.
Xorg starts leaking VRAM while I'm using the application and never releases the memory until I restart X, of course.

$ nvidia-smi
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.48                 Driver Version: 390.48                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 105...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   46C    P8     4W /  N/A |     50MiB /  4040MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0     29628      G   /usr/lib/xorg-server/Xorg                     47MiB |
+-----------------------------------------------------------------------------+
$ ./myOpenGLQtBasedApp ... doing graphic stuff then exiting
$ nvidia-smi
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.48                 Driver Version: 390.48                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 105...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   46C    P8     4W /  N/A |    110MiB /  4040MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0     29628      G   /usr/lib/xorg-server/Xorg                    107MiB |
+-----------------------------------------------------------------------------+

The version of Xorg does not matter; I tested a few.
The version of the driver does not matter, as long as it's nvidia; I tested 340, 384 and 390.
The Linux distribution does not matter; I tested Ubuntu 16.04, Ubuntu 18.04 and Fedora.
The desktop environment does not matter; I tested Unity, GNOME Shell, Xfce, LXDE + Compton, and Openbox + Compton.
The compositor used does not matter, but the leak disappears without a compositor.
I did not test Wayland.

Do you know what could cause this behavior?
Could this be an nvidia driver bug?
If not, what in our code could create this behavior?

The issue was resolved thanks to an intense debugging session.

This was a Qt issue.
The leak was caused by reparenting the parent of the windowContainer containing the QVTKOpenGLWindow to NULL just before deleting it.

This code was already there back when we used a QOpenGLWidget, and it caused no issue then. In any case, NULL-parenting a widget before deletion is useless, so removing the line resolved the issue.
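To illustrate, here is a minimal sketch of the problematic pattern. The variable names are hypothetical (our real widget hierarchy is more involved), and this is a fragment of a larger Qt application, not a standalone program:

```cpp
// 'container' is the widget returned by QWidget::createWindowContainer()
// wrapping the QVTKOpenGLWindow; 'parentWidget' is its parent in our UI.

// Problematic cleanup: reparenting to nullptr right before deletion.
// With a compositor running, this left GPU memory pinned in Xorg
// even after the widget was destroyed.
parentWidget->setParent(nullptr);  // unnecessary, and triggered the leak
delete parentWidget;

// Fix: simply delete the widget. Qt destroys the child hierarchy
// (including the window container and its embedded window) on its own,
// so the explicit reparenting adds nothing.
delete parentWidget;
```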

This leak shouldn't happen even in this situation, though, so I have opened a Qt bug report:
https://bugreports.qt.io/browse/QTBUG-69429