Shared System memory on Linux

Hello Everyone

I know that Windows provides shared system memory for the whole system, so it allows allocating more memory than the dedicated GPU has.

I am wondering: is it possible to do something similar on Linux?
Maybe there is an option for that in the drivers?

My issue is:
I have multiple OpenGL apps running in the background. They don’t render anything, they just consume GPU memory while idle :(. However, I cannot kill them because they provide other services for me (an on-demand off-screen render service).
Any other launched apps get stuck on GPU memory/texture allocations because there is no free GPU memory.

I am trying to find a way to extend video memory without hardware changes.

Do you know any way to reduce or limit memory usage for some of the OpenGL apps?

What applications are you running? Can you share them? Please also provide an nvidia bug report.

Hello Sandip,

Actually, this issue happens with any OpenGL apps. For example:

  • I have a Quadro card with 1024 MB of video memory.
  1. Launch “steam -bigpicture”.
  2. Check nvidia-smi: the steam process uses 250 MB of video memory, and overall usage is 500/1024 MB.
  3. Launch any game.
  4. Check nvidia-smi again: steam’s memory usage is unchanged, so the game has only 500 MB of memory to use, even though steam is now a background app.

If the game tries to allocate more than 500 MB, it fails and gets stuck. That is expected behavior to me, but I am looking for a way to avoid it.

Is there any option on Linux to use system memory as video memory, similar to Windows “shared system memory”?

I don’t think I need to provide a bug report, because this is a feature question/feature request.

Thank you

I believe you may be a little confused as to what Windows “system shared memory” is (there is no such thing by that name, and for a very long time our GPUs have been able to “spill” into system memory when video memory is exhausted, on Windows as well as on Linux).
In the situation you describe, the behavior is expected: just because you’re starting a new application doesn’t mean that other applications will “make room” for it (why would they?). Once the VRAM limit is reached, the driver’s behavior will be a mix of evicting video memory not currently in use and spilling to system memory.
Either way, if the game “fails and gets stuck”, that’s an application bug.