One eternity later, still a problem…
I switched to Linux because more and more open-source AI models are easier to install on Linux, but if we can’t use RAM as shared memory we’ll need A100 GPUs in our PCs very soon…
To this day this problem still exists, and since I own far more RAM than VRAM, it really limits my ability to train and run large models.
I am absolutely disgusted and baffled that this is still an issue and that nothing is being done or said about it. For example, this makes games that run close to or at 8 GB of VRAM (like Squad) completely unplayable. The moment the VRAM hits the limit, the system freezes.
I will never buy another overpriced garbage NVIDIA GPU. You are treating us customers like complete crap.
Meanwhile you keep pushing your limited 8 GB VRAM cards, while at the same time neglecting your growing Linux consumer base. Completely unacceptable.
I mean, is this even an active forum? Is there some other place this issue should be brought up? Because it seems that no one from NVIDIA is seeing this post…
I just posted a workaround here:
https://forums.developer.nvidia.com/t/vram-allocation-issues/239678/57?u=majteam
Good game to all.
As I posted there, this is still only a DXVK-specific solution (although thank you for providing it!)
The problem still exists…
I’m commenting here to make sure that NVIDIA doesn’t see this problem as SOLVED.
This is not working in Squad. Also, I read that this limit is a “guideline only” and the game can just ignore it.
-Squad in DX11 mode.
-dxvk.conf in the binary folder next to squadgame.exe
-The file reads:
dxgi.maxDeviceMemory = 6144
MangoHud confirms nothing changes.
Hard NOT SOLVED
Edit: I also added this to the conf: dxvk.hud = memory
That does give me the HUD in-game, so I can confirm the file is in the correct place and is being read, but the max memory option is simply being ignored.
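For reference, this is roughly the whole dxvk.conf I am testing with (a minimal sketch; the 6144 value is just what I chose for my card):
# dxvk.conf next to squadgame.exe
# report at most 6 GiB of device memory to the game (apps may still ignore this)
dxgi.maxDeviceMemory = 6144
# show the DXVK memory HUD in-game to confirm the file is actually being read
dxvk.hud = memory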
Please reply with something, this thread is 1.5 years old…
It’s not a hard limit, games can still decide not to use it.
Also, I think it is important to note that this issue is not only gaming related; it is also a productivity issue. As someone said here already, this is a problem for their LLMs. For me, I have encountered the “CUDA out of memory” error in Blender, and the only fix was to install Windows alongside Linux just to render the animation there instead of on the CPU in Linux… That is a huge no-no.
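Just to show what this means in practice, here is a minimal sketch (assuming a recent PyTorch; the tensor size is only an illustration) of the manual fallback every application has to implement itself, because the Linux driver will not spill VRAM over into system RAM:
import torch

def alloc_gpu_or_cpu(shape):
    # Without driver-side spill-over, an allocation past the VRAM limit
    # simply raises an error instead of overflowing into system RAM.
    try:
        return torch.empty(shape, device="cuda")
    except torch.cuda.OutOfMemoryError:
        print("CUDA out of memory - falling back to (much slower) system RAM")
        return torch.empty(shape, device="cpu")

# Roughly 12 GiB of float32, more than any 8 GB card can hold
data = alloc_gpu_or_cpu((3, 1024, 1024, 1024))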
Nvidia, please fix.
I made an update to the workaround:
https://forums.developer.nvidia.com/t/vram-allocation-issues/239678/60?u=majteam
Since the “dxvk.hud = memory” option does work when the config is in the game’s exe folder, doesn’t that mean the config file is properly loaded and the dxgi.maxDeviceMemory option is simply being ignored?
Yes, the option seems to be ignored, although I don’t understand why; I don’t know the game Squad.
Unfortunately, some apps may choose to ignore this command. I have no other solution to offer you at the moment.
Edit:
After a brief search, your game seems to be particularly VRAM hungry for textures. There seems to be an UNCAP TEXTURE POOLSIZE option in the game settings. Could this possibly be the cause of your problem?
Of course not; textures are set to low and that setting is not turned on.
Texture use is not a problem in Windows 11, because there the NVIDIA driver handles VRAM overflow, unlike on Linux. That is not a solution; this thread needs someone from NVIDIA to acknowledge this.
This is not only a gaming issue, as I said.
Zero answers EVER from the devs on this subject is crazy.