DirectX 12 performance is terrible on Linux

DirectX 12 games run through Proton/Wine on Linux perform much worse than on Windows. This can’t just be translation overhead, since AMD GPUs perform fine; I think it’s due to a lack of proper driver optimizations.

DXVK and vkd3d developers have acknowledged this, but they can’t do much about it because of the driver’s closed-source nature: they can’t tell how exactly it works behind the scenes.

I don’t know if this is talked about enough, but I don’t see the top threads covering this topic. I think it has high priority.

I hope Nvidia developers at least take a look at this and maybe optimize things if they can. Nvidia GPUs are great on Linux when it comes to computation/CUDA, etc., but for real-time applications they are sadly not performing optimally at the moment.

Update, more information:

  • Driver version is pretty much irrelevant; this applies even today, and the same goes for Proton/Wine/vkd3d/dxvk. I have been using Linux for about 3 years, and it has always been considerably slower for DirectX 12 titles
  • Wayland improves performance compared to x11, which is an interesting behavior that might help pinpoint the issue
  • Hardware-wise, Pascal and older GPUs experience this more severely; Turing and newer are better, but they are still considerably slower than on Windows
  • CPU heavy software like emulators or some games suffer more
  • From what I can tell, this happens regardless of hardware (CPU, RAM, etc), as long as you are using an Nvidia GPU

Here’s a video comparing performance on Linux and Windows on an Nvidia GPU (4080 Super) under x11:

I will update the post later, when I have time to dual-boot, with my own numbers, but my experience matches the video. An example of a game I’m currently playing is the Resident Evil 4 remake. On Windows I get 80+ FPS, while on Linux I get from 40 to 75 depending on where I am, at the same settings.

On x11 it gets down into the 30s, so running Wayland actually gives me a noticeably higher framerate, which is definitely nice, but I know things can be optimized further.
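Since the Wayland/x11 difference keeps coming up in these comparisons, here’s a quick way to confirm which session type you’re actually benchmarking under. This assumes a typical desktop where logind sets `XDG_SESSION_TYPE`; on other setups the variable may be unset.

```shell
# Prints "wayland" or "x11" on a typical logind desktop,
# or "unknown" if the variable is not set (e.g. in a bare TTY).
echo "${XDG_SESSION_TYPE:-unknown}"
```

Worth running before taking numbers, since some login managers silently fall back from Wayland to x11.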

My own hardware:

  • 2060 super
  • 3700x
  • 16gb ram
  • Arch Linux
  • Kde Plasma 6.1.4
7 Likes

Would be really nice to hear something from NVIDIA about it.

Would be really nice if you provided the data to confirm your speculations:

  • GitHub Tickets
  • Affected games
  • FPS comparisons that include graphics settings for both operating systems
  • Versions of NVIDIA drivers used under both OSes
  • Wine and Linux kernel versions
  • Linux distro used

In short, this thread is not actionable.

If you want NVIDIA to care, you need to provide full instructions on how to reproduce the issue.

3 Likes

I updated the post. I apologize for not providing more info initially, but it was more of a discussion than a bug report, since this is a known issue that applies across different setups and has been like this for years now. But I did provide more information either way.

1 Like

It’s widely known that DX12 games run worse in Linux on Nvidia hardware.
DX11 games usually run much better.

From what I’ve seen there can be as much as a 30% performance drop with DX12 titles. Turning on DLSS/FSR can make up for that loss but it’s a “weak” workaround for the actual problem.

1 Like

Green team DX12/vkd3d performance:

Red team DX12/vkd3d performance:

4 Likes

I also want to emphasize that this is not really just a DX12 issue, although it is definitely more noticeable with DX12 than DX11. If I had to guess, it has to do with the increased CPU workload, or with having more API calls.

Emulators (running natively) also exhibit similar behavior. Emulators in general run much faster on Windows than on Linux with an Nvidia GPU. And this time there are no translation layers at all; both run natively using Vulkan.

Here’s one emulator that has this issue manifested in a big way, it’s Ryujinx, a Switch emulator. Here’s a screenshot running The Legend of Zelda Tears of the Kingdom.

FPS on Linux is 60% less.

When asked about the poor performance on Linux with Nvidia, a Ryujinx developer said this might be because of the massive number of draw calls the emulator makes. Other emulators also perform worse on Linux when using an Nvidia GPU, but Ryujinx takes the biggest hit.

There’s another weird behavior, and it’s not just the framerate. Every time I open the game on Linux, for the first ~10 seconds I get a lot of stutter, similar to shader compilation stutter, but it can’t actually be shader compilation, as that should have been taken care of after running around for hours (collectively) in the same area. It seems to me like the CPU is bottlenecked by the GPU somehow, and this is probably related to memory.

I don’t know if this is related to the MTRR/PAT thing or not, but out of curiosity I enabled the variable, and there doesn’t seem to be any difference. At least on Wayland.
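For anyone else who wants to try it: the variable I mean is, as far as I know, the nvidia kernel module’s PAT parameter, set via a modprobe config. A minimal sketch (the file name here is arbitrary, and this assumes the proprietary nvidia driver):

```
# /etc/modprobe.d/nvidia-pat.conf  (file name is arbitrary)
# Ask the nvidia kernel module to use PAT instead of MTRRs for
# write-combined mappings. Takes effect after regenerating the
# initramfs and rebooting.
options nvidia NVreg_UsePageAttributeTable=1
```

You can check whether it took effect after reboot by reading the module parameter under /sys/module/nvidia/parameters/.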

2 Likes

Mmm, I have fiddled with many settings, but there’s no magic switch at the software/user level.
However, disabling 4G Decoding in the BIOS can increase performance in some apps/games while also decreasing it in others, which leads me to believe that it’s Nvidia’s memory handling that’s causing it.

1 Like

Another bench with around an 18% performance loss compared to Windows, while AMD cards have similar performance.

4 Likes

The video was very interesting, especially regarding ray tracing, which I never tested myself given my GPU is too weak for it. So, to summarize the video, which covers DX12, here are the findings:

  • When RTX OFF
    • Nvidia FPS loss on Linux compared to Windows on average is about 18%.
    • AMD FPS loss on Linux compared to Windows is mostly non-existent.
  • When RTX ON
    • Both AMD and Nvidia suffer a massive loss in performance on Linux compared to Windows, about 50%.
3 Likes

It’s not 50%, but it’s still too much. With Ada cards it’s around 30% for me. But that seems to be the general loss overall and not specific to RT.
Guardians of the Galaxy
Windows


Linux

Ignore the driver versions tho’, they are not correct.
VRAM usage in this game is almost doubled in Linux too, I don’t know why.

4 Likes

Could you test x11 as well?

fwiw, here is some anecdotal evidence

I play Diablo 4 on Ubuntu. Diablo 4 runs through vkd3d. I recently upgraded my video card from an AMD 6950 XT to a 4080 Super.

The game ran pretty flawlessly with the 6950 XT. When I started using the 4080 Super I experienced frequent problems, usually when Diablo 4’s GPU memory use hit 16GB (FPS drops and stuttering). Long story short, I ended up setting a 14GB limit in dxvk.conf (vkd3d honors this limit) and all my problems went away. It seems like there is some sort of problem freeing memory with the nvidia drivers (550.107.02) and Diablo 4. I had no such problems with the 6950 XT, and the GPU is the only thing that changed.

This could be an issue specific to diablo and not vkd3d though.

Not much difference on X11, really; Wayland is pretty much on par with X11 with the 560 driver.

Yea, unfortunately D4 does not run very well, but nice that you found a workaround. So you set dxgi.maxDeviceMemory = 14336 then?

actually

dxgi.maxDeviceMemory = 14336
dxgi.maxSharedMemory = 14336

I got the idea from another post on the web somewhere

Are you sure those work for vkd3d? As far as I know, these are dxvk variables.

yes, they are still honored with vkd3d (I set it very low to be sure, like 5GB, and GPU memory usage stayed there). I am not familiar with the implementation details, but it was explained why it works in the other post. If I can find it again I will post it here. Here is a snippet from the vkd3d github page:

“vkd3d-proton does not supply the necessary DXGI components on its own. Instead, DXVK (2.1+) and vkd3d-proton share a DXGI implementation.”
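For anyone who wants to reproduce the workaround: as far as I know, DXVK reads dxvk.conf from the game’s working directory, or from a path given in the DXVK_CONFIG_FILE environment variable, so something like this should apply the same cap (14336 MiB = 14 GiB; adjust for your card):

```
# dxvk.conf, placed next to the game executable or pointed to via
# the DXVK_CONFIG_FILE environment variable.
# Caps the VRAM and shared memory the game is allowed to see at 14 GiB.
dxgi.maxDeviceMemory = 14336
dxgi.maxSharedMemory = 14336
```

You can then watch actual allocation with nvidia-smi while the game runs to confirm the limit is being honored.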

1 Like

fyi
VRAM Allocation Issues

1 Like

Btw, I did a test with X11, and the FPS is better there. Maybe +5 FPS, give or take… but I doubt you’ll notice it while gaming, really.

1 Like