The situation on KDE/Kwin/Plasma performance

Okay, so it seems the situation has evolved in the meantime. I would like to express my warmest thanks to the Nvidia team, as this was a major fix.

One long-standing and very annoying bug was apparently fixed, see: NVIDIA pushed out two new Linux drivers recently with 396.45 and 390.77 | GamingOnLinux

“Fixed a bug that caused kwin OpenGL compositing to crash when launching certain OpenGL applications.”

I have yet to test it in the long run, but it seems the most annoying issue occurring with NVIDIA drivers & Kwin was fixed, that is to say: the panel / desktop randomly freezing when an app interrupted and resumed kwin compositing (they would still react to mouse clicks but wouldn’t update). It would be great if the NVIDIA team could get in touch with the kwin team (whose maintainer stepped down in the meantime) to tell them what was triggering it…

I still have the following problems, with either no solution or requiring a work-around:

  • sub-optimal framerate on the desktop (occurs under GNOME too, but doesn’t occur with the open-source drivers: Nouveau, AMDGPU, Intel…).
  • a work-around is needed to get a tear-free experience (either triple buffering, __GL_YIELD, or “force composition pipeline”). Not required with the open-source drivers either; also partly necessary under GNOME.

(also, applying “force composition pipeline” slows down KDE startup a lot, but maybe the problem is on KDE’s side this time!)
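For reference, the environment-variable workarounds mentioned above are usually exported somewhere that is sourced before the session starts. A minimal sketch (values shown are the commonly used ones; pick one approach rather than stacking them blindly):

```shell
# Common tear-free workarounds for KWin on the NVIDIA driver (a sketch):
export KWIN_TRIPLE_BUFFER=1   # tell KWin the driver is triple-buffering
export __GL_YIELD="USLEEP"    # make the driver sleep instead of busy-waiting
# "Force (Full) Composition Pipeline" is configured via nvidia-settings or
# xorg.conf instead, not via an environment variable.
```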

Cheers ! Thanks for caring for KDE users !

Actually, it does not seem so:
https://bugs.kde.org/show_bug.cgi?id=353983#c142
Is it still “fixed” for you?
If the answer is yes, maybe you could write your findings to the readers of that bug report.

@kokoko3k :

Well, there are improvements apparently.

Yes, the panel still freezes once in a while when compositing is interrupted (generally when a game is started).

That’s a little annoying if you alt-tab to the desktop. But at least the panel is restored when the game exits.

In my experience, the big problem in the past was that the panel would sometimes remain in this frozen state after the game had exited (I had to restart plasmashell, which is not acceptable for normal users – yep, I did install Neon on some family PCs :)

So it’s half fixed…

OFF TOPIC : meanwhile, I gave GNOME another try (Manjaro & Ubuntu). I was surprised to realize :

  1. Gnome Shell would eat tons of CPU when just scrolling in Firefox. Animations are quite jerky (worse than in kwin, actually!). Even with a non-Nvidia card they are not constantly smooth, contrary to kwin (but CPU usage is normal with open-source drivers). It’s apparently being optimized at the moment.
  2. you also need the tearing workaround under GNOME (to a lesser extent, but still).

So there are clearly issues on both major desktops with the NVIDIA drivers, which is sad to see as gaming performance is stellar and those cards are great (power efficiency). I really wonder why this situation remains after all those years, that’s strange.

Edit : double post

Hi !

Just checking whether some people here have had a better experience in the meantime?

On the Arch wiki, there’s a slightly different solution
https://wiki.archlinux.org/index.php/NVIDIA/Troubleshooting#1._GL_threads
(applying USLEEP to kwin only).
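That approach amounts to wrapping only the kwin process, so the variable doesn’t leak into games. A minimal sketch (the wrapper function name is mine, not from the wiki):

```shell
# Sketch: apply __GL_YIELD=USLEEP to a single process instead of globally.
run_with_usleep() {
    env __GL_YIELD=USLEEP "$@"
}

# In practice you would wrap kwin, e.g.:
#   run_with_usleep kwin_x11 --replace &
# Demonstration with a trivial child process instead:
run_with_usleep sh -c 'echo "child sees __GL_YIELD=$__GL_YIELD"'
```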

I made several other attempts with different DEs in the meantime (with my GTX 1060 and driver versions ranging from 390.x to 415.x):

  • I had a very disappointing experience with the latest Gnome 3.30 (a lot of stuttering all the time on the desktop – especially terrible under Ubuntu 18.10 when used along with FF with hardware acceleration: 50% of the time, FF scrolling is randomly very jerky on the very same pages. Some kind of sync issue, I guess. I double-checked under Manjaro – not great either.)

  • I had a surprisingly GOOD experience with Cinnamon 3.8. The desktop feels constantly smooth. Even FF seems to suffer less than with the other DEs (in which, for instance, the “new tab” animation is not perfectly smooth).

  • Regardless of the DE used, some apps are always slower : i.e. scrolling in FF or in the Steam client in thumbnails mode.

  • With all DEs EXCEPT CINNAMON, I get a significantly smoother desktop when using OSS drivers with Intel / AMD GPUs. (Well, some Cinnamon apps are intrinsically slower than KDE ones, but at least window animations / Muffin are butter smooth.)

AND UNDER KDE

  • Well, the framerate very frequently drops below 60 when moving windows & triggering effects. That’s probably OK for most people – but it’s not normal.
  • As soon as there is significant CPU usage, apps / animations seem to stall. If you’re playing a video and start a big app, the picture will freeze a little – which never happens with OSS drivers.
  • Just opening Dolphin and browsing feels a little sluggish. That’s not placebo, but maybe a little subtle. Things that feel really instant with other GPUs are not when using the NVIDIA driver. As I said before, it’s OK but frustrating when you have a great GPU.

→ when using Intel / AMD GPUs with OSS drivers, I get a constant 60 FPS framerate REGARDLESS OF the CPU usage. I never get a single app to “stall” under high CPU usage. Every app feels more responsive. ←

  • Also, I noticed that when using “force composition pipeline” along with disabling VSYNC (in KWIN and in some games), I frequently get no tearing and, it seems, less input lag. That’s especially noticeable in some games, like Hollow Knight. So it seems to be a good solution, although I’m not sure what is optimal.

So, well. It’s definitely usable. But it definitely makes you feel like you have a slower machine. I heard NVIDIA sent some cards to some KWin devs. I hope the situation will change eventually! :-)

I can confirm very slow Plasma responsiveness and jittery cursor with a GeForce GT 710 on a very fast machine, with both the recent short- and long-term Nvidia drivers. It doesn’t matter what the resolution is or what the refresh rate is set at, the Application Launcher menu, for example, takes 1-2 seconds to pop up, and I have all desktop effects off!

Firefox, however, renders fast, but menus etc. pop up just as slowly.

And Plasma with the VESA driver is fast.

It’s some combination of Plasma with the Nvidia drivers that seems to be causing the slowdown.

The tricks mentioned in this thread don’t help.

I’m using KDE Plasma 5.14.4.

This topic started 3 years ago, and this is still a relevant problem with proprietary drivers today.

Using RTX 2080 SUPER on a system with 32GB RAM, and a i9-9900K. Drivers are 460.xx

Opening the K-Menu, receiving notifications, opening programs, and several other actions cause serious performance hits. For me, this is most noticeable when playing games, such as “Shadow of the Tomb Raider”, or watching videos on YouTube.

When there’s constant activity on Discord, resulting in repeated notifications, it makes gaming pretty much impossible, which absolutely should not be happening on a very high end gaming PC.

This system is screamingly fast, and responsive in Windows 10 with the latest drivers, but the performance in Kubuntu 20.04 today is worse than it was, back when I had a GT 610, and a dual-core CPU, when using proprietary drivers.

I was using 'Force composition pipeline" to get around this problem, but that’s not working anymore.

Kubuntu 20.04
KDE Plasma: 5.18.5
KDE Frameworks: 5.68.0
Qt: 5.12.8
Kernel: 5.4.0-65-generic


Hey! Just a heads up, the beta for kde right now has a rewritten kwin compositing pipeline that for me at least is way better in stability. I would highly recommend checking it out if you are able!

With KWin 5.20 on Gentoo, I used the following variables in /etc/profile.d/kwin.sh:

export KWIN_COMPOSE="O2"
export KWIN_OPENGL_INTERFACE="egl"
export KWIN_TRIPLE_BUFFER=1

Those settings basically force KWin to use EGL as the compositing backend. Unfortunately blur effects become unavailable, but the experience was way smoother. With KWin 5.21 I removed the first two variables; the experience is way better than before and I have blur effects again.

I am using low latency desktop effects with OpenGL 3.1 and “Smooth” effects.

NOTE: triple buffering also has to be enabled in the Xorg configuration at the driver level, otherwise it won’t work.
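For reference, a minimal sketch of what that driver-level setting looks like as an xorg.conf fragment (the identifier is a placeholder; see the NVIDIA driver README for the exact option semantics on your version):

```
Section "Screen"
    Identifier "Screen0"
    Option     "TripleBuffer" "True"
EndSection
```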

Even with @axelgenus’s settings, I get massive tearing across multiple displays. The last compositor to get it right was “kwin-lowlatency”, but at the cost that you couldn’t actually game without Xorg becoming unresponsive and dropping inputs.

I think there’s something fundamentally wrong with how the Nvidia driver does vsync, considering AMD and Intel do not have this problem – especially given @jasoncollege24’s description of notifications dropping the entire desktop framerate to a quarter or less, making the whole system unusable from a tiny pop-up. When using an Nvidia GPU as an HTPC, all background applications must be disabled, otherwise a single pop-up can make your foreground video drop most of its frames during the pop-up animation. It’s almost a joke, but could notification pop-ups be a denial-of-service vector for Nvidia GPU owners on Linux?

The only configuration I could muster with Nvidia graphics and smooth KDE usage is Wayland. However, that configuration is barely usable due to graphical glitches. So unless someone has found the magic configuration, we have either KDE + Wayland, an unusable desktop but with buttery-smooth scrolling and animations, or KDE + Xorg, with mid-screen tearing and inconsistent frame timings but a technically functional desktop. Both options are horrible.

Screen tearing with multiple displays is usually due to an X.org limitation: are you using displays with different refresh rates? Even the smallest difference between refresh rates causes tearing. The nVIDIA X Server Settings utility has an option to specify the monitor to synchronize to (X Screen # > X Server Video Settings > Sync to this display device). You should select the monitor with the highest refresh rate.

Triple buffering and V-sync should also help, but I found that KWin 5.21 has a low-latency mode which greatly improves the final result (I am not talking about the kwin-lowlatency patchset; KWin’s compositor scheduler has been rewritten and a latency option is available in the compositor’s settings).

The alternative is to enable Wayland, but nVIDIA drivers do not provide GLX acceleration for Xwayland. Basically that means no gaming at all, but the desktop should perform better because Wayland supports a different refresh rate for each monitor.

Yes, unfortunately I have two displays running at different refresh rates: a 1920x1200 @ 59.95 Hz and a 1920x1080 @ 60.00 Hz. I configure the 60 Hz display as the device to sync to. Maybe there’s a way to overclock the slower display to run at exactly 60 Hz, but that’ll probably break the hot-pluggability of my current setup.

And regarding tearing, it happens about 50% of the time, and once it happens it’s super difficult to get rid of. Usually cycling CTRL+ALT+F1/F2 and testing can remove tearing from individual windows. But then there’s a second level of tearing within software-accelerated windows themselves, such as Firefox. I use vsynctester.com to verify: my work laptops using Intel + Thunderbolt can maintain a steady frame rate without tearing, while my desktop using a 980 Ti cannot.

So you’re probably right: the reason not everyone has this issue is that they may run just a single monitor, or two monitors with an identical refresh rate. I’ll try overdriving my slower display and report back if that fixes it. But just note, it’s bizarre that nvidia’s proprietary driver is the only modern Linux graphics driver that has this issue, and that it has persisted for so long.

Just check with xrandr whether your slower monitor can already run at the same frequency as the faster one, or the other way around. I can feel your pain, since my workstation also has two monitors attached:

Screen 0: minimum 8 x 8, current 4479 x 1080, maximum 32767 x 32767
HDMI-0 disconnected (normal left inverted right x axis y axis)
DP-0 disconnected (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)
DP-2 connected primary 2560x1080+0+0 (normal left inverted right x axis y axis) 673mm x 284mm
   2560x1080     60.00*+
   1920x1080     60.00    59.94    50.00    23.98
   1680x1050     59.95
   1280x1024     75.02    60.02
   1280x800      59.81
   1280x720      60.00    59.94    50.00
   1152x864      75.00
   1024x768      75.03    60.00
   800x600       75.00    60.32
   720x576       50.00
   720x480       59.94
   640x480       75.00    59.94    59.93
DP-3 disconnected (normal left inverted right x axis y axis)
DP-4 connected 1920x1080+2559+0 (normal left inverted right x axis y axis) 527mm x 296mm
   1920x1080     60.00*+
   1600x900      60.00
   1280x1024     75.02    60.02
   1152x864      75.00
   1024x768      75.03    60.00
   800x600       75.00    60.32
   640x480       75.00    59.94
DP-5 disconnected (normal left inverted right x axis y axis)

As you can see, I am lucky enough to have two monitors with the same refresh rate. Even so, the clocks are never perfectly synchronized, so I’ve also experienced some tearing and performance degradation in the past. With nvidia-drivers 460.39 and KWin 5.21 most of my issues have been solved. Hope the best for you too. ;)

@axelgenus looks like I was able to get xrandr to give me 60 Hz on my display! There was one quirk that was a pleasant discovery: you can allow custom modes through an xorg.conf option:

Option "ModeValidation" "AllowNonEdidModes"
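For context, a minimal sketch of where that option lives in an xorg.conf fragment (the identifier is a placeholder):

```
Section "Screen"
    Identifier "Screen0"
    Option     "ModeValidation" "AllowNonEdidModes"
EndSection
```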

With this, I wrote a quick script to find my display and force the 60 Hz mode. It seems to be making a difference. Here’s the script, maybe over-engineered:

#!/bin/bash
# Force the 1920x1200 display to a 60 Hz mode generated with gtf.

XRANDR_MODE="1920x1200_60.00"
# Find the output name of the 1920x1200 panel:
DISPLAY_PORT="$(xrandr | grep -P 'connected.*1920x1200' | grep -Po '^\S+')"
# Generate the mode timings (everything after the mode name in the Modeline):
DISPLAY_MODE="$(
    gtf 1920 1200 60 | grep Modeline |\
    sed -r 's/\s*Modeline "[^"]+"\s+//'
)"

if [ -n "$DISPLAY_PORT" ]; then
    # $DISPLAY_MODE is deliberately unquoted so the timings split into arguments
    xrandr --newmode "$XRANDR_MODE"  $DISPLAY_MODE || true
    xrandr --addmode "$DISPLAY_PORT" "$XRANDR_MODE" || true
    xrandr --output "$DISPLAY_PORT" --mode "$XRANDR_MODE"
fi

If the xrandr mode already exists or is already attached to the display port, the script spews errors but is otherwise benign. So far so good: it feels like an improvement and I’m not getting mid-screen tearing. I’ll need to continue testing to see if I just got lucky (which happens), or if my tearing is fixed long-term.

So, I think I figured out the right set of parameters to make kwin_x11 run properly with nvidia. These take the form of environment variables that need to be loaded early, or at least when replacing kwin.

$ cat ~/.config/plasma-workspace/env/kwin.sh 
export KWIN_TRIPLE_BUFFER=1
export __GL_YIELD=USLEEP
export __GL_MaxFramesAllowed=1

KWIN_TRIPLE_BUFFER=1 is required to tell KDE that we’re using triple buffering
__GL_YIELD=USLEEP prevents kwin from spinning too fast and consuming too much CPU.
__GL_MaxFramesAllowed=1 prevents actual tearing in kwin and seems to fix a lot of micro stutter

It’s strange that kwin still can’t tell whether nvidia is configured for triple buffering or not. Adding KWIN_TRIPLE_BUFFER=1 (or even 0 if triple buffering is disabled) is required, since kwin almost always picks the wrong value with nvidia.

For the system as a whole, setting __GL_YIELD=USLEEP globally does help with responsiveness across the board, but I’m sure it reduces FPS in nearly all games. This is something you’ll definitely want to set per app or game to get the behavior/performance you’re looking for.
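Per-app overrides like that can be done at launch time with env. A sketch (the game binary names in the comments are placeholders; for Steam games, the equivalent is putting the variable in the game’s launch options before %command%):

```shell
# Sketch: per-process overrides of __GL_YIELD.
# Unset the global USLEEP for one process only:
#   env -u __GL_YIELD ./some-game
# Or force a specific value for one process only:
#   env __GL_YIELD=NOTHING ./some-game

# Demonstration with trivial child processes:
env -u __GL_YIELD sh -c 'echo "${__GL_YIELD:-unset}"'
env __GL_YIELD=NOTHING sh -c 'echo "$__GL_YIELD"'
```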

As for __GL_MaxFramesAllowed, I get the behavior I’m looking for by applying it only to kwin. Leaving it at the default, or setting it to “3” for my entire desktop while kwin is set to “1”, seems to fix micro-stutter, tearing, and other weirdness that I don’t get with Mesa-based acceleration.

Hope this helps someone, this took a lot of experimentation to get right. And as usual, this may not apply to everyone but it does help with my configuration.


Hey!

I generally avoid necro-posting, but @steven-liquorix – thank you very much! This actually solved the stuttering issue I’ve been having for the last 2 years (RTX 3090 @ Arch Linux/KDE Plasma).

Cheers!