High CPU usage from Xorg when an external monitor is plugged in

I can’t believe this is still an issue. I understand Linux isn’t a huge priority, but this is just insulting. I honestly regret buying a laptop with an Nvidia GPU, and I can’t wait for the new models to come out next March so I can buy an all-AMD laptop; I don’t even care if it turns out to be slower. No wonder Linus said Nvidia was the worst company they’d ever dealt with.

3 Likes

I actually just tried the latest driver (I recently reinstalled Pop!_OS, so I was on 515) and it broke my desktop. I just switched to integrated-only (luckily my laptop’s HDMI port goes to the iGPU) and I think I’m going to leave it that way. Dealing with this is just not worth it.

Yeah, my Manjaro updated the driver a couple of days ago to 525.60.11. Not only did it not resolve the CPU usage problem, but the external-display-only configuration stopped working and the laptop became super sluggish, totally unusable. Now I have to keep my built-in laptop screen on for the sole purpose of working around bugs in the driver. And the CPU usage by Xorg is still 30%.

All development and most of the testing in the semiconductor industry and chip design is done on Linux, actually; I assume that is true for NVIDIA too. Testing on the manufacturing floor using Windows is hilarious to imagine, because you need a CLI-only environment and special drivers to talk intimately to the hardware. The big FPGA emulators used for design verification run on Linux as well (Inside Nvidia's Emulation Lab | Computer Graphics World). As you can see, NVIDIA uses emulators from Cadence, and you can conclude that they run Linux from job openings at Cadence Design Systems, Inc. Linux is therefore of utmost importance to NVIDIA’s existence as a chip designer. Considering Linux is involved from chip design all the way to the manufacturing floor, it’s sad that NVIDIA managed to screw up its Linux drivers.

this is so sad…

That is my biggest regret about dealing with Nvidia: I regret buying an Optimus laptop that I intended to use with Linux as my main operating system. So, like I said earlier, I plan to switch to a Mac environment to spice up my development space in terms of GUI; it’s Unix-like, so it acts like both Linux and Windows at the same time.

Sorry, but I can no longer wait for long periods of time. I bought this laptop two years ago; I’m depressed because fixing this isn’t really worth the time, and it’s still taking longer than I expected.

I actually had this problem many years ago with Nvidia on a desktop, and the problems are still there in the latest Nvidia drivers today, like screen tearing and refresh-rate issues. This is probably caused by Xorg or something Nvidia shouldn’t even be involved in, but I’m disappointed 😥 I’ll probably stick with Windows again; dealing with Nvidia on Linux is so frustrating and not worth the time ⌚ I can still use my PC fully with Linux, since it doesn’t have an Nvidia graphics card, so it should be fine; I’ll just remote in through SSH to get my job done.

NB: I can no longer wait; I was so disappointed. I found another Nvidia-on-Linux bug: when switching to Nvidia performance mode, I noticed it consumes about 22 W just moving the mouse around, roughly 10-25 W without rendering anything heavy. On Windows it’s just under 10 W, and about 15-20 W when watching a video in a browser, so Linux is noticeably different. That’s why the fan is so loud when booted into Linux, even for lightweight tasks.

Apologies for the delayed response. As communicated earlier, we were able to root-cause the issue but couldn’t integrate the fix into the last release.
The fix will be incorporated in a future release.

2 Likes

You need to make sure the Nvidia configuration file exists in the /etc/X11/xorg.conf.d/ folder, so copy the file from /usr/share/X11/xorg.conf.d/ to /etc/X11/xorg.conf.d/:

sudo cp -p /usr/share/X11/xorg.conf.d/10-nvidia.conf /etc/X11/xorg.conf.d/nvidia.conf

sudo nano /etc/X11/xorg.conf.d/nvidia.conf

Add the following line: Option "PrimaryGPU" "yes"

The file should look like this:
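(The original screenshot isn’t reproduced here. As a rough sketch, assuming the stock 10-nvidia.conf that ships with the driver on Debian/Ubuntu-based systems, the edited file would end up looking roughly like this; the ModulePath and other options may differ on your distro:)

Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Option "PrimaryGPU" "yes"    # line added per the step above
    ModulePath "/usr/lib/x86_64-linux-gnu/nvidia/xorg"
EndSection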

I wrote here: Xorg + nvidia driver / debian / ubuntu / mint CPU high usage when connecting second monitor (Solved) - Marendasoft

It’s good that there are some possible workarounds, but this defeats PRIME, meaning you are forced to use the dGPU for everything all the time, just because you want to plug in an additional monitor. That’s not how this is supposed to work.

3 Likes

It’s been almost TWO YEARS since this problem was first reported. Sad, sad state of affairs.

3 Likes

I have tested the latest driver (525.78.01, production branch release),
and I’m pleased to return here with pleasant news!
At least for me, the high CPU usage is finally resolved. Hooray! (For reference, see above for my basic setup.)

It took a while, but I’m grateful nonetheless.
So thanks, Nvidia team. And thank you to everyone on this thread for their efforts to make everyone’s lives a little easier. Goodbye for now, and perhaps we’ll meet again in another thread.

How did you manage that? Hmm, using the Nvidia GPU all the time is not a fix; it gives me other problems, such as high power usage on the Nvidia GPU at idle (it consumes about 24-30 W all the time), plus glitching and screen tearing when scrolling things on the web and when playing video, whether offline through a video player or streaming in the browser 🧐

I’ve downloaded the latest driver from NVIDIA, compiled and packaged it, and then, after installation, regenerated the initramfs image.
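(A note for anyone following along: the exact way to regenerate the initramfs depends on the distro; these are common variants, not necessarily the ones used here:)

# Arch/Manjaro: rebuild the initramfs for all installed kernels
sudo mkinitcpio -P

# Debian/Ubuntu
sudo update-initramfs -u

# Fedora
sudo dracut --force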
I am using a hybrid setup; here’s my X config:

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "screenmix"
    Option "AllowNVIDIAGPUScreens" "True"
EndSection

Section "Device"
    Identifier  "amdgpu"
    Driver      "amdgpu"
    BusID       "PCI:5:0:0"  
    Option	"VariableRefresh" "true"
EndSection

Section "Device"
    Identifier  "nvidia"
    Driver      "nvidia"
    BusID       "PCI:1:0:0" 
EndSection

Section "Screen"
    Identifier "screenmix"
    Device "amdgpu"
    GPUDevice "nvidia"
EndSection

And the output of xrandr --listproviders:
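(The screenshot of the output isn’t reproduced here. Purely as an illustration of the format, on a hybrid AMD + NVIDIA setup it generally looks something like the lines below; the IDs, capability flags and provider names will of course differ per system:)

Providers: number : 2
Provider 0: id: 0x56 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 5 associated providers: 1 name:Unknown AMD Radeon GPU @ pci:0000:05:00.0
Provider 1: id: 0x1f7 cap: 0x2, Sink Output crtcs: 4 outputs: 1 associated providers: 1 name:NVIDIA-G0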

As for GPU power consumption on my system, this is with a YouTube video playing on the external screen:

It only consumes about 6-9 W, at least for now and on my system.
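(The power figures here came from screenshots that aren’t reproduced; for anyone who prefers checking from a terminal, the NVIDIA side can be polled with nvidia-smi, assuming a reasonably recent driver:)

# print dGPU power draw and utilisation once per second
nvidia-smi --query-gpu=power.draw,utilization.gpu --format=csv -l 1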
Here are some more pictures:

As for the glitching/tearing when scrolling or watching a video: that doesn’t seem to happen to me.
I’m not sure whether it happened before, simply because of the CPU problem; I didn’t like my expensive laptop cooking itself for nothing, so I never tested much further until this problem was resolved and I could finally use the products I bought so long ago together. Now that I can, here are my results.

If you or anyone else would like to know more information. Feel free to ask.

1 Like

So, the release notes on this forum are out of date… and the link to the “latest” graphics driver actually points to an older version. If you search manually on the download site, you get the (newer) version the poster above me has:

And, it states this:

“Fixed excess CPU usage in hybrid graphics configurations where an external display is connected to an NVIDIA discrete GPU and configured as a PRIME Display Offload sink (also known as “Reverse Prime”).”

So, it would seem team green finally released a fix! And, it hit the Arch repo today (I queried pacman yesterday and it was still an older version, but today it’s this one!).
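(For anyone on Arch wanting to check the same thing, something along these lines shows the repo version versus the installed one; the package may be nvidia, nvidia-dkms or similar depending on your kernel:)

# version available in the repos
pacman -Si nvidia | grep Version
# version currently installed
pacman -Qi nvidia | grep Version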

The driver version 525.78.01 release notes suggest that a fix was made; there are two items worth reporting:

a) on my Dell 7610 with Nvidia 3060, Fedora 37, NVIDIA drivers via rpmfusion, KDE 5.26.4, with an LG 4K screen connected and displayed “to the left” of the laptop, the excessive CPU load - i.e. the performance issue - is gone; this also applies to Ubuntu 20.04 LTS with Mutter.

b) during testing I noticed serious functional misbehaviour, though, when running glxgears; steps:

  • boot box
  • log into an X11 desktop environment (Fedora 37 + KDE; Ubuntu 20.04 LTS + Mutter)
  • start glxgears (without PRIME offload)
  • make sure that glxgears runs on the screen connected to the Nvidia GPU
    … and wait for 60 seconds to repeatedly observe, eventually, one of the following two broken behaviours
  • screen connected to the NVIDIA output turns black (but recovers within a second)
  • some garbage rectangles (pink pixel garbage) show up, and get cleared again in what appears to be random sizes and random locations, but only on the screen attached to the NVIDIA GPU

This faulty behaviour also occurs with __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears (and all other things being equal)
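(To double-check which GPU is actually doing the rendering in each case, glxinfo from mesa-utils can be used, roughly like this:)

# default: rendered by the GPU providing GLX for the X screen (usually the iGPU)
glxinfo | grep "OpenGL renderer"
# forced onto the NVIDIA GPU via PRIME render offload
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"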

This faulty behaviour does not occur when the output screen is the laptop’s 3072x1920 (3K!) Intel GPU driven built-in screen.

This misbehaviour is not limited to glxgears; it also applies to, for instance, Firefox or Visual Studio Code when those application windows are moved around with the mouse; once the corruption is present, moving the mouse will nicely animate the pixel garbage.

The photo below shows the pixel garbage - it appears in totally random locations, nowhere close to where I would expect repaint damage. The garbage cannot be captured with a screenshot tool, e.g. KDE’s Spectacle; everything looks fine there, hence the photo:

And, bonus misbehaviour: I think while glxgears was running in the background on the NVIDIA GPU screen, and while I was doing some wiggling on that screen, I got something to totally lock up, because the content on the NVIDIA GPU screen was totally frozen. Things did come back after I disabled the screen via KDE Display Configuration and re-enabled it. But this smelled a lot like a deadlock (somewhere).

I see exactly nothing of that reflected in logs as WARN or ERROR (and I really would expect at least the screen going all black to emit some kind of diagnostic). I expect that

Jan 07 14:18:13 fedora.home kernel: nvidia-modeset: ERROR: GPU:0: Idling display engine timed out: 0x0000c67e:0:0:1128
Jan 07 14:18:15 fedora.home kernel: nvidia-modeset: ERROR: GPU:0: Idling display engine timed out: 0x0000c67e:0:0:1128

is the result of my “fixing” the deadlock, not something that was emitted by the running driver.
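(For reference, one way to scan for this kind of message during such a test, assuming a systemd-based distro, is something like:)

# kernel messages from the current boot, warning level and above
journalctl -k -b 0 -p warning | grep -iE "nvidia|drm"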

1 Like

The problem (the high CPU usage) is greatly improved, but CPU usage is still higher with 1 or 2 external monitors enabled than with none (6-8% vs. 2-3%), even with no DE apps running.

Also, the ordering/assignment of displays is all messed up. If I set my laptop display as primary, the taskbar gets moved to the 3rd monitor. If I force an app to run on primary, it runs on the 2nd monitor. When I disable both displays, go to sleep, and wake up, the 2nd monitor re-enables itself (likely due to hotplug events in the driver detecting a “new display” during wake).
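(In case it helps anyone debugging the same thing, the primary flag can also be set outside the DE with xrandr; the output name below is only an example and will differ per machine:)

# list outputs and see which one is currently marked primary
xrandr --query
# mark the internal panel as primary (check the real name with --query first)
xrandr --output eDP-1 --primary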

So…better than not being able to use it at all, but buggy AF. I’m going to try your test and see what happens.

I have a 5800h (AMD) + 3070 laptop, with mux. I only tested in hybrid mode (internal display = AMD, external displays = nV). I ran multiple glxgears, including with no frame rate cap, and several concurrently, and I don’t see any corruption. Interestingly, the instance running on the AMD driver, even with “reverse prime” (AMD driver, displayed on an nV display), maintains >15k fps…the nV instance hovers around 4-5k fps (on an nV monitor), and if I drag it to the AMD display it drops to 3-4k fps.
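(For anyone repeating the comparison: the frame-rate cap is usually lifted with vblank_mode=0 for the Mesa/AMD instance and __GL_SYNC_TO_VBLANK=0 for the NVIDIA one, roughly like this:)

# AMD/Mesa instance with vsync disabled
vblank_mode=0 glxgears
# NVIDIA instance via PRIME render offload, vsync disabled
__GL_SYNC_TO_VBLANK=0 __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears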

Performance mode is “auto” but it pegs itself to the max clocks.

I think the issues you’re having likely have something to do with the Intel hybrid config.

1 Like

I’ve downloaded the latest driver 525.78.01 but no luck, still high usage 25-30%.
nvidia-bug-report.log.gz (431.0 KB)

For me, after updating to 525.78.01 the CPU usage dropped to 3%-8%. That’s good, but still not acceptable. When using only the dGPU or only the iGPU, the CPU usage is around 0.3%. I hope this issue will be fixed too.

Also, I’ve noticed a strange little issue. When I’m using an external display with a reverse PRIME setup and playing games on the external display (no matter whether via the dGPU or the iGPU), the picture during camera movements is not as smooth as when using only one GPU without reverse PRIME (dGPU or iGPU, external monitor or laptop screen, it doesn’t matter) - then the picture moves more smoothly. I’ve noticed this in native Linux games such as SuperTuxKart, Yamagi Quake 2 and 0AD. Vsync configuration doesn’t affect this issue at all.

Finally, everything is working flawlessly on the Ubuntu 23.04 daily build using Wayland and the latest Nvidia beta drivers. The problems I’d had so far are gone in these versions: the external monitor doesn’t tear anymore, and Nvidia power usage is really low at idle - it behaves like Windows, sitting at around 5 W with no activity and staying under 10 W when the cursor is moving around with an external monitor connected, which is a great improvement so far.
I forgot to mention: on Wayland I can now set fractional scaling individually for each monitor. That is so cool, I love it.

1 Like