High CPU usage in Xorg when an external monitor is plugged in

I have tested the new drivers (525.53) and can confirm: the problem is still there.

3 Likes

Can't wait to get the fix. I plan to migrate to a Mac environment and sell this gaming laptop; it only works properly with Windows, which just sucks for software development. I couldn't keep Linux on it because it has a bunch of terrible bugs: screen tearing, the Nvidia GPU drawing too much power at idle, the machine running much hotter than on Windows, and other hidden bugs I haven't figured out yet.
This is probably my last day here; I give up.
Bye Linux, Nvidia, gaming laptop. It was nice having you accompany me these last decades.

Same issue. Gigabyte Aero 16 YE5. External display connected to a DisplayPort output wired directly to the dGPU.

We were able to root-cause the issue, and the fix has been incorporated for a future driver release.
We shall communicate further once the driver is released publicly.

8 Likes

Great job. Thank you!

I have tested the latest "production branch release" drivers (525.60.11). For me the issue still exists.
Hopefully it won't take too long before the integrated fix is released.

2 Likes

I can't believe this is still an issue. I understand Linux isn't a huge priority, but this is just insulting. I honestly regret buying a laptop with an Nvidia GPU. I can't wait for next year's models to come out in March so I can buy an all-AMD laptop; I don't even care if it turns out to be slower. No wonder Linus said Nvidia was the worst company they'd ever dealt with.

3 Likes

I actually just tried the latest driver (I recently reinstalled Pop OS, so I was on 515) and it broke my desktop. I just switched to integrated-only graphics (luckily my laptop's HDMI port goes to the iGPU) and I think I'm going to leave it that way. Dealing with this is just not worth it.
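For anyone on Pop OS who wants to do the same, the switch is exposed through the system76-power CLI; a minimal sketch, assuming the stock Pop OS tooling:

sudo system76-power graphics integrated   # route everything through the iGPU
sudo reboot                               # the mode change takes effect after a reboot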

Yeah, my Manjaro updated the driver a couple of days ago to 525.60.11. Not only did it not resolve the CPU usage problem, but the external-display-only configuration stopped working, and the laptop became super sluggish, totally unusable. Now I have to keep my built-in laptop screen on for the sole purpose of working around bugs in the driver. And the CPU usage by Xorg is still 30%.
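If anyone wants to verify that number on their own machine, watching the Xorg process from a terminal is enough; a minimal sketch, assuming a single Xorg instance:

top -p "$(pidof Xorg)"    # watch the %CPU column for the Xorg process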

All development and most of the testing in the semiconductor industry and chip design is done on Linux, actually. I assume that is true for NVIDIA too. Testing on the manufacturing floor using Windows is hilarious to imagine, because you need a CLI-only environment and special drivers to talk intimately to the hardware. The big FPGA emulators used for design verification run on Linux too (Inside Nvidia's Emulation Lab | Computer Graphics World). As you can see, NVIDIA uses emulators from Cadence, and you can conclude that those run Linux from the job openings at Cadence Design Systems, Inc. Linux is therefore of utmost importance to NVIDIA's existence as a chip designer. Considering Linux is involved all the way from chip design to the manufacturing floor, it's sad that NVIDIA has managed to screw up its Linux drivers.

this is so sad…

That is my biggest regret in dealing with Nvidia: I regret buying an Optimus laptop that I intended to use with Linux as my main operating system. So, like I said earlier, I plan to switch to a Mac environment to make my development space nicer in terms of GUI. It's Unix-like, so it acts a bit like Linux and Windows at the same time.

Sorry, but I can no longer wait for long periods of time. I bought this laptop two years ago; getting depressed just waiting for a fix is not really worth the time, and it is still taking longer than I expected.

I actually had this problem many years ago with Nvidia on a desktop, and the same issues are still there in the latest Nvidia drivers today, like screen tearing and refresh rate problems. This may be caused by Xorg or something Nvidia isn't responsible for, but I am disappointed. I'll probably stick with Windows again; dealing with Nvidia on Linux is so frustrating and not worth the time. I can fully work with Linux on my PC, which doesn't have Nvidia graphics, so it should be fine; I'll just remote in over SSH to get my job done.

NB: I can no longer wait, I was so disappointed; I found another Nvidia bug on Linux. When switching to Nvidia performance mode, the GPU consumes about 22 W just from moving the mouse around, roughly 10-25 W without rendering anything heavy. On Windows it's under 10 W, and about 15-20 W when watching a video in a browser, so Linux is noticeably worse. That's why the fan is so loud when booted into Linux, even for lightweight tasks.
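For anyone comparing such numbers, the driver's reported power draw can be polled with nvidia-smi (power.draw and utilization.gpu are standard query fields):

nvidia-smi --query-gpu=power.draw,utilization.gpu --format=csv -l 1    # sample once per second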

Apologies for the delayed response. As communicated earlier, we were able to root-cause the issue, but couldn't integrate the fix into the last release.
The fix will be incorporated in a future release.

2 Likes

You need to make sure the nvidia configuration file exists in the /etc/X11 folder, so copy the file from /usr/share/X11/xorg.conf.d/ to /etc/X11/xorg.conf.d/:

sudo cp -p /usr/share/X11/xorg.conf.d/10-nvidia.conf /etc/X11/xorg.conf.d/nvidia.conf

sudo nano /etc/X11/xorg.conf.d/nvidia.conf

Add the following line: Option "PrimaryGPU" "yes"

The file should look like this:
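(The file itself was shown as a screenshot in the original post. A sketch of what it typically ends up looking like is below; the MatchDriver and ModulePath lines vary by distribution, so treat them as illustrative.)

Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Option "PrimaryGPU" "yes"
    ModulePath "/usr/lib/x86_64-linux-gnu/nvidia/xorg"
EndSection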

I wrote it up here: Xorg + nvidia driver / debian / ubuntu / mint CPU high usage when connecting second monitor (Solved) - Marendasoft

It's good that there are some possible workarounds, but this one defeats PRIME: you are forced to use the dGPU for everything, all the time, just because you want to plug in an additional monitor. That's not how this is supposed to work.
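For contrast, with PRIME render offload working as intended, the dGPU is only engaged per application via the documented offload environment variables, e.g.:

__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"   # should report the NVIDIA GPU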

3 Likes

It's been almost TWO YEARS since this problem was first reported. Sad, sad state of affairs.

3 Likes

I have tested the latest drivers (525.78.01) [production branch release].
And I'm pleased to return here with pleasant news!
At least for me, the high CPU usage is finally resolved. Hooray! (For reference, see above for my basic setup.)

It took a while, but I'm grateful nonetheless.
So thanks, Nvidia team. And thank you to everyone in this thread for their efforts to make everyone's lives a little easier. Goodbye for now, and perhaps we'll meet again in another thread.

How did you manage that? Hmm, using the Nvidia GPU all the time is not a fix; it gives me other problems, such as high power usage while the Nvidia GPU is idle (about 24-30 W at all times), plus the glitchy screen tearing when scrolling things on the web and when watching video, whether offline in a video player or streaming in the browser.
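(Aside, not from this thread: a commonly cited workaround for the tearing is forcing the composition pipeline via nvidia-settings, at some performance cost. The MetaMode string below is illustrative and has to match your actual output layout.)

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"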

I've downloaded the latest driver from Nvidia, compiled and packaged it, and then, after installation, regenerated the initramfs image.
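(A minimal sketch of what that sequence might look like on an Arch-based system; the .run filename is illustrative, and the initramfs tooling varies by distribution:)

sh ./NVIDIA-Linux-x86_64-525.78.01.run    # or build/install via your distro's packaging
sudo mkinitcpio -P                        # regenerate the initramfs images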
I am using the hybrid setup; here's my X config:

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "screenmix"
    Option "AllowNVIDIAGPUScreens" "True"
EndSection

Section "Device"
    Identifier  "amdgpu"
    Driver      "amdgpu"
    BusID       "PCI:5:0:0"  
    Option	"VariableRefresh" "true"
EndSection

Section "Device"
    Identifier  "nvidia"
    Driver      "nvidia"
    BusID       "PCI:1:0:0" 
EndSection

Section "Screen"
    Identifier "screenmix"
    Device "amdgpu"
    GPUDevice "nvidia"
EndSection

And the output of xrandr --listproviders:
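(The output was posted as a screenshot; on a hybrid AMD + NVIDIA setup like this it typically looks roughly as follows, with IDs and names varying per system:)

Providers: number : 2
Provider 0: id: 0x54 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 3 associated providers: 1 name:Unknown AMD Radeon GPU @ pci:0000:05:00.0
Provider 1: id: 0x1f8 cap: 0x2, Sink Output crtcs: 4 outputs: 1 associated providers: 1 name:NVIDIA-G0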

For GPU power consumption on my system, this is with a YouTube video playing on the external screen:

It only consumes about 6-9 W, at least for now and on my system.
Here are some more pictures:

As to the glitchy tearing when scrolling or watching a video: that doesn't seem to happen to me.
I'm not sure whether it happened before, simply because of the CPU problem. I didn't like my expensive laptop cooking itself for nothing, so I never tested much more until this problem was resolved and I could finally use my long-since-bought products together. Now that I can, here are my results.

If you or anyone else would like more information, feel free to ask.

1 Like

So, the release notes on this forum are out of date… and the link to the "latest" graphics drivers is actually out of date too. If you manually search on the download site, you get the (newer) version the poster above me has:

And, it states this:

"Fixed excess CPU usage in hybrid graphics configurations where an external display is connected to an NVIDIA discrete GPU and configured as a PRIME Display Offload sink (also known as "Reverse Prime")."

So, it would seem team green finally released a fix! And it hit the Arch repo today (I queried pacman yesterday and it was still an older version, but today it's this one!).
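For anyone watching for it to land, the packaged and running driver versions are easy to query (nvidia-utils is the Arch package name; adjust for your distro):

pacman -Qi nvidia-utils | grep Version                        # installed package version
nvidia-smi --query-gpu=driver_version --format=csv,noheader   # version the kernel is actually running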