High CPU usage in Xorg when an external monitor is plugged in

This issue still exists and practically makes the On-Demand profile useless with external monitors. Any updates?

Same problem here, guys… when the second monitor is plugged in, Xorg and nv_queue CPU usage goes up.
I changed the BIOS setting to discrete graphics so I could work, but as people said above, this is not a solution.

Details of my machine:

lsb_release -a:
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 20.04.4 LTS
Release: 20.04
Codename: focal

lspci -k:

01:00.0 VGA compatible controller: NVIDIA Corporation GP104BM [GeForce GTX 1070 Mobile] (rev a1)
Subsystem: CLEVO/KAPOK Computer GP104BM [GeForce GTX 1070 Mobile]
Kernel driver in use: nvidia
Kernel modules: nvidiafb, nouveau, nvidia_drm, nvidia

dpkg -l | grep nvidia
ii libnvidia-cfg1-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA binary OpenGL/GLX configuration library
ii libnvidia-common-510 510.47.03-0ubuntu0.20.04.1 all Shared files used by the NVIDIA libraries
ii libnvidia-compute-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA libcompute package
ii libnvidia-compute-510:i386 510.47.03-0ubuntu0.20.04.1 i386 NVIDIA libcompute package
ii libnvidia-decode-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA Video Decoding runtime libraries
ii libnvidia-decode-510:i386 510.47.03-0ubuntu0.20.04.1 i386 NVIDIA Video Decoding runtime libraries
ii libnvidia-encode-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 NVENC Video Encoding runtime library
ii libnvidia-encode-510:i386 510.47.03-0ubuntu0.20.04.1 i386 NVENC Video Encoding runtime library
ii libnvidia-extra-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 Extra libraries for the NVIDIA driver
ii libnvidia-fbc1-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA OpenGL-based Framebuffer Capture runtime library
ii libnvidia-fbc1-510:i386 510.47.03-0ubuntu0.20.04.1 i386 NVIDIA OpenGL-based Framebuffer Capture runtime library
ii libnvidia-gl-510:amd64 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA OpenGL/GLX/EGL/GLES GLVND libraries and Vulkan ICD
ii libnvidia-gl-510:i386 510.47.03-0ubuntu0.20.04.1 i386 NVIDIA OpenGL/GLX/EGL/GLES GLVND libraries and Vulkan ICD
ii linux-modules-nvidia-510-5.13.0-30-generic 5.13.0-30.33~20.04.1 amd64 Linux kernel nvidia modules for version 5.13.0-30
ii linux-modules-nvidia-510-generic-hwe-20.04 5.13.0-30.33~20.04.1 amd64 Extra drivers for nvidia-510 for the generic-hwe-20.04 flavour
ii linux-objects-nvidia-510-5.13.0-30-generic 5.13.0-30.33~20.04.1 amd64 Linux kernel nvidia modules for version 5.13.0-30 (objects)
ii linux-signatures-nvidia-5.13.0-30-generic 5.13.0-30.33~20.04.1 amd64 Linux kernel signatures for nvidia modules for version 5.13.0-30-generic
ii nvidia-compute-utils-510 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA compute utilities
ii nvidia-driver-510 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA driver metapackage
ii nvidia-driver-local-repo-ubuntu2004-460.32.03 1.0-1 amd64 nvidia-driver-local repository configuration files
ii nvidia-kernel-common-510 510.47.03-0ubuntu0.20.04.1 amd64 Shared files used with the kernel module
ii nvidia-kernel-source-510 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA kernel source package
ii nvidia-prime 0.8.16~0.20.04.1 all Tools to enable NVIDIA’s Prime
ii nvidia-settings 470.57.01-0ubuntu0.20.04.3 amd64 Tool for configuring the NVIDIA graphics driver
ii nvidia-utils-510 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA driver support binaries
ii screen-resolution-extra 0.18build1 all Extension for the nvidia-settings control panel
ii xserver-xorg-video-nvidia-510 510.47.03-0ubuntu0.20.04.1 amd64 NVIDIA binary Xorg driver

Any update on this? I’m pretty disappointed in my new laptop with a 3070 + AMD hybrid graphics. Using any external display via HDMI or DP causes extremely high CPU usage while the system is doing nothing. If the above comments are true, it’s also flooding my SSD with needless writes, probably shortening its life, too.

Using discrete graphics only is not a solution either; battery life would be horrendous, and on top of that, the NVIDIA drivers (whether discrete-only via the mux, or using prime-run) produce NV-GLX error 156 when running some 32-bit OpenGL applications, forcing me to use the iGPU (which works perfectly fine). Further, when running discrete-only, the backlight controls do not work; it is stuck on full brightness.

I feel boxed into a corner having paid a premium for top-tier hardware that doesn’t work properly for some basic/fundamental things, like an external display, 32-bit programs, and backlight control…

I use Ubuntu on a Razer Blade 2020 15" laptop. The HDMI port is connected directly to the NVIDIA dGPU.
If I choose prime-select on-demand, I hit the same problem, and nvidia-smi shows 11 W power usage.
With prime-select nvidia, the problem disappears, but nvidia-smi shows higher power usage, 27 W.
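For anyone who wants to reproduce this comparison from the command line, this is roughly the procedure (a sketch assuming Ubuntu’s nvidia-prime tooling; the one-second loop interval is arbitrary):

# Show the current PRIME profile (nvidia, intel, or on-demand)
prime-select query

# Switch profiles; takes full effect after a logout or reboot
sudo prime-select on-demand

# Watch dGPU power draw and utilization while the external monitor is attached
nvidia-smi --query-gpu=power.draw,utilization.gpu --format=csv --loop=1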

No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 21.10
Release:        21.10
Codename:       impish

A year later this still has not been fixed, and I’ve gotten frustrated trying to work around it. It seems to be a bug between the NVIDIA driver and Xorg: Wayland actually does not have this problem, but Wayland with NVIDIA is still very buggy and not usable, so Xorg remains the more usable option, if you don’t care about the power consumed.

In full NVIDIA mode with an external monitor alongside the internal one, idle power usage reached 30 W; just opening a browser and scrolling up and down triggers the fans like a jet, and the temperature stays constantly above 80 °C. Without an external monitor it runs fine. With an external monitor in Hybrid mode, it is the CPU fan that kicks in instead of the NVIDIA one: the CPU idles around 70 °C and Xorg consumes about 30% CPU at idle, without any cursor movement.

It has been really annoying; for the last year I’ve stuck with Windows 11. I hope this problem gets fixed as soon as possible now that the NVIDIA drivers are going open source. I am waiting for Wayland to fully support the NVIDIA proprietary drivers.
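(For reference, this is roughly how I measured the idle Xorg CPU usage. pidstat is an assumption on my part; it comes from the sysstat package and may not be installed by default:)

# Sample Xorg's CPU usage once per second for 30 seconds, hands off the mouse
pidstat -u -p "$(pidof Xorg)" 1 30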

Same issue with my Lenovo Legion 5 Pro (AMD + 3070): the fans won’t stop spinning and the GPU utilization is around 33%. I had already filed a bug, “High Nvidia GPU utilization on external monitor”, and just came across this post. Can you please fix this!

I have a very simple solution that solved it for me.

I have an Acer Nitro with an NVIDIA GeForce GTX 1650. After installing Ubuntu 22.04, every time I attached an external monitor the Xorg CPU usage was 25%-33%, even with nothing running. This happens with both driver 470 and 510.

To fix it, run the NVIDIA X Server Settings app. Under PRIME Profiles, select
NVIDIA (Performance Mode).
Reboot. Now Xorg CPU usage is 0.3%.
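If you prefer the command line, the same switch can be made with Ubuntu’s nvidia-prime tooling (a sketch; on other distros the mechanism may differ):

# Equivalent of picking "NVIDIA (Performance Mode)" in nvidia-settings
sudo prime-select nvidia

# Confirm the profile, then reboot (or log out and back in)
prime-select query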


Same here. Lenovo ThinkPad T15g, external monitor connected via HDMI, Arch Linux, everything latest. With adaptive or auto performance settings, Xorg shows ~30+% CPU usage. When setting the performance mode to maximum it goes down, but the temperature goes up, and nvtop shows high power consumption and a high GPU frequency.
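To put numbers on that trade-off, something like this works (the fields are standard nvidia-smi query fields; the interval is arbitrary):

# Log power draw, graphics clock, and temperature once per second
nvidia-smi --query-gpu=power.draw,clocks.gr,temperature.gpu --format=csv --loop=1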

Thanks to the people posting on this thread; it is good not to be alone with the problem.

That solution also works for me: laptop with an external monitor attached, using a GTX 1060 Mobile.
Thank you!

Two months later, and not even an acknowledgement of the bug report. Sad state of affairs with this company.

That solution also works for me.

I can hardly call it a “solution” - more power consumption, more heat, and this is a software problem. Nvidia does not listen to its customers. My next laptop will NOT have Nvidia hardware.

This is not a solution. When I switched to NVIDIA Performance mode it drew more power and generated more heat, so the fan ramped up very loudly. Watching the temperatures and nvidia-smi, the NVIDIA GPU used about 30 W constantly even when I just moved the cursor around, which caused more heat and even louder fans.
I will not install Linux on a hybrid laptop again; I will stick with Windows because of the driver problems on Linux. I have been waiting about a year for this, nothing seems to fix it, and I have given up trying other solutions. I am removing Linux from my entire drive.

Another month gone by, and still no response from NVIDIA AT ALL. This serious issue has been around for 1.5 YEARS. The laptop can only function using the built-in display, what a farce.

Another two weeks, another ignored major issue.

I have the same problem, on Pop!_OS 22.04 with NVIDIA graphics.
It’s a shame that NVIDIA doesn’t fix this issue!

Hi All,
I have recently tested a few systems, such as the Alienware m17, the Acer Nitro AN517-41, and a Dell Alienware, with Ubuntu and Fedora releases and the latest released driver, but could not duplicate the issue locally.
I will try to find systems similar to the ones reported in this thread and share test results with Pop!_OS.
Apologies for the delayed response, but I will make sure to prioritize it.


Found this thread while trying to figure out what’s going on here. The profile is currently in “NVIDIA On-Demand” mode.

External Samsung 34" 3440x1440 @ 100 Hz monitor connected over USB-C and providing power to the laptop. The laptop screen is also in use, at 2560x1600.

System and driver information as well as top snapshot:

I am on an Omen 15 2020, which has a Ryzen 7 4800H and an RTX 2060. After updating to the latest stable kernel and NVIDIA drivers the problem is still there. I am currently using Pop!_OS, but I have tested a bunch of distros with the same exact issues.

As far as I can tell, this depends on the specific hardware video output path (configuration) of the device.

Case in point: my Dell Inspiron 16 Plus (7610), a Tiger Lake 11800H + Nvidia RTX 3060 device which has one HDMI output and one USB-C port (which serves graphics data via Thunderbolt Alternate Mode or DisplayPort Alternate Mode).

The HDMI port is exclusively served by the Intel GPU, and the Intel GPU itself is unable to send data via USB-C.

The Nvidia GPU is the only GPU that is able to generate any output through the USB-C port (and, IIRC, the Nvidia GPU is unable to output to the built-in screen of the notebook).

(I believe I am describing a mux-less design here, which is the “cheap” way of doing the hardware.)

In that configuration, with the exact same external 4K screen attached to the device on

  • HDMI == only the Intel GPU works; all good
  • USB-C (using an Alternate Mode) == the Intel GPU serves the built-in display; the Nvidia GPU serves the external screen
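If you want to confirm this port-to-GPU mapping on your own machine, a quick check looks roughly like this (a sketch; card numbering and driver names vary per system):

# Which display providers X knows about (e.g. modesetting vs NVIDIA-0)
xrandr --listproviders

# Each connector (card0-HDMI-A-1, card1-DP-1, ...) hangs off a cardN device;
# the driver symlink tells you which GPU that is (e.g. i915 vs nvidia)
ls /sys/class/drm/
readlink /sys/class/drm/card0/device/driver
readlink /sys/class/drm/card1/device/driver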

In the latter configuration, you will see all this massive CPU load on the Xorg process, and all of this load comes from the in-process Nvidia driver, with a massive number of calls to the Linux vDSO to get the current time.

This posting of mine shows more technical detail: Nvidia X11 driver busy-polls kernel on clock_gettime in a tight loop. It also includes an nvidia-smi call that exposes the problem much better: things get a whole lot worse, i.e. much worse than the badness in the “normal” case, if you downclock the Nvidia GPU.
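If you want to observe the busy-polling yourself, note that vDSO calls never enter the kernel, so strace will not show them; a sampling profiler will. A minimal sketch, assuming perf is installed and Xorg is the running display server:

# Sample the Xorg process; with the external screen driven by the Nvidia GPU,
# __vdso_clock_gettime should dominate the profile
sudo perf top -p "$(pidof Xorg)"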