External monitor doesn't work on Wayland

Under a Wayland session, my external monitor is detected, but nothing is displayed on it: just a black screen.

OS: Fedora 36
Nvidia driver: 510.68.02
Kernel: 5.17.6-300.fc36.x86_64

Laptop model: Acer Predator G3-571
iGPU: Intel HD 630
dGPU: Nvidia GTX 1060

HDMI is wired to the GTX 1060 and the internal screen is wired to the iGPU (muxless design).
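For anyone who wants to confirm which GPU each connector is physically wired to, sysfs can show it. A rough sketch (the paths follow the standard Linux DRM sysfs layout, but double-check on your own system):

```shell
# For each DRM connector, print its name, the PCI address of the GPU
# that owns it, and its current status (connected/disconnected).
for conn in /sys/class/drm/card*-*; do
  [ -e "$conn" ] || continue              # skip if no connectors exist
  status=$(cat "$conn/status" 2>/dev/null)
  # connector -> cardN -> PCI device; the resolved path ends in the PCI address
  dev=$(readlink -f "$conn/device/device")
  printf '%s -> GPU %s (%s)\n' "${conn##*/}" "${dev##*/}" "$status"
done
```

On a muxless laptop like this one, the HDMI connector should show up under the NVIDIA GPU's PCI address (e.g. 01:00.0) and the eDP connector under the iGPU's.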

Hi, same problem here.

OS: Fedora 36
Nvidia driver: 510.68.02
Kernel: 5.17.5-300.fc36.x86_64

Laptop model: HP Omen 15
iGPU: Intel Mesa
dGPU: Nvidia GTX 2060

OS: Fedora 36
Nvidia driver: 510.68.02
Kernel: 5.17.6-300.fc36.x86_64

Laptop model: Nitro AN515-44 V1.04
iGPU: AMD ATI 05:00.0 Renoir
dGPU: Nvidia GTX 1650 (TU117M)

HDMI is wired to the GTX 1650 and the internal screen is wired to the iGPU.

UPDATE: In fact, Wayland doesn’t work at all, even with the built-in display only. It falls back to the AMD Renoir driver, as shown by GNOME’s About dialog, which reports Graphics: AMD RENOIR / AMD RENOIR.

We were able to reproduce the issue locally and have filed bug 3644077 internally for tracking purposes.
We shall keep you posted with further updates.


Wonderful news @amrits, much appreciated! I will be the happiest guy on earth if I can finally fully use my laptop with an external monitor and Wayland =D


Just a quick update: I upgraded the driver on Fedora to 510.68.02 and, to my surprise, was able to enable Wayland and use my external display! =D \o/

I’m still confused by GNOME’s About dialog, though, which still shows Graphics: AMD RENOIR / AMD RENOIR. Also, Firefox’s about:support page shows:

Window Protocol	wayland
GPU #1
Active	Yes
Description	AMD RENOIR (LLVM 14.0.0, DRM 3.44, 5.17.11-300.fc36.x86_64)
GPU #2
Active	No
Vendor ID	0x10de
Device ID	0x1f99

This happens even though PrimaryDisplay: yes is set in /etc/X11/xorg.conf.d/nvidia.conf.

Is this the expected behavior?

EDIT: unfortunately, window/screen sharing doesn’t work on Wayland, which is a huge no-go for me because of my countless work meetings :-(

Another update, just for the record: I did manage to enable screen/window sharing on Wayland. It turned out I had to explicitly enable support in both browsers: installing the firefox-wayland package for Firefox, and setting an experimental flag in Chrome. Once that is done, the process of choosing what to share is very cumbersome (you have to go through GNOME’s sharing-permission dialog, sometimes more than once), but it works.
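One thing that helped me while debugging this: screen sharing on Wayland goes through PipeWire and xdg-desktop-portal rather than the X11 capture path, so if sharing silently fails, it is worth checking that those user services are running. A quick sketch (service names are the usual ones on Fedora/GNOME; they may differ on other setups):

```shell
# Check that the Wayland screen-capture plumbing is up.
# Exit quietly on systems without systemd.
command -v systemctl >/dev/null 2>&1 || { echo "systemd not available"; exit 0; }

# Prints "active" or "inactive" for each service; don't fail the script
# if one of them is stopped, just report it.
systemctl --user is-active pipewire xdg-desktop-portal || true
```

If either service is inactive, browsers will typically show an empty or missing picker dialog instead of an error.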

So, I’ll stick to Wayland as long as I can. We’re moving forward, and this is very exciting!

BTW, I just upgraded to the 515.57 driver; everything is working so far.

I have the following setup:

OS: Arch Linux
DE: GNOME 43.alpha and GNOME 42
Nvidia driver: 515.57

Laptop model: MSI GS63 Stealth 8RE
iGPU: Intel Mesa
dGPU: Nvidia GTX 1060


➜ lspci|grep VGA
00:02.0 VGA compatible controller: Intel Corporation CoffeeLake-H GT2 [UHD Graphics 630]
01:00.0 VGA compatible controller: NVIDIA Corporation GP106M [GeForce GTX 1060 Mobile] (rev a1)

My laptop has two display connectors:

  • Mini DisplayPort: this output always works.
  • HDMI: this output always remains blank; a 4K monitor is currently connected to it.

The problem is that the laptop display and the small external display (connected via Mini DisplayPort) work, but the display connected via HDMI stays blank on Wayland. On X11, all three displays work as expected.

gnome-control-center detects all the monitors, but there is no output on the HDMI-connected display.

If I go back to the nouveau driver, all three displays work, but with awful performance.

OK, I got it to work by loading the nvidia-drm kernel module.
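For anyone hitting the same wall: the key point is that nvidia-drm must be loaded with kernel mode setting enabled, since Wayland compositors need KMS to drive the NVIDIA-wired connectors. A minimal modprobe fragment (the modeset=1 option is the one documented in the NVIDIA driver README; the file name is my own choice):

```
# /etc/modprobe.d/nvidia-drm.conf
options nvidia-drm modeset=1
```

After adding it, rebuild the initramfs (on Fedora, sudo dracut --force) and reboot so the option takes effect at boot.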

But the performance on the external displays is awful.

That’s a CPU copy; at this point, just use Windows 11 like I do 😎

Hi @andre.ocosta
The compositor will still be running on the integrated GPU, as will all applications by default unless __NV_PRIME_RENDER_OFFLOAD=1 is set. That is why GNOME’s and Firefox’s “about” pages list AMD as the GPU vendor.
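To illustrate: the two environment variables below are the ones NVIDIA documents for PRIME render offload; using glxinfo as the probe application is just my own convenient choice (it requires the glx-utils/mesa-utils package):

```shell
# Compare which GPU renders by default vs. with PRIME render offload.
# Skip gracefully on machines where glxinfo isn't installed.
command -v glxinfo >/dev/null 2>&1 || { echo "glxinfo not installed"; exit 0; }

echo "Default (iGPU expected):"
glxinfo | grep "OpenGL renderer"

echo "With render offload (NVIDIA dGPU expected):"
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
    glxinfo | grep "OpenGL renderer"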


Thanks for the explanation @amrits , much appreciated. I never understood this, but now that you mentioned it, it makes perfect sense.

So, if I understood correctly, this isn’t a bad thing per se, but it creates the need for iGPU ↔ dGPU data transfers in some contexts, such as outputting to external monitors wired to the dGPU (e.g. HDMI), which can be a problem. With __NV_PRIME_RENDER_OFFLOAD=1, such penalties don’t apply since everything is handled by the same GPU, but it consumes more power. Is that right?