Optimus and External Displays - External outputs on the NVIDIA GPU

Attempting to use:

  • Fedora 31
  • an external display
  • in Hybrid mode
  • on a Lenovo P52
  • over HDMI
  • using the nvidia drivers from rpmfusion (akmod-nvidia-440.44-1.fc31.x86_64)

The external screen is not detected by the “Displays” app and is never used; nvidia-settings detects the screen but does not allow it to be enabled.

Reproducible: Always

Steps to Reproduce:
Connect screen over HDMI

Actual Results:
Nothing

Expected Results:
Screen is used

$ intel-virtual-output
No VIRTUAL outputs on ":1".
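
As far as I understand, intel-virtual-output only creates VIRTUAL outputs when the iGPU is driven by the xf86-video-intel DDX; with the modesetting driver (Fedora's default) it finds nothing. A quick way to check which DDX got loaded (the log path depends on whether X runs as root or as the user):

$ grep 'Loading.*_drv\.so' /var/log/Xorg.0.log ~/.local/share/xorg/Xorg.0.log 2>/dev/null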

Attempted to fix with:
xrandr --setprovideroffloadsink modesetting nvidia-G0

I got:
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 35 (RRSetProviderOutputSource)
Value in failed request: 0x1f9
Serial number of failed request: 16
Current serial number in output stream: 17
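
For anyone retracing this: driving a port that is wired to the dGPU while rendering on the iGPU ("reverse PRIME") is set up with --setprovideroutputsource, with the sink provider first and the render source second, and it only works if the sink provider advertises the "Sink Output" capability. A rough check (provider names differ from machine to machine; if the nvidia provider does not list "Sink Output", the call is rejected, which appears to be what happens here):

$ xrandr --listproviders
# reverse PRIME: the provider that owns the port (sink) goes first,
# the render source second; use the exact names printed above
$ xrandr --setprovideroutputsource NVIDIA-G0 modesetting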

Some more details in bugzilla: 1793176 – Lenovo P52 (optimus) cannot use external displays

This limitation is actually documented on rpmfusion; I’d like to know how to get it working:
https://rpmfusion.org/Howto/Optimus#External%20Monitors%20detection
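
For reference, the workaround that Howto describes is making the nvidia GPU the primary GPU for X. A minimal sketch (the file name below is made up; rpmfusion already ships a very similar OutputClass section, the key addition is the PrimaryGPU option, and X has to be restarted afterwards):

# make the nvidia GPU primary for X; hypothetical file name, adjust as needed
$ sudo tee /etc/X11/xorg.conf.d/nvidia-primary.conf > /dev/null <<'EOF'
Section "OutputClass"
        Identifier "nvidia"
        MatchDriver "nvidia-drm"
        Driver "nvidia"
        Option "AllowEmptyInitialConfiguration"
        Option "PrimaryGPU" "yes"
EndSection
EOF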

Is there a way to enable the external monitor connections without needing to use the nvidia GPU as the PrimaryGPU at all times? Obviously, keeping the dGPU as primary is more power-hungry than I would like.

I second this.

Is there a chance of dispensing with using nvidia as the primary GPU?

I think I have tried every possible solution, without success. I found a comment saying that the driver does not support reverse PRIME on Linux, but no official confirmation. Is this the case? Are there any workarounds other than rebooting with the dGPU as primary?

No, it is not possible at the moment: the proprietary nvidia driver does not support the needed output sink ("Sink Output") capability. This has been officially confirmed somewhere in this forum, and nvidia devs are (hopefully) looking into it.

Thanks. It’s the same for Thunderbolt docks and adapters: everything gets wired to the dGPU. The problem with using the dGPU as primary is that X and gnome-shell take up a lot of GPU memory that I need for other tasks (e.g. deep learning). So I really hope that NVIDIA is listening and will implement a PRIME output sink soon…
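
For what it’s worth, the baseline cost is easy to check once the dGPU is primary (numbers vary a lot between setups):

$ nvidia-smi --query-gpu=memory.used,memory.total --format=csv
$ nvidia-smi    # the process table shows how much Xorg and gnome-shell hold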