Support external displays in render offload mode?

I’m running Ubuntu 19.10 (which ships a version of X that includes the commits that enable render offloading) on a Lenovo Thinkpad X1 Extreme with a 1050 Ti Mobile (driver version 440.26 from Ubuntu’s “graphics-drivers” PPA: https://launchpad.net/~graphics-drivers/+archive/ubuntu/ppa).

Render offload mode (seemingly called “Nvidia on-demand” in the Nvidia X server settings app) works great for the laptop’s screen.
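
For reference, I’m launching individual applications on the discrete GPU with the environment variables from the render offload chapter of the driver README, roughly like this:

  # run a GL app on the NVIDIA GPU and check which renderer it ends up on
  __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"
  __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears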

But I am not able to use external displays in render offload mode, only in “Nvidia” mode. In render offload mode, I can connect an external monitor to the laptop’s USB-C port and it appears in the Nvidia settings window, but Ubuntu does not seem to see it and the monitor acts as if it is not receiving any signal. Is it possible to support external displays?

Yes and no. Any external displays connected to the nvidia gpu don’t work in render offload mode; you’ll have to switch back to prime output mode for those.
Any external displays connected to the igpu will work, though. Many modern notebooks have a USB-C connector with displayport capability that is wired to the igpu.

Right, I am attaching a monitor by connecting the laptop’s USB-C port to the monitor’s DisplayPort input, but it only works in full-Nvidia mode, not in offload mode.

That connector is wired to the nvidia gpu, as seen in nvidia-settings, so it won’t work in offload mode. Are there any other DP connectors?

After looking into it, it seems that on this model of laptop all the external display outputs are wired to the discrete GPU.
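
For anyone who wants to check their own machine, a rough way to see which GPU each connector hangs off is to look at sysfs (this assumes the nvidia-drm module is loaded so the dGPU’s connectors show up there, and the cardN numbering may differ from system to system):

  ls -d /sys/class/drm/card*-*             # physical connectors, grouped by card
  cat /sys/class/drm/card0/device/vendor   # 0x8086 = Intel iGPU
  cat /sys/class/drm/card1/device/vendor   # 0x10de = NVIDIA dGPU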

So it is not possible to render on the integrated GPU and then have the discrete GPU just pass the frame through to the monitor?

No, currently only nouveau implements the prime output sink feature. Even then, it wouldn’t be possible, or at least not a good idea, to combine this with render offload: the nvidia gpu would have to render, copy the picture to igpu memory for composition, and then copy it back to nvidia video memory for display. That would lead to horrible performance, I’d imagine.

generix is right, output sink support is not yet available. But we know it’s an often-requested feature and we’re looking into what we can do to support it.

Great, thank you for the info.

@aplattner Also, great work on the current render offload stuff; I have found it very useful despite this limitation.

Is there an issue I could subscribe to so I get alerted when at least a provisional version with output sink functionality is available? I had to revert to using the integrated GPU due to an inability to resume properly on my Thinkpad P1 (see https://devtalk.nvidia.com/default/topic/1066963/linux/if-laptop-lid-is-closed-during-suspend-quot-failed-to-set-mode-no-space-left-on-device-quot-upon-resume/post/5407043/)

I wanted to chime in and say that output sink support would be really helpful. Are there any plans or a timeline for working on it?

I am also interested in “output sink support”. Is there any news?

Also, to clarify, would this mean the Nvidia card would get powered down so my battery would last longer (and the laptop would be quieter, since the fan is always on when the Nvidia GPU is on)?

Output sink support when the source side is Intel was first added in the 450.51 beta.

  • Added support for displays connected to NVIDIA GPUs to act as PRIME display offload sinks, also known as “Reverse PRIME”. See the chapter titled “Offloading Graphics Display with RandR 1.4” in the README for additional information.
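
Roughly speaking, once both GPUs show up in xrandr --listproviders, the display offload path described in that chapter is enabled with something like the following (the provider names NVIDIA-G0 and modesetting are just typical for a render offload layout and may differ on your system; some desktop environments may already do the equivalent automatically):

  xrandr --listproviders
  xrandr --setprovideroutputsource NVIDIA-G0 modesetting
  xrandr --auto    # bring up the newly available outputs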

On Turing and up, the dGPU should be able to power off if the external displays are disabled and the runtime power management options described in the README are enabled.
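
As a sketch, the runtime power management setup from the README comes down to a kernel module option (plus the udev rules listed there; the file name below is arbitrary):

  # /etc/modprobe.d/nvidia-pm.conf
  # 0x02 = fine-grained power control: let the GPU enter a low-power state when idle
  options nvidia "NVreg_DynamicPowerManagement=0x02"

You can then check whether the GPU has actually gone to a low-power state with:

  cat /proc/driver/nvidia/gpus/*/power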

I see. Thank you for your reply; it’s great news that this is being added.

Do I understand correctly that when external displays are used, the dGPU will always be on, no matter how the drivers develop further, since it is a hardware requirement? Is there at least a future possibility of some power management? I imagine driving an external display is much less computationally demanding than running complicated 3D workloads. I am not asking so much because of power consumption (though that is also a factor) but because of fan noise: on my laptop, when the dGPU is on, the fan is always running and it is loud.

There are a variety of levels of power management. The runtime power management I’m talking about either powers the GPU off completely, or puts it into a low-power state where the computation cores are powered off and the video memory is in a low-power self-refresh mode. Turing and up can enter those low-power states if the external displays are powered off, either because they’re disabled in the display control panel or because DPMS kicked in and blanked the displays (and the HardDPMS option isn’t disabled in xorg.conf).
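
For reference, HardDPMS is an option on the NVIDIA Device section in xorg.conf; since it is on by default, you would normally only set it to turn it off. A minimal sketch, with an arbitrary identifier:

  Section "Device"
      Identifier "nvidia-dgpu"
      Driver     "nvidia"
      Option     "HardDPMS" "true"
  EndSection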

If the displays are on, then the GPU has to remain powered up so the display engine and video memory interface can work, and the GPU has to do at least some work to copy frames from system memory to video memory for display. However, the GPU has a variety of performance levels that it can use when it’s on. If it’s mostly being used for display, then the GPU will drop to a lower power state without powering off completely.

A perfect explanation, thank you!

Though in practice, it is probably a lottery whether a given laptop is going to stay silent while driving external monitors :-D. Are there any benchmarks of how much power is used in the various power states?

nvidia-smi can usually report power usage numbers. You can also use it to query clock speeds, or you can see that information in the GPU page in nvidia-settings.
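
For example, something along these lines (the exact query fields depend on the GPU and driver version, and note that querying can itself wake the GPU out of runtime suspend):

  nvidia-smi --query-gpu=power.draw,pstate,clocks.gr,clocks.mem --format=csv
  watch -n 1 nvidia-smi    # live view of the summary table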

GPU video memory is generally designed for performance rather than power, so it’s generally expected for it to use more power when it’s running. What that means for noise really depends on the platform’s cooling solution.