NVIDIA 331.13 now supports the "Sink Output" capability?

Is an Optimus setup with NVIDIA-0 as an output sink provider (render offloading) now possible using the new
Option “AllowEmptyInitialConfiguration” “True”
option?

No, this option simply allows the X server to start with no attached display devices (as far as the NVIDIA chip is concerned) without forcing the use of “NoScanout” mode, which prevents display devices from being attached to the NVIDIA GPU while the X server is running. The NVIDIA driver still only supports the Source Output capability.
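
For reference, a minimal sketch of where that option lives in xorg.conf; the Identifier names and BusID below are placeholders, so check lspci for the actual bus ID on your system:

Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device     "nvidia"
    Option     "AllowEmptyInitialConfiguration" "True"
EndSection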

I’ll be sure to add a note to the changelog if support for the Sink Output capability is added. Would this capability be useful to you? Knowing whether there are users who would appreciate it would help us prioritize things.

Both Sink Output and render offloading would be useful for me. My Thinkpad T530 has a miniDP connector that is hardwired to the Nvidia card. So it’d be nice to be able to use the Intel card most of the time for lower battery usage and turn the Nvidia card on when I need to use the external display. I’ve been using bumblebee for render offloading. Official support would be more convenient.
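
For context, Bumblebee's render offload is invoked per application, roughly like this (optirun is Bumblebee's wrapper, primusrun the primus backend):

# run a single application on the NVIDIA GPU, displayed on the Intel-driven screen
optirun glxgears
# or, with the primus backend
primusrun glxgears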

Thanks for clarifying this, Aaron. Sure, this capability would be useful; to be honest, it is what we have been waiting for since you added the Output Source capability in 319.12. Right now I (and most people I know from ubuntuusers.de) use a workaround when battery life matters: a dirty script that restarts X on the Intel GPU and switches off the NVIDIA GPU using the bbswitch module borrowed from the Bumblebee project.
Those with DP/HDMI wired to the NVIDIA GPU cannot use their external monitors, though.
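
The bbswitch part of that workaround is roughly the following (a sketch: the X restart around it is distribution-specific, and the nvidia kernel module must be unloaded before powering the card off):

# as root: load bbswitch and power the discrete GPU off
modprobe bbswitch
echo OFF > /proc/acpi/bbswitch
# check the current state (prints ON or OFF)
cat /proc/acpi/bbswitch
# power it back on before restarting X on the NVIDIA GPU
echo ON > /proc/acpi/bbswitch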

I have a W530 with the same problem as conky_vs_julian, but if I understand the Optimus problem correctly, the Sink Output capability alone would not help us, as we cannot load the nvidia and intel drivers at the same time (conflicting libGL).
However, NVIDIA seems to be working on this, which would allow us to load both drivers, and then Sink Output would give us a fully working Optimus setup.

@aplattner:
Maybe you should start some surveys to find out what the users want most?
My personal priority for improvements within the nvidia driver would be:

  1. Support for kernel 3.10+
  2. Support for Sink Output
  3. Support for loading intel and nvidia drivers at the same time (libglvnd, DRI3?)
  4. More EGL support, including Wayland support
  5. Faster 2D acceleration for XRender
  6. Ability to power off the nvidia card using the nvidia driver

Ranked by personal priority; as you can see, points 2, 3, and 6 are mainly for my Optimus setup.

By the way, thanks for listening to us, Aaron ;)

Are you sure? So how does Output Source work then? My xrandr --listproviders shows:
Providers: number : 2
Provider 0: id: 0x2a6 cap: 0x1, Source Output crtcs: 0 outputs: 0 associated providers: 1 name:NVIDIA-0
Provider 1: id: 0x45 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 2 outputs: 4 associated providers: 1 name:Intel

And my uname -r:
3.11.0-3-generic

;)

And which libGL do you use? Or, what is the output of glxinfo?

The conflicting libGL problem only matters for render offload support. With Sink Output, the Intel card is doing all the rendering. The Nvidia card is just displaying it to the external screen (reverse optimus) so you’d just use Mesa libGL in that case.

For Source Output, the Nvidia card is doing all the rendering and the Intel card is just displaying (regular optimus). You just use Nvidia libGL for this case.
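
For the regular Optimus (Source Output) case the setup is roughly the following, using the provider names from the xrandr output above (the names vary between systems and driver versions):

# tell the Intel provider (the sink with the panel) to display what NVIDIA-0 renders
xrandr --setprovideroutputsource Intel NVIDIA-0
# bring the outputs up with a default configuration
xrandr --auto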

So, the Sink Output capability would help, and you could switch between the Mesa and NVIDIA libGL with a simple bash script?

conky_vs_julian is correct: Sink Output would allow rendering to happen on the Intel chip and be displayed on a display device connected to an NVIDIA GPU. No rendering would happen on the NVIDIA GPU in that scenario.

Source Offload is the capability that would allow the desktop to be rendered on an Intel GPU while certain applications are rendered on the NVIDIA GPU and piped to the Intel chip for display. This is more difficult than the Source / Sink Output capabilities because it requires dynamically choosing which vendor’s OpenGL implementation to use, and negotiating a context with the appropriate vendor’s X driver inside the X server. libglvnd should help with the client side of that.

Sink Offload would allow the NVIDIA GPU to display OpenGL content rendered on a different GPU. By itself, that doesn’t seem particularly useful, but hopefully it could be used to render OpenGL content on one NVIDIA GPU and display it on another. I can’t promise that I’ll get around to implementing that feature.

I hate to dig up an old thread, but I was curious whether any work has been done toward adding support for Source Offload and Sink Output. I am able to do this with the nouveau driver, but the 3D performance leaves a lot to be desired (mostly due to the inability to reclock).

No, I’ve been focusing on G-SYNC lately and haven’t had time to work on this.

Hello,

I am not sure whether any of the sink/source offload/output methods should work for me, but I can’t seem to get the HDMI port to work on my Optimus GT 720M. I also have the Intel GPU that ships with my i7-4600U. I reckon the HDMI is physically connected to the GT 720M GPU (though I am not sure how to check this), but when I open nvidia-settings, it says “No scan out”. Is this normal behaviour given the offloading limitations of the NVIDIA drivers on Linux? I am using the latest NVIDIA drivers, 331.89.

The most bizarre part is that if I connect a Dell external screen through the aforementioned HDMI port, I get an output, but it is all reddish (the monitor works perfectly on other computers with an HDMI-to-DisplayPort adaptor). I am also quite puzzled by the absence of any “VGA compatible controller” mention for my discrete GPU:
00:02.0 VGA compatible controller: Intel Corporation Haswell-ULT Integrated Graphics Controller (rev 0b)
03:00.0 3D controller: NVIDIA Corporation GF117M [GeForce 610M/710M/820M / GT 620M/625M/630M/720M] (rev a1)

I am using Ubuntu 14.04 (I had the same problem on 13.10) and my laptop model is Dell Latitude E5540.

Thanks in advance for the clarification,
Dakeryas

nvidia-bug-report.log.gz (78.3 KB)

The GF117 GPU doesn’t support display. I.e. it was literally built without a display engine:

[     6.399] (--) NVIDIA(0): Valid display device(s) on GeForce GT 720M at PCI:3:0:0
[     6.399] (--) NVIDIA(0):     none

That means that all of the display outputs on your laptop are necessarily connected to the Intel GPU. The NVIDIA X driver automatically goes into “no scanout” mode (as if you had specified Option “UseDisplayDevice” “none”) on GPUs with no display.
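
A quick way to check this on any Optimus machine is to look at the per-provider output counts and at the X log (a sketch; the exact log wording can vary between driver versions):

# the provider whose connectors are actually wired up reports a non-zero "outputs:" count
xrandr --listproviders
# the NVIDIA X driver also logs which display devices, if any, it found
grep -i "valid display" /var/log/Xorg.0.log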

Ok, thank you very much for your reply.

Maybe I should open another topic, but I still don’t understand why I get a reddish display on the monitor (as I said, it works on other computers via HDMI, and on mine via a VGA connection).

xrandr lists two HDMI connections, and whether I switch with PRIME to the GeForce GPU or to the Intel GPU, I get the same reddish output (everything is clear underneath the red stripes that flicker across the screen). It works in the sense that I can move windows onto it and interact with them, but the output is awfully ugly. I have attached a picture of the same wallpaper on two different screens.

Any ideas about what I should look for? So, a priori, it can’t be related to the NVIDIA drivers?

With the GT 720M rendering I get:
HDMI-1-0 disconnected
HDMI-1-1 connected 1920x1200+3840+0 518mm x 324mm

With the Intel HD rendering I get:
HDMI1 disconnected (normal left inverted right x axis y axis)
HDMI2 connected 1920x1200+1920+0 (normal left inverted right x axis y axis) 518mm x 324mm

I’m not happy to revive such an old thread, but has any progress been made on implementing render offload or output sink capabilities in the nvidia driver?

I played with it recently just to see if I could use the outputs from the Intel graphics on my motherboard along with my NVIDIA GPU. (It’s fair to say there wasn’t much point to this, since the GPU itself has 5 outputs and supports driving 4 at once, so at best I’d be saving myself $10 on Monoprice dongles if I wanted two DVIs rather than DisplayPorts.) I could almost get it to work, but the resolution wasn’t right on the motherboard-connected displays; rather, it seemed to be set correctly internally, but the mouse cursor appeared at the wrong offset, along with other weird behaviour like that.

Frankly, given that this involved more editing of xorg.conf than I’ve had to do in a decade (since xrandr pretty much handles everything else automatically nowadays), I don’t think it’s worth it. Even if it can technically work, it’s a little too flaky. I remember being able to run two monitors off a 9800 GT and one off the onboard Intel G31/G35 chipset in Windows 7 back in 2009 or thereabouts, and it’s impressive that they got that to work at all, but it really seems like that was the only time the feature was genuinely useful, given how many more outputs discrete GPUs have supported since then.
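
For anyone curious, the sort of xorg.conf this kind of setup typically involves looks roughly like the sketch below (identifiers, driver names, and BusIDs are placeholders; take the provider name for the xrandr step from xrandr --listproviders):

Section "ServerLayout"
    Identifier "layout"
    Screen 0   "nvidia"
    Inactive   "intel"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device     "nvidia"
EndSection

Section "Device"
    Identifier "intel"
    Driver     "modesetting"
    BusID      "PCI:0:2:0"
EndSection

Section "Screen"
    Identifier "intel"
    Device     "intel"
EndSection

Once X is up, the Intel-connected outputs are attached with:

xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto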

Incidentally, there does seem to be a minor flaw in the Linux driver for NVIDIA GPUs that support four out of five outputs: in Windows this is usually “any combination of four,” but in Linux one HDMI seems to always conflict with one of the DisplayPorts, so you’ll get a driver crash on startup without knowing why unless you swap which outputs you’re connected to.