Using xrandr to set up a multi-GPU and multi-monitor machine

Greetings

I am looking at using the 390.141 driver on my system.
What makes things challenging is that I’ve got both a GF110 (570) and a GP107 (1050 Ti) GPU.
The GF110 has two 1920x1080 monitors attached.
The GP107 has three: one 3840x2160 and two 1920x1080.
Running on Debian testing.

Can I use xrandr to position the monitors on ‘one’ screen (7680x3000)?
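For what it’s worth, if all five outputs were visible on one X screen, the layout being asked about would be plain xrandr positioning. A dry-run sketch, with hypothetical output names (check yours with `xrandr -q`); note the full layout works out to 7680x3240 (1920+3840+1920 wide, 2160+1080 tall), close to the 7680x3000 figure above:

```shell
#!/bin/sh
# Dry-run sketch only: assumes all five outputs sit on ONE X screen,
# which is not the case here (two GPUs). Output names (DP-0, HDMI-0,
# DVI-I-0, ...) are hypothetical. Swap `echo xrandr` for `xrandr` to
# actually apply the commands.
xr() { echo "xrandr $*"; }

# 4K panel centred in the top row, a 1080p panel on each side of it,
# the two GF110 1080p panels in a second row below:
xr --fb 7680x3240
xr --output DP-0    --mode 3840x2160 --pos 1920x0
xr --output HDMI-0  --mode 1920x1080 --pos 0x540
xr --output DP-2    --mode 1920x1080 --pos 5760x540
xr --output DVI-I-0 --mode 1920x1080 --pos 1920x2160
xr --output DVI-I-1 --mode 1920x1080 --pos 3840x2160
```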

(I don’t want to use xorg.conf or nvidia-xconfig if possible.)

Not advisable. You would have to enable Xinerama (requires xorg fiddling) and a non-compositing DE.

Sorry, your response says that Xinerama is necessary. OK.

But is xrandr usable to set up the monitors on the screen?

Not possible with plain xrandr.

Hmm, what is ‘plain xrandr’?

Please elucidate.

Let’s put it “not with xrandr at all on dual NVIDIA GPUs of different generations”, if that’s clearer. With intel+nvidia or amd+nvidia you could use PRIME, enabled through xrandr.

generix Top Contributor, March 3:

Let’s put it “not with xrandr at all on dual NVIDIA GPUs of different generations”, if that’s clearer. With intel+nvidia or amd+nvidia you could use PRIME, enabled through xrandr.

Interesting, that.
The docs with the official download say that RandR 1.3 is supported.
The information over at Debian indicates that RandR 1.4 is supported.
There is a file included which is:

/usr/share/doc/nvidia-driver/html/randr14.html

So it would seem that RandR 1.4 is supported.
With RandR 1.4 one can use the command

$ xrandr --setprovideroutputsource

to activate two different GPUs. (I have done that a few times whilst running nouveau.)
So there is a possibility, one which you do not seem to know about, and I just won’t know until I actually give it a whirl.
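The RandR 1.4 provider wiring referred to here, as it works under nouveau/modesetting, looks roughly like this (a sketch: provider indices vary per machine, and a running X server is required, so the commands are printed rather than executed):

```shell
#!/bin/sh
# Dry-run sketch of RandR 1.4 provider offload as done under nouveau.
# Drop the `echo` (i.e. call xrandr directly) to apply on a live X server.
xr() { echo "xrandr $*"; }

xr --listproviders                  # note the provider indices first
xr --setprovideroutputsource 1 0    # provider 1 renders into provider 0's framebuffer
xr --auto                           # enable the newly visible outputs at preferred modes
```

With the proprietary driver this is the part that fails across two NVIDIA GPUs, as discussed below.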

Thank you for your assistance.

Regards

Where I ran into difficulty was in getting the second graphics card working.
It seems like the NVIDIA driver just doesn’t like that idea, and I can’t find a solution by searching.

So xrandr actually works quite well; what doesn’t work is trying to activate the second card.

you could try this:
nvidia-xconfig --no-composite -a --force-generate --separate-x-screens --xinerama -o xorg.conf
and then put it into /etc/X11
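For reference, the file that command generates would contain sections along these lines (a sketch, not actual nvidia-xconfig output; identifiers and BusIDs below are hypothetical, check `lspci` for the real ones):

```
Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    Option "Xinerama" "1"
EndSection

Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    BusID "PCI:1:0:0"    # GF110 -- check with lspci
EndSection

Section "Device"
    Identifier "Device1"
    Driver "nvidia"
    BusID "PCI:2:0:0"    # GP107 -- check with lspci
EndSection
```

Each Device gets its own Screen; with the Xinerama option on, the separate X screens are stitched into one logical desktop, at the cost of compositing and per-output RandR control.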

generix Top Contributor, March 4:

you could try this:
nvidia-xconfig --no-composite -a --force-generate --separate-x-screens --xinerama -o xorg.conf
and then put it into /etc/X11

Asking questions before I spend a couple of hours doing the changes and the testing:

Reading your command, it looks like you’re saying separate X screens.
I would understand that to mean five separate screens.
What I’m looking for is ‘one’ screen of some 7680x3000 pixels.

Please advise.

Xinerama combines physical X screens into one logical one. In your case you’d have one X screen per GPU, each spanning multiple physical displays.

RandR’s display offload features are not supported across NVIDIA GPUs, so Xinerama is currently the only option for binding displays across GPUs when SLI Mosaic isn’t available.

aplattner Moderator, March 4:

Xinerama combines physical X screens into one logical one. In your case you’d have one X screen per GPU, each spanning multiple physical displays.

Tried that; it was only allowing some misshapen monstrosity of a layout.

RandR’s display offload features are not supported across NVIDIA GPUs, so Xinerama is currently the only option for binding displays across GPUs when SLI Mosaic isn’t available.

Interesting!
The open-source driver ‘does’ allow one to do that.

Very interesting!

Regards

Using Zaphod heads, i.e. separate X screens, is/was a Xinerama trick to work around positioning issues.

generix Top Contributor, March 5:

Using Zaphod heads, i.e. separate X screens, is/was a Xinerama trick to work around positioning issues.

Thank you!

It would appear that positioning is more than somewhat of a full-sized mountain hidden under the carpet.