Multi-X-screen configuration doesn't work on Linux

https://nvidia.custhelp.com/app/answers/detail/a_id/176/~/linux---configuring-multiple-x-screens-on-one-card

Please, fix the documentation or fix the driver. Thank you!

Multiple X screens are still supported by the driver. What problem are you having with them?

Please note that many modern desktop environments, such as GNOME and KDE, don’t support more than one X screen. In addition, some recent distributions ship a configuration snippet that enables the AllowNVIDIAGPUScreens option in its own ServerLayout section, which overrides any custom layout specified in /etc/X11/xorg.conf.
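If you want to check whether your distribution ships such an override, one quick way is to search the standard X.Org drop-in directories. This is a hypothetical check, not a command from this thread; the paths below are the usual drop-in locations and may differ on your distribution:

```shell
# Search the usual X.Org config drop-in directories for a distribution-shipped
# snippet that sets AllowNVIDIAGPUScreens (and thus its own ServerLayout).
# -r: recurse, -s: suppress "no such directory" errors, -l: list matching files.
grep -rsl "AllowNVIDIAGPUScreens" /usr/share/X11/xorg.conf.d /etc/X11 \
    || echo "no override found"
```

If a file turns up, removing or adjusting that snippet should let the layout in /etc/X11/xorg.conf take effect again.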

Please run nvidia-bug-report.sh as root and attach the resulting nvidia-bug-report.log.gz file.

If KDE doesn’t support it, then that’s KDE’s problem. BUT there is something I want:

I have two monitors: a 60 Hz one and a 144 Hz one. When I play on the 144 Hz monitor (which is also the primary display), any time I enable vsync anywhere (even in the game) my FPS is limited to 60 until I disable the 60 Hz monitor. The “Sync to this display device” setting doesn’t do anything. What can I do to get 144 FPS on my 144 Hz monitor?

Did you try setting the __GL_SYNC_DISPLAY_DEVICE environment variable? It should have the display device name as its value (e.g. DP-0, HDMI-0, etc.). You can see the display names in nvidia-settings.
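For example, assuming the 144 Hz monitor is connected as DP-2 (a placeholder; use the connector name nvidia-settings actually shows for it), you could launch the game like this:

```shell
# Hypothetical example: pin OpenGL vsync to the monitor named DP-2.
# Substitute the connector name shown for your 144 Hz monitor in nvidia-settings.
export __GL_SYNC_DISPLAY_DEVICE=DP-2
echo "$__GL_SYNC_DISPLAY_DEVICE"   # prints DP-2
# Launch the game from this same shell so it inherits the variable,
# e.g.:  ./mygame
```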

Hi Mr. Plattner,

I’ve noticed that nvidia-settings allows me to create configurations involving more than four screens. When I try to save the configs it warns me that it won’t work but allows me to continue.

Can you point me to documentation on how to compile X11 with an increased limit on screens? I’d like to have six of them (connected to my two RTX 2080 cards).

Alternatively, is there some other way to get this done? Is there a way to write an X configuration by hand that works around the limit of four screens?

(Sorry I don’t know how to use this forum software - not sure how to ping you directly. Hope you get this as I’ve been searching for a while.)

The server has an internal limit of 16 screens. Each GPU has a maximum of four display outputs, but you should be able to create more screens than that if you’re okay with some of them not having displays.

Can you please attach the error you’re getting?

Thanks so much for your reply.

I guess I’m not using the right terms - let me be more specific:

  • I’m trying to connect six LCD panels to two RTX-2080 cards under OpenSUSE 15.1 / KDE Plasma

  • The nvidia-settings program is working fine for me. It warns me that the X server will not support five or more LCD panels, but I can ignore the warning and save the resulting xorg.conf file. When I do this there is no failure or error message, but the panels beyond the fourth don’t work: the first four work fine, while the others are not active.

  • I’m guessing that you were trying to say above that X can handle 16 X screens but, as you said before, KDE can only use XScreen0.

  • I’m looking for a work-around - perhaps a manual configuration involving Xinerama? - which would allow me to activate six LCD panels such that KDE would be able to use them under, for example, XScreen0. If it’s possible to recompile a piece of software somewhere with the increased LCD panel capacity I don’t mind giving it a try.

Thanks again for your help - much appreciated

Ah yes, the mapping from X screens to actual GPUs and display devices can definitely be confusing.

You’re correct that the server’s limit of 16 is for X screens, and KDE has a limit of just one of those. The NVIDIA driver can natively drive up to four display devices on a single GPU as anywhere from 1 to 4 X screens, depending on how you want to split the display devices up across X screens.

Where it gets complicated is trying to combine displays across GPUs. You basically have two options for that:

  1. Xinerama - this is where you allocate one or more X screens per GPU, and the server then does its best to combine them in software and present them as a single unified X screen to clients (such as KDE).
  2. SLI Mosaic - this is where the NVIDIA driver binds GPUs together at the driver level and presents them to the X server as a single device, so the X server only sees one X screen.

Both of these options have their pros and cons. Please refer to the X server’s Xinerama documentation and to chapter 12 of the NVIDIA README, which covers configuring multiple display devices on a single X screen.
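To make the Xinerama option concrete, a minimal two-GPU layout might look roughly like the sketch below. This is an assumption-laden illustration, not anyone’s actual config: the identifiers and BusID values are placeholders, and the real PCI addresses should be taken from lspci.

```
# Hypothetical sketch of a two-GPU Xinerama layout. Identifiers and BusIDs
# are placeholders -- take the real PCI addresses from `lspci | grep VGA`.
Section "ServerLayout"
    Identifier "Layout0"
    Screen      0 "Screen0" 0 0
    Screen      1 "Screen1" RightOf "Screen0"
    Option      "Xinerama" "1"
EndSection

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

Section "Device"
    Identifier "Device1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Device1"
EndSection
```

With Option "Xinerama" "1" in the ServerLayout, the server stitches the per-GPU screens into one logical screen for clients. Note that Xinerama disables RANDR, so per-monitor tools like xrandr won’t work in this mode.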

Thanks again Mr. Plattner,

I went through the docs and tried both the SLI Mosaic and the Xinerama options. I’m guessing Mosaic didn’t work because of the hodge-podge of different monitors I’ve got connected in my test setup (three panels on one RTX 2080 and two on the other for now).

Eventually I decided to work on the Xinerama configuration as it seems more promising. Using nvidia-xconfig I was able to produce an xorg.conf file:

nvidia-xconfig --force-composition-pipeline=FORCE-COMPOSITION-PIPELINE --xinerama --no-sli --virtual=6400x2160

Then I used nvidia-settings to edit the resulting config and worked through an acpid error message (I found lots of info on that and installed the acpid package to resolve it).

Unfortunately, when I run startx, it fails while loading modules. That’s as far as I got :(

Below is the output from nvidia-smi and the text from the bottom of the log file (hoping you’ll see something obvious.) Attached are the xorg.conf file and the full log in case that helps.

By the way I’m happy to try the SLI=Mosaic again if that might help in any way.

Thanks again for your invaluable help - very much appreciated, --Sam.

About the system:

  • Running openSUSE Leap 15.1
  • Using the nvidia repo, running the latest drivers (440.59)

$ sudo nvidia-smi

Sun Feb 16 19:17:00 2020       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 440.59       Driver Version: 440.59       CUDA Version: 10.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce RTX 2080    Off  | 00000000:01:00.0  On |                  N/A |
|  0%   55C    P0    36W / 245W |    907MiB /  7974MiB |      4%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce RTX 2080    Off  | 00000000:02:00.0 Off |                  N/A |
|  0%   36C    P8    16W / 245W |      1MiB /  7982MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0      3497      G   X                                            451MiB |
|    0      3638      G   /usr/bin/kwin_x11                            160MiB |
|    0      3641      G   /usr/bin/plasmashell                         133MiB |
|    0      3704      G   /usr/bin/ksysguard                             6MiB |
|    0      3749      G   /usr/bin/ksysguard                             6MiB |
|    0      4572      G   /usr/lib64/firefox/firefox                     6MiB |
|    0     12116      G   ...AAAAAAAAAAAAAAgAAAAAAAAA --shared-files    61MiB |
+-----------------------------------------------------------------------------+
[    27.549] (II) NVIDIA(1): [DRI2] Setup complete
[    27.549] (II) NVIDIA(1): [DRI2]   VDPAU driver: nvidia
[    27.549] (WW) NVIDIA(1): Not registering RandR
[    27.549] (II) Initializing extension Generic Event Extension
[    27.549] (II) Initializing extension SHAPE
[    27.549] (II) Initializing extension MIT-SHM
[    27.549] (II) Initializing extension XInputExtension
[    27.549] (II) Initializing extension XTEST
[    27.549] (II) Initializing extension BIG-REQUESTS
[    27.549] (II) Initializing extension SYNC
[    27.549] (II) Initializing extension XKEYBOARD
[    27.549] (II) Initializing extension XC-MISC
[    27.550] (II) Initializing extension SECURITY
[    27.550] (II) Initializing extension XINERAMA
[    27.550] (II) Initializing extension XFIXES
[    27.550] (II) Initializing extension RENDER
[    27.550] (II) Initializing extension RANDR
[    27.550] (II) Initializing extension COMPOSITE
[    27.550] (II) Initializing extension DAMAGE
[    27.550] (II) Initializing extension MIT-SCREEN-SAVER
[    27.550] (II) Initializing extension DOUBLE-BUFFER
[    27.550] (II) Initializing extension RECORD
[    27.550] (II) Initializing extension DPMS
[    27.550] (II) Initializing extension Present
[    27.550] (II) Initializing extension DRI3
[    27.550] (II) Initializing extension X-Resource
[    27.550] (II) Initializing extension XVideo
[    27.550] (II) Initializing extension XVideo-MotionCompensation
[    27.550] (II) Initializing extension GLX
[    27.550] (II) Initializing extension GLX
[    27.550] (II) Indirect GLX disabled.
[    27.550] (II) GLX: Another vendor is already registered for screen 0
[    27.550] (II) GLX: Another vendor is already registered for screen 1
[    27.550] (II) Initializing extension XFree86-VidModeExtension
[    27.550] (II) Initializing extension XFree86-DGA
[    27.550] (II) Initializing extension XFree86-DRI
[    27.550] (II) Initializing extension DRI2
[    27.550] (II) Initializing extension NV-GLX
[    27.550] (II) Initializing extension NV-CONTROL
[    27.630] (EE) 
[    27.630] (EE) Backtrace:
[    27.630] (EE) 0: X (xorg_backtrace+0x65) [0x557926398f45]
[    27.630] (EE) 1: X (0x5579261e7000+0x1b5c19) [0x55792639cc19]
[    27.630] (EE) 2: /lib64/libpthread.so.0 (0x7f091a220000+0x12300) [0x7f091a232300]
[    27.630] (EE) 3: /lib64/libc.so.6 (gsignal+0x110) [0x7f0919e9c160]
[    27.630] (EE) 4: /lib64/libc.so.6 (abort+0x151) [0x7f0919e9d741]
[    27.630] (EE) 5: /lib64/libc.so.6 (0x7f0919e66000+0x2e75a) [0x7f0919e9475a]
[    27.630] (EE) 6: /lib64/libc.so.6 (0x7f0919e66000+0x2e7d2) [0x7f0919e947d2]
[    27.630] (EE) 7: X (0x5579261e7000+0x41ca9) [0x557926228ca9]
[    27.630] (EE) 8: X (0x5579261e7000+0x9b9c8) [0x5579262829c8]
[    27.630] (EE) 9: X (_CallCallbacks+0x34) [0x5579262487f4]
[    27.630] (EE) 10: X (0x5579261e7000+0x60442) [0x557926247442]
[    27.630] (EE) 11: /lib64/libc.so.6 (__libc_start_main+0xea) [0x7f0919e86f8a]
[    27.630] (EE) 12: X (_start+0x2a) [0x55792623112a]
[    27.630] (EE) 
[    27.630] (EE) 
Fatal server error:
[    27.630] (EE) Caught signal 6 (Aborted). Server aborting
[    27.630] (EE) 
[    27.630] (EE) 
Please consult the The X.Org Foundation support 
	 at http://wiki.x.org
 for help. 
[    27.630] (EE) Please also check the log file at "~/.local/share/xorg/Xorg.0.log" for additional information.
[    27.630] (EE) 
[    28.297] (EE) Server terminated with error (1). Closing log file.

xorg.conf.txt (2.32 KB)
xorg-failed.log (19.6 KB)


Interesting info related to multi-GPU, multi-monitor setups:

https://www.phoronix.com/forums/forum/software/linux-gaming/33627-multi-gpus-and-multi-monitors-a-windows-gamer-wanting-to-use-linux?p=396620#post396620