Desktop, 2x GT 640, Arch, 3.11.1-1, nvidia 325.15, NO xinerama

Hello everyone :)

I have been reading and searching for a year for answers to this, and I feel I'm almost there. I have tried SO many different things. All I want is to use both NVIDIA cards to drive separate monitors, bound together like Xinerama but WITHOUT using Xinerama. I've heard that xrandr is supposed to take care of this, but alas, I cannot find any recent documentation on X.Org, modesetting, dual cards, xrandr, etc. that points me in the right direction.

Any ideas?

Here are some of my system specs:

4-monitors (2 on each card)

Non-SLI mobo (at least to my knowledge)
Core i3
MSI B75a-G43 motherboard

Arch Linux

Enlightenment

X.Org X Server 1.14.3
Release Date: 2013-09-12
X Protocol Version 11, Revision 0
Build Operating System: Linux 3.11.0-1-ARCH x86_64
Current Operating System: Linux AerithARCH 3.11.1-1-ARCH #1 SMP PREEMPT Sat Sep 14 19:30:21 CEST 2013 x86_64
Kernel command line: BOOT_IMAGE=/boot/vmlinuz-linux root=UUID=10800ed3-63ce-4d6a-a44b-35ebea1fc459 rw quiet

Current version of pixman: 0.30.2
xrandr program version 1.4.1
Server reports RandR version 1.4

3.11.1-1-ARCH

Nvidia 325.15 drivers

The only way I can even get all screens to turn on is by running separate X servers, and on each monitor xrandr -q gives me ONLY the outputs of that device. I want to know how to combine them together, e.g. by associating output providers or something.

Please tell me there is hope. :) I have been hearing about people doing this lately, but again, I find very little documentation or tips.

Anything you cats can think of is well appreciated.

Anyone have any ideas? Maybe Linux modules or VirtualGL or something like that?

How can I span these 2 cards?

You can use the BaseMosaic option to span the desktop to up to three screens, or use a Kepler-based graphics board with support for up to four displays on a single card.

This is not the solution; BaseMosaic is for spanning multiple cards. A single card can actually be configured to use all of its available heads without even writing an xorg.conf file. You can configure them through xrandr, since a single card is a single framebuffer, and multiple heads on one card are plug and play.
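For the single-card case, tiling the heads is just a couple of xrandr flags. A dry-run sketch (the output names DVI-I-0 and HDMI-0 are made up; check xrandr -q for your real ones):

```shell
# Dry-run sketch: tile two heads of ONE card side by side via RandR.
# DVI-I-0 / HDMI-0 are hypothetical output names.
cmd="xrandr --output DVI-I-0 --mode 1680x1050 --pos 0x0"
cmd="$cmd --output HDMI-0 --mode 1440x900 --pos 1680x0"
# Printed rather than executed, since it only makes sense under a live X session:
echo "$cmd"
```

Drop the echo to actually apply it.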

What I want is to use multiple heads on MULTIPLE cards and bind them to one screen, like Xinerama but without the drawbacks. BaseMosaic is supposed to take care of this, and my cards are supposed to be compatible with BaseMosaic (that's why I bought them). lol

So, what's up guys? I mean, it's been a while, and only one response?

So, to continue, I have been reading a lot about SimpleDRM, kernel 3.12, and the rest of the graphics stack, and it seems as though the nouveau drivers will support spanning multiple framebuffers and binding them to one screen (using a different concept of Render Nodes, as opposed to the master/slave thing the driver stack has right now).

Is Nvidia working on implementing anything to take advantage of these features? Such as drm.rnode, or any other multi-card, multi-display route WITHOUT Xinerama? Anything at all. I read for HOURS per day, learn a lot, and notice that Nvidia just doesn't seem to be doing much.

From what I saw, it seems as though DMA_BUF support is what is needed to facilitate this with the open-source graphics stack, but there is a conflict between the GPL and the Nvidia EULA or something like that? That is a SUPER sad reason for a feature that is prevalent in Windows (merging framebuffers across multiple cards) to be left out, simply because of conflicting licenses. I fully understand that a LOT needs to change to make this work, but with the next X.Org server release, the xorg drivers (KMS, DRM, DRI3000, GEM, etc.), and kernel 3.12, the open-source community is ready.

Now, I have been a LIFELONG customer of Nvidia, but I have started to lose some faith, as the open-source drivers, and the lack of documentation provided to FOSS groups (yes yes, I know Nvidia is rolling SOME docs out, but this is what, 15 years later?), really cramp my style. I don't really NEED multiple monitors (6 of them), but when I can reboot directly into Windows and get what I want, then reboot back into Linux, where the only things stopping me are a few FOSS updates (which are coming SOON) and Nvidia's Linux drivers catching up with all of that, well… it's very discouraging.

I continue to support Nvidia fully, but I really would like to see some more progress on integrating with the open graphics stack better, for further support in KMS and DRM and all of that.

Not to mention it would be nice to have someone in this forum at least GUESS at what can be done. So far, I have tried:

Xinerama - works like crap
virtualgl - same; crap
sep. X screens - Archaic
base-mosaic - DOESN'T work, though my cards are supposed to be supported
every single xorg.conf trick you can think of

and NONE of this is working.

So yeah. Anyone got anything for me? I mean, I want to talk dev, but no one seems to want to respond to hard stuff like this on the Ubuntu forums and a few others. I'm hoping these forums will give me more insight, but so far, no response.

Thanks for anything you can shoot my way.

DJYoshaBYD

I’m afraid I don’t have anything to suggest immediately, but I’m curious what are the drawbacks of running separate X screens? “Archaic” doesn’t explain it. I guess you can’t move windows across screen boundaries. Anything else?

I also don’t see how VirtualGL entered the picture.

VirtualGL was used as a workaround to do a xinerama-like setup a few years back.

Archaic, in the sense of being neither up-to-date nor up-to-par with a modern system for managing multiple framebuffers.

Yes, I cannot move windows across separate screens, which might as well be a multiseat setup. Not interested in that lack of flexibility.

BUT, I have managed to get it working somewhat.

Using:

Arch Linux
Kernel 3.12-rc5-1 64-bit
nouveau drivers
xf86-video-modesetting

My xorg.conf looks like this (I have two cards, so I claim both):

Section "Device"
	Identifier  "Card0"
	Driver      "nouveau"
	BusID       "PCI:1:0:0"
EndSection

Section "Device"
	Identifier  "Card1"
	Driver      "nouveau"
	BusID       "PCI:4:0:0"
EndSection

xrandr --listproviders shows:

Provider 0: id: 0xc7 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 4 outputs: 3 associated providers: 0 name:nouveau
Provider 1: id: 0x66 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 4 outputs: 3 associated providers: 0 name:nouveau

I typed:

xrandr --setprovideroutputsource "0x66" "nouveau"

Now it shows:

Provider 0: id: 0xc7 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 4 outputs: 3 associated providers: 1 name:nouveau
Provider 1: id: 0x66 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 4 outputs: 3 associated providers: 1 name:nouveau

Notice that "associated providers" now shows 1.
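Since both providers here report the same name (nouveau), matching by the numeric id is less ambiguous than matching by name. A little sketch that pulls the ids out of the listing (the sample text is hard-coded from the output above; in practice you'd pipe the real xrandr --listproviders in):

```shell
# Extract provider ids from (a hard-coded copy of) `xrandr --listproviders`.
listing='Provider 0: id: 0xc7 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 4 outputs: 3 associated providers: 0 name:nouveau
Provider 1: id: 0x66 cap: 0x7, Source Output, Sink Output, Source Offload crtcs: 4 outputs: 3 associated providers: 0 name:nouveau'
src=$(printf '%s\n' "$listing" | awk '/^Provider 0:/ {print $4}')
sink=$(printf '%s\n' "$listing" | awk '/^Provider 1:/ {print $4}')
# Associate by id: the sink provider scans out images sourced from src.
echo "xrandr --setprovideroutputsource $sink $src"
```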

Now, when I run xrandr -q, I get:

Screen 0: minimum 320 x 200, current 6064 x 1080, maximum 8192 x 8192
DVI-D-1 connected primary 1680x1050+0+0 (normal left inverted right x axis y axis) 408mm x 255mm
   1680x1050      60.0*+
   1600x1200      60.0  
   1280x1024      75.0     60.0  
   1440x900       75.0     59.9  
   1280x960       60.0  
   1152x864       75.0  
   1280x720       60.0  
   1024x768       75.1     70.1     60.0  
   832x624        74.6  
   800x600        72.2     75.0     60.3     56.2  
   640x480        75.0     72.8     66.7     60.0  
   720x400        70.1  
HDMI-1 connected 1440x900+1680+0 (normal left inverted right x axis y axis) 410mm x 257mm
   1440x900       59.9*+   75.0  
   1280x1024      75.0     60.0  
   1280x960       60.0  
   1152x864       75.0  
   1024x768       75.1     70.1     60.0  
   832x624        74.6  
   800x600        72.2     75.0     60.3     56.2  
   640x480        75.0     72.8     66.7     60.0  
   720x400        70.1  
VGA-1 connected 1024x768+3120+0 (normal left inverted right x axis y axis) 304mm x 228mm
   1024x768       60.0*+
   800x600        60.3  
   640x480        60.0  
   720x400        70.1  
VGA-2 disconnected (normal left inverted right x axis y axis)
DVI-D-2 disconnected (normal left inverted right x axis y axis)
HDMI-2 connected 1920x1080+4144+0 (normal left inverted right x axis y axis) 708mm x 398mm
   1920x1080      60.0*+   60.0     59.9  
   1920x1080i     60.1     60.0  
   1280x1024      75.0     60.0  
   1280x720       60.0     59.9  
   1024x768       75.1     70.1     60.0  
   1440x480i      60.1     60.1  
   800x600        72.2     75.0     60.3     56.2  
   720x480        60.0     59.9  
   640x480        75.0     72.8     60.0     60.0     59.9  
   720x400        70.1  
  1280x1024 (0x6c)  135.0MHz
        h: width  1280 start 1296 end 1440 total 1688 skew    0 clock   80.0KHz
        v: height 1024 start 1025 end 1028 total 1066           clock   75.0Hz
  1280x1024 (0x6d)  108.0MHz
        h: width  1280 start 1328 end 1440 total 1688 skew    0 clock   64.0KHz
        v: height 1024 start 1025 end 1028 total 1066           clock   60.0Hz
  1024x768 (0x70)   78.8MHz
        h: width  1024 start 1040 end 1136 total 1312 skew    0 clock   60.1KHz
        v: height  768 start  769 end  772 total  800           clock   75.1Hz
  1024x768 (0x71)   75.0MHz
        h: width  1024 start 1048 end 1184 total 1328 skew    0 clock   56.5KHz
        v: height  768 start  771 end  777 total  806           clock   70.1Hz
  1024x768 (0x72)   65.0MHz
        h: width  1024 start 1048 end 1184 total 1344 skew    0 clock   48.4KHz
        v: height  768 start  771 end  777 total  806           clock   60.0Hz
  800x600 (0x75)   50.0MHz
        h: width   800 start  856 end  976 total 1040 skew    0 clock   48.1KHz
        v: height  600 start  637 end  643 total  666           clock   72.2Hz
  800x600 (0x76)   49.5MHz
        h: width   800 start  816 end  896 total 1056 skew    0 clock   46.9KHz
        v: height  600 start  601 end  604 total  625           clock   75.0Hz
  800x600 (0x77)   40.0MHz
        h: width   800 start  840 end  968 total 1056 skew    0 clock   37.9KHz
        v: height  600 start  601 end  605 total  628           clock   60.3Hz
  800x600 (0x78)   36.0MHz
        h: width   800 start  824 end  896 total 1024 skew    0 clock   35.2KHz
        v: height  600 start  601 end  603 total  625           clock   56.2Hz
  640x480 (0x7b)   31.5MHz
        h: width   640 start  656 end  720 total  840 skew    0 clock   37.5KHz
        v: height  480 start  481 end  484 total  500           clock   75.0Hz
  640x480 (0x7c)   31.5MHz
        h: width   640 start  664 end  704 total  832 skew    0 clock   37.9KHz
        v: height  480 start  489 end  491 total  520           clock   72.8Hz
  640x480 (0x7d)   25.2MHz
        h: width   640 start  656 end  752 total  800 skew    0 clock   31.5KHz
        v: height  480 start  490 end  492 total  525           clock   60.0Hz
  720x400 (0x80)   28.3MHz
        h: width   720 start  738 end  846 total  900 skew    0 clock   31.5KHz
        v: height  400 start  412 end  414 total  449           clock   70.1Hz

Now, all of my outputs on all cards are available, and I can turn them off and on using arandr, krandr, xrandr, whatever. Also, it does NOT START ANOTHER X SCREEN, so I can drag windows everywhere, monitor resolutions and boundaries are respected when maximizing windows, and compositing and hardware acceleration work… NO XINERAMA!! YEAAAHHHH!!! Pure xrandr, with the exception of a SUPER minimal xorg.conf file.
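With everything on one RandR screen, individual monitors can be toggled like any normal output. A dry-run sketch using output names from the listing above (adjust to your own layout):

```shell
# Dry-run sketch: disable and re-enable one monitor of the merged screen.
# HDMI-2 / VGA-1 are taken from the xrandr listing in this post.
off="xrandr --output HDMI-2 --off"
on="xrandr --output HDMI-2 --auto --right-of VGA-1"
# Printed rather than executed, since this needs a live X session:
echo "$off"
echo "$on"
```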

Now, the problem is, I cannot compile the Nvidia blob on this kernel yet… Pretty frustrating…

Essentially, the Nvidia driver needs to be able to manage and work with the changes that have happened in the GEM/DRM stack. Basically, the X graphics stack and drivers have changed the way they reference cards.

All I need now is KMS and full xrandr support in this kernel version (3.12), plus being able to compile the module, and I'll be good to go.

Is there any word on when we are going to see kernel 3.12 support, as well as further support for communicating with DMA_BUF, GEM, DRM, and the rest of the open-source graphics stack?

I love how this is supposed to be a “developer forum”, and not a single dev has bothered to even try to give me advice. :/

I'm starting to notice this is a common pattern among Linux forums when it comes to anything beyond simple problems like installing drivers, xorg configs, etc…

Is there ANYONE here with in-depth knowledge of the graphics stack? I'm no expert, but I'm VERY familiar with it. With all the code carrying little-to-no docs, though, it's hard to understand some of the really advanced functions and features available to me without digging through source code and HOPING there are comments for the functions and it's not obfuscated all to hell. haha

That’s not completely true, as Aaron did provide you some feedback which appears to be the only officially supported way to accomplish what you want in the current state of things.

I don’t think anybody here knows anything about nouveau.

It was mentioned somewhere else that 3.12 kernel support is being worked on, but no time estimate was given. They never give time estimates. There is a user-generated patch out there in one of the 3.11 or 3.12 threads which is a “use at your own risk” deal if you want to try that, but be sure to read Aaron’s comments in those threads first.

Aaron and sandipt are the only devs that post here when they have a chance, but are mostly focused on bug reports. The rest of us are jabronis that don’t know anything.

Can you sell your 640s and get a 650 Ti Boost 1 GB? They’re on sale on newegg and amazon now. I don’t know how much you can get for your 640s, but maybe you can recoup most of the cost.

Word. And nah, I don't want to sell them. lol. I shouldn't have to upgrade hardware to do what they were designed to do.

Now, I totally understand. I really just wanted some feedback from hella people. I'm an amateur dev, and study the **** out of Linux docs, specifically those related to the graphics stack (I actually want to start contributing, but need more know-how).

I have the kernel module compiled and installed now. No problem. It just doesn't work the way nouveau does in my example above.

For instance, I have this for an xorg.conf (I have 2 cards and the nvidia driver does not support KMS, so I have to claim both cards):

Section "Device"
	Identifier  "Card0"
	Driver      "nvidia"
	BusID       "PCI:1:0:0"
EndSection

Section "Device"
	Identifier  "Card1"
	Driver      "nvidia"
	BusID       "PCI:4:0:0"
EndSection

And that's it. I can control the 3 monitors on Card0 with xrandr commands and any GUI that supports xrandr.

Although, now when I use the xrandr --listproviders command, I get this:

Providers: number : 1
Provider 0: id: 0x279 cap: 0x1, Source Output crtcs: 4 outputs: 3 associated providers: 0 name:NVIDIA-0

It will never show the other card. The only ways I can use it are Xinerama (blech…) or a separate X screen or session, where either I am stuck with windows on one screen, or I have to launch things via DISPLAY=:1 and all of that jazz. Again, archaic is the word that comes to mind. It just seems so old. And yes, I have done a lot of studying (I have been b***s deep in source code and docs for the last 2 years, studying and testing stuff on my systems), so I get why.
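For reference, the separate-X-screen workaround means picking the target per process via DISPLAY. A minimal sketch (":0.1" meaning display 0, screen 1; the echoing stands in for launching a real client like xterm):

```shell
# Show the environment a client launched on the second X screen would see.
# On a real setup you would run e.g.:  DISPLAY=:0.1 xterm &
target=$(env DISPLAY=:0.1 sh -c 'printf %s "$DISPLAY"')
echo "client would render on $target"
```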

I'm really hoping they can get some sort of bridge or extension that lets the nvidia driver see the DRM/GEM/DRI stack the way it shows up with the nouveau driver. The money for my 8-monitor setup is burning a hole in my pocket. :D

Not really (I wish). haha (regarding dough for monitors)

Thanks for any input you can give. At least when people search in the future, anything we document here will be available.

Thanks :)

“You can use the BaseMosaic option to span the desktop to up to three screens”

I wish. lol. Every time I try to enable it, it won't let me. It keeps saying that I have a bad configuration.

You are a dev, so maybe you can answer this definitively for me: are my GT 640 cards compatible with Base Mosaic on Linux? I'm not sure what I am doing wrong. I have read through ALL of the README for every Nvidia driver I have run in the last 6 months. I mean LITERALLY read it front to back multiple times. haha. I am following everything documented to the letter, and my Xorg log always shows that my configuration is not supported. I have not managed to get even 3 screens working with it. It straight up will not enable Base Mosaic, no matter what I do.
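For reference, here is the general shape of what I have been attempting, per my reading of the README. This is just a sketch; the MetaModes output names, GPU prefixes, and layout are placeholders, and I have tried many variants of them:

```
Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    Option     "BaseMosaic" "on"
    # Placeholder layout: two displays on GPU-0, one on GPU-1, left to right.
    Option     "MetaModes"  "GPU-0.DFP-0: 1680x1050 +0+0, GPU-0.DFP-1: 1440x900 +1680+0, GPU-1.DFP-0: 1920x1080 +3120+0"
EndSection
```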

What does /var/log/Xorg.0.log say about why BaseMosaic isn’t working? You might need to start X with the “-logverbose 6” option to get more information.