Enabling SLI makes all the windows start flashing on Ubuntu 14.04


After adding a second GTX 760 to my rig and running “nvidia-xconfig --multigpu=on” and/or “nvidia-xconfig --sli=on”, all the windows on the desktop started flashing. This happens with both the 340.32 and 343.13 drivers, and reinstalling Ubuntu didn’t fix the problem.

However odd it sounds, launching Civilization V will either result in an endless black screen or fix the problem until the game crashes and leaves a black screen.

Running “nvidia-xconfig --sli=off” and “nvidia-xconfig --multigpu=off” and then rebooting the system makes the windows go back to normal.

Please tell me how I can fix this issue, or how to provide any files that may be relevant in diagnosing this bug.

Thank you in advance.
[This file was removed because it was flagged as potentially malicious] (631 KB)


Disable SLI tbh… I’ve been trying to get SLI in Linux fixed and brought to the attention of both NVIDIA and Valve… so far they “know” it’s an issue, but haven’t done anything to fix it.

Multigpu=on is ONLY for dual-GPU cards like the GTX 690 and Titan Z btw.

Here’s what I mean when I say SLI sucks in Linux… they implemented it in 2008 for ID Tech 4 Engine games only… and it hasn’t been updated since. So? That means Quake IV, Doom III, Prey, Enemy Territory: Quake Wars and mods for ID Tech 4 Engine games such as “The Dark Mod” are the only games in Linux that support SLI. :(


… so since April 2014 I’ve done everything I can to get NVIDIA to fix SLI in Linux… but it’s not looking promising for a while yet tbh, since they’re more interested in working on G-Sync than SLI… which sucks, since SLI is much more widely used… AMD’s CrossFire actually works a lot better in Linux than this tbh… then again, AMD’s drivers in Linux have always fared better than NVIDIA’s, which is definitely not the case on Windows.


I’ve actually written up an extensive Wiki of my knowledge of SLI in Linux on the Antergos Wiki here:


NOTE!!! Even though I specifically wrote this for Antergos (based off Arch), it works on ANY Linux distro.

SLI / Multi-GPU on Antergos (and Linux in general)

NOTE: This was last tested with the nVidia 337.25 drivers… despite these drivers adding support for the GTX Titan Z, a Multi-GPU videocard, it still does not support SLI… so the GTX Titan Z is reduced to a single GTX Titan in Linux… the same goes for other nVidia Multi-GPU cards such as the GTX 690. :(

With the introduction of SteamOS, nVidia has been increasing their support of the Linux platform dramatically compared to even a few years ago. Many long-awaited features such as nVidia’s Optimus finally have support (right now in a basic state) on Linux via nVidia Prime, and newer GPUs get support at a more respectable pace. SLI is supported… BUT, sadly, SLI is still in an extremely primitive state.

For more information, please read the SLI & Multi-GPU section of the README included with nVidia’s drivers for Linux.

Current limitations (last tested on 337.25) on nVidia’s drivers are as follows:

  • On Linux, SLI & Multi-GPU (two-GPUs-on-one-card models such as the GTX 690) can be enabled, but only via command prompts in the Terminal.
  • Only works on desktop platforms; SLI on mobile GPUs is unsupported.
  • Only ID Tech 4 Engine games are officially supported in SLI, by creating an "Application Profile" with GLDoom3 set to "true". Games on the ID Tech 4 engine include Quake 4, Enemy Territory: Quake Wars, Doom 3 and Prey, and open-source games on ID Tech 4 include The Dark Mod.
  • GPUs with ECC enabled may not be used in an SLI configuration
  • SLI on Quadro-based graphics cards always requires a video bridge
  • TwinView is also not supported with SLI or Multi-GPU. Only one display can be used when SLI or Multi-GPU is enabled, with the exception of Mosaic.
  • If X is configured to use multiple screens and screen 0 has SLI or Multi-GPU enabled, the other screens configured to use the nvidia driver will be disabled. Note that if SLI or Multi-GPU is enabled, the GPUs used by that configuration will be unavailable for single GPU rendering.
  • Alternate Frame Rendering (AFR) and Split Frame Rendering (SFR) are supported in Linux. AFR2 is not supported nor is it planned for Linux.

    Alternate Frame Rendering (AFR) is supported. In this SLI mode each card renders one frame, then the next card renders the following one. For example, in two-way AFR, GPU1 renders frames 1, 3, 5, etc. and GPU2 renders 2, 4, 6, etc. Outside of the ID Tech 4 engine, Alternate Frame Rendering does not work on any card other than the first, which causes framerates to drop by 50%. This is a known bug, and nVidia are currently working on this issue.

  • SLI Profiles are NOT included with the drivers, unlike the Windows version of the nVidia drivers. They must be manually created for ID Tech 4 Engine-based games. To learn how to make them yourself, please read the Creating Application Profiles section of this wiki page.

  • Split Frame Rendering (SFR) is also supported. This mode uses the first GPU to render the top half of the screen, and the second GPU to render the bottom half. This mode also does not function correctly outside of ID Tech 4. It will still render the full screen, but only at the speed of a single GPU.

    Alternate Frame Rendering mode 2 (AFR2) is NOT supported in Linux. This mode is the opposite of Alternate Frame Rendering, instead using the last GPU as the primary and every GPU above it as the next in line. For example, in three-way AFR2, GPU3 renders frames 1, 4, 7, etc., GPU2 renders frames 2, 5, 8, etc. and GPU1 renders frames 3, 6, 9, etc. In Windows, AFR2 is the preferred SLI mode as it tends to yield higher performance than AFR. My theory is that because GPU1 is the primary card for video out, using a different GPU for the primary calculations better balances the workload. I have not seen any documentation as to the specific reason AFR2 performs better.

Despite what was mentioned above, if you wish to try SLI / Multi-GPU, here’s how to enable it via the terminal.

Open up Terminal
Use the following command of your choice:
  • sli=on (enables SLI)
  • sli=auto (allows nvidia-xconfig to automatically enable SLI if it detects an SLI-ready setup)
  • sli=afr (enables SLI in AFR mode)
  • sli=sfr (enables SLI in SFR mode)

terminal commands

sudo nvidia-xconfig --sli=on

sudo nvidia-xconfig --sli=auto

sudo nvidia-xconfig --sli=afr

sudo nvidia-xconfig --sli=sfr

sudo nvidia-xconfig --sli=off

Put in your password
Reboot Antergos (sudo reboot via terminal or via the GUI)
Open up the “nVidia X Server Settings” program
Select any GPU in the list and look at the screen output. If SLI is indeed enabled, it will say “(SLI)” at the end of the detected screen output.
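If you’d rather double-check from a terminal, the driver logs SLI initialization in the X server log; a sketch (assuming the default log path, which may differ on some distros):

```shell
# Search the X server log for SLI-related driver messages
# (default log path assumed; some setups log elsewhere)
grep -i "sli" /var/log/Xorg.0.log
```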

The only difference is that if you have a Multi-GPU card (such as the GTX 690), you change the sli argument to multigpu.
terminal commands

sudo nvidia-xconfig --multigpu=on

sudo nvidia-xconfig --multigpu=auto

sudo nvidia-xconfig --multigpu=afr

sudo nvidia-xconfig --multigpu=sfr

sudo nvidia-xconfig --multigpu=off

If you have Multi-GPU cards in your system in SLI (such as two GTX 690’s with the appropriate SLI bridge), you just have to combine the commands.
terminal commands

sudo nvidia-xconfig --sli=on --multigpu=on

sudo nvidia-xconfig --sli=auto --multigpu=auto

sudo nvidia-xconfig --sli=afr --multigpu=afr

sudo nvidia-xconfig --sli=sfr --multigpu=sfr

sudo nvidia-xconfig --sli=off --multigpu=off

SLI Application Profiles for ID Tech 4 Engine Games on Antergos Linux

To use these application profiles, the easiest way is via the file manager.

Open your File Manager (for example, Cinnamon uses Nemo)
Go to your "Home" folder
Press "Ctrl" + "H" to toggle hidden folders and files
Go into the ".nv" folder
Open "nvidia-application-profiles-rc" with your favorite text editor (such as gedit)
Copy and paste the following into the file and save. (The last rule, which always applies, is optional, but I recommend it for enabling Threaded Optimizations on all OpenGL apps. Some really old ones might have compatibility issues, but you can always make a custom application profile with "GLThreadedOptimizations" set to "false".):

{
    "profiles": [
        {
            "name": "always-on",
            "settings": [
                {
                    "key": "GLThreadedOptimizations",
                    "value": true
                }
            ]
        },
        {
            "name": "idtech4-app",
            "settings": [
                {
                    "key": "GLDoom3",
                    "value": true
                }
            ]
        }
    ],
    "rules": [
        {
            "pattern": {
                "feature": "procname",
                "matches": "etqw-rthread.x86"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "procname",
                "matches": "etqw.x86"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "procname",
                "matches": "doom.x86"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "procname",
                "matches": "quake4smp.x86"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "procname",
                "matches": "quake4"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "procname",
                "matches": "prey.x86"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "procname",
                "matches": "thedarkmod.x86"
            },
            "profile": "idtech4-app"
        },
        {
            "pattern": {
                "feature": "true",
                "matches": ""
            },
            "profile": "always-on"
        }
    ]
}
Save and exit
Open up the “nVidia X Server Settings” application
Browse to “Application Profiles”
If the “nvidia-application-profiles-rc” file didn’t properly load the modified configuration, click the “Reload” button. It looks like a green circle with two circular arrows.
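If you’d rather not use a file manager, the same file can be created from a terminal; a minimal sketch, assuming the default per-user profile path under ~/.nv:

```shell
# Create the per-user profile directory and file if they don't exist yet
mkdir -p "$HOME/.nv"
touch "$HOME/.nv/nvidia-application-profiles-rc"
# Then paste the profile JSON in with your editor of choice, e.g.:
#   gedit ~/.nv/nvidia-application-profiles-rc
```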

Closing Notes

I hope this guide helps anyone who wishes to learn more about Antergos and Linux in general. I wish you luck and excitement in your adventures in Linux with nVidia hardware.

If anyone else has good tips or improvements, of course please feel free to modify this. It is my hope that the nVidia proprietary drivers improve further in Linux (mainly in the SLI department coughs wink), and community involvement will only help further boost this.

ThE_MarD, I have seen you post this all over the place. I’m not saying SLI on Linux couldn’t use some love, or that SLI profiles like Windows has wouldn’t be good to have.

BUT, I run 2x GTX 580s and 3 monitors. I could not play the things I have played on Linux (OR on Windows, for that matter) at 5760x1080 without both GPUs rendering. The GTX 580 is getting old. I just got done playing the new Wolfenstein: The New Order through Wine, and I played it mostly maxed out at 5760x1080 and had better performance (A LOT, actually) than I did running it in Windows at lesser settings. It’s a pretty demanding game for my aging GPUs. I’ve been playing through Dead Space 3 finally, also through Wine. I just fired it up to see what it is running at, and at 5760x1080, maxed out, it’s running 60 fps with vsync. I pulled up nvidia-settings with it running, and both GPUs show about 70% utilization. I generally have good performance with native games as well; while I have not checked utilization specifically, I do have an on-screen display to show me FPS and GPU temps. The temps of both cards rise equally, so I take that to mean they are both working.

Before you bring up SteamOS, I saw your post there too. They know, it’s an incompatibility with the driver and steamOS’s compositor, and I’m sure it will be fixed.

So I think you are blowing the SLI issue a bit out of proportion; it seems more like your setup is having issues. What those issues are I do not know. In general, SLI seems to work pretty well for me. I would like to see NVIDIA start doing some SLI profiles for games on Linux though. I know many games would benefit from the tweaks.

http://pastebin.com/raw.php?i=cGfjZHtP ← my xorg.conf

mostly vanilla from nvidia-settings. I have only added Option “nvidiaXineramaInfo” “False” so that Openbox does not get info about my individual monitors and treats them as ONE big screen, as I have found that makes running games spanning all my monitors easier.

EDIT* Oh and I added the coolbit “4” as well…
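For reference, the two options mentioned above go into xorg.conf roughly like this; a sketch only (the section identifiers are placeholders, not copied from my actual config):

```
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    Option     "Coolbits" "4"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "nvidiaXineramaInfo" "False"
EndSection
```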

Arch Linux
NVIDIA driver 343.13
Asus Sabertooth 990fx
AMD FX8350 @ 4.4
8gb kingston HyperX @ 1600
2x EVGA GTX580 SC 1.5gb
Samsung 840 EVO 250gb SSD (I have other drives obviously, but I do run my games from the SSD)

Thanks for the replies,

I had no idea SLI support on Linux was in such bad shape :(
The reason I went for NVIDIA in the first place was that the guy on Phoronix kept saying NVIDIA’s proprietary drivers were really good and on par with their Windows equivalents; maybe he’s paid to say that.

Or maybe, if ekrboi says his configuration works, it may be because support for SLI came in later than support for single GPUs. Since he’s still using GTX 580s, his cards may already have SLI support, and SLI support for the GTX 6xx and GTX 7xx series could be in the works, but this is just speculation.

At least you guys have a functional system with SLI enabled, I mean no windows shaking in an uncontrollable manner or black screens. Could this be distro-related?

M4xP4yne, please share step-by-step details for reproducing the issue. Please attach an nvidia-bug-report by running nvidia-bug-report.sh as the root user. Did any earlier driver version work on your setup? Also, there are no NVRM or Xid errors in the logs. It would be good to test with another desktop environment, IOMMU settings, the .run installer, etc.
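For reference, generating the report is a single command; it writes a compressed log into the current directory:

```shell
# Run as root; produces nvidia-bug-report.log.gz in the current directory
sudo nvidia-bug-report.sh
```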

There isn’t much to the step-by-step issue reproduction; it’s just running

sudo nvidia-xconfig --sli=on

and then restarting the X server or rebooting. After that, all the windows start shaking and flashing uncontrollably, while running

sudo nvidia-xconfig --sli=off

and restarting the xserver or rebooting will bring the system back to normal.

I’ve attached the log file to the original post.

Since I only got the second card recently, there aren’t any driver versions I’ve used with the current setup besides those mentioned earlier (343.13 and 340.32, both in the x64 version).

Please let me know if there is any additional information I can provide.

[This file was removed because it was flagged as potentially malicious] (631 KB)

Ahh, possibly a good point about the “older” cards being better supported, though I REALLY hope not. There are a couple of GTX 980s in my future as soon as 8 GB variants start to show up. A couple of 580s in SLI still get the job done fairly well at 1920x1080… not so much now that I have 3 monitors.

Judging by your bug report, it appears you are running Ubuntu 14.04. I see it shows the installation log; I’m not sure if that will show up when installed via PPA or not. Are you using the xorg-edgers PPA by chance? I recently switched to Arch Linux, but I had been a Xubuntu user for years, and I ALWAYS manually installed the drivers from NVIDIA’s official .run package. When 14.04 came out I did a fresh install and decided I would give the PPA a try; I figured why not? Saves me a bit of trouble (except it didn’t lol). I had issues from the start. Once I removed the PPA drivers and manually installed them, everything was back to normal. So if you ARE using the edgers PPA, I suggest removing those drivers and installing them manually as something to try first.

If you are already manually installing the drivers, I would suggest trying another DE. I know Xubuntu (XFCE) worked for me; I can’t stand Unity, so I have not tried vanilla Ubuntu, sorry. You can either download and install Xubuntu to get XFCE, or simply “sudo apt-get install xfce4” to install it as an alternative to Unity inside your already installed Ubuntu 14.04, then log out and select XFCE instead of Unity at the LightDM login. If it does the same thing in XFCE, then at least we will know it’s not a Unity issue and is probably a configuration problem.

Another thing possibly worth trying is installing a newer Mainline Kernel. 3.16.3 is the current “stable” and Ubuntu’s Mainline kernels can be found over there ->> http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.16.3-utopic/

*Note that if you install the NVIDIA drivers manually, you will more than likely need to reinstall them after the kernel update. This is a bit “advanced” because Ubuntu still compiles mainline kernels with GCC 4.6 and not 4.8, which is the default for 14.04 IIRC. So you would also need to install GCC 4.6 and G++ 4.6 and then change the gcc and g++ symlinks to point to 4.6 instead of 4.8 in order to manually install the NVIDIA drivers successfully. DO NOT do this if what I just said makes no sense to you ;) If it does, then do the following.
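For what it’s worth, the symlink shuffle described above might look something like this; a sketch only, with assumed paths (verify with `which gcc` before touching anything):

```shell
# Install the older toolchain alongside the 14.04 default
sudo apt-get install gcc-4.6 g++-4.6
# Point the compiler symlinks at 4.6 (paths are assumptions; check first)
sudo ln -sf /usr/bin/gcc-4.6 /usr/bin/gcc
sudo ln -sf /usr/bin/g++-4.6 /usr/bin/g++
# ...install the NVIDIA .run driver here...
# Restore the stock 4.8 symlinks afterwards
sudo ln -sf /usr/bin/gcc-4.8 /usr/bin/gcc
sudo ln -sf /usr/bin/g++-4.8 /usr/bin/g++
```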

you would need to grab the following files:
1.) http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.16.3-utopic/linux-headers-3.16.3-031603-generic_3.16.3-031603.201409171435_amd64.deb

2.) http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.16.3-utopic/linux-headers-3.16.3-031603_3.16.3-031603.201409171435_all.deb

3.) http://kernel.ubuntu.com/~kernel-ppa/mainline/v3.16.3-utopic/linux-image-3.16.3-031603-generic_3.16.3-031603.201409171435_amd64.deb

Download them to a folder named “kernel-3.16.3” or something like that (the name doesn’t really matter), then from the command line cd to that dir and type “sudo dpkg -i *.deb”, let it do its thing, and reboot. You will then be running the new kernel and will need to reinstall the NVIDIA drivers.

This is all I can think of off the top of my head; I’ll edit the post if I think of anything else.

EDIT* If you are already manually installing the NVIDIA drivers, then try this first. I’ve looked at your log some more, and it looks like you are also an AMD mobo and CPU user (me too!). Go into your BIOS and look for IOMMU settings. I see “[ 0.000000] Please enable the IOMMU option in the BIOS setup” in your log; IOMMU needs to be enabled. If that doesn’t fix it on its own, I also had to add “iommu=1 iommu=pt” to the kernel boot options. You can simply give this a shot by pressing ‘e’ at the GRUB bootloader and adding it to the boot options: look for the word “quiet” (probably “quiet splash” on *buntu), manually type “iommu=1 iommu=pt” after it (minus the quotes), then actually boot up. IF that works, I can tell you how to make it permanent.
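For the record, making kernel boot options permanent on Ubuntu generally means editing /etc/default/grub and regenerating the config; a sketch, assuming GRUB2 and the stock “quiet splash” default line:

```shell
# Append the IOMMU options to the default kernel command line,
# then regenerate the GRUB config (stock Ubuntu layout assumed)
sudo sed -i 's/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash iommu=1 iommu=pt"/' /etc/default/grub
sudo update-grub
```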

I only needed to do the iommu=1 iommu=pt trick when I added the 3rd monitor, it wasn’t necessary when I was only running one and then two monitors on the same (first) card. Only when I added the 3rd and had to put it on the second card. And that brings me to another thought. I didn’t really game in linux until after I picked up my other 2 monitors and shortly after decided I wanted to rid myself of windows. I had SLI turned on with a single monitor and when I only had 2 monitors, but it wasn’t until I added the 3rd monitor to my second card and forced it to work that the drivers would not load right and I would get corruption if they even worked at all. So the IOMMU trick might be necessary for SLI to work right on AMD hardware, and I just never ran into the problem before since I wasn’t gaming with it. Maybe that 2nd card was never really doing anything for me before without the IOMMU “trick”.


Sorry for the delay… wanted to try some diff tests with Xubuntu and Antergos thinking that maybe ekrboi is right… maybe using his xorg.conf will fix it… see my results below.

@Maxpayne, I agree with ekrboi… don’t use xorg-edgers. VERY unstable. When I was testing SLI out long ago in Xubuntu 13, the same stuff happened to me. It’s best to compile the .run yourself, stick with the stock drivers, or switch to a Linux distro with a rolling release (Arch, Linux Mint Debian: LMDE).

@ekrboi, trust me… I’d LOVE to be proven wrong, but so far no one’s shown me SLI working in Linux other than ID Tech 4 Engine.

Could you please post pictures of you running Unigine Heaven with SLI in Linux with the SLI Visual Indicator on? Same with Unigine Valley? Serious Sam 3? Natural Selection 2? Metro Redux?

2D portions? no problem, SLI visual indicator goes on full… any 3D? it drops to empty. Even in WINE my SLI setup doesn’t achieve anything.

As for “blowing it out of proportions”… why would I game on Linux if on Windows I could harness the second GPU and I can’t in Linux no matter what I’ve tried? I love tinkering around in Linux, it’s open, it’s fun, a challenge and so customizable. It also has some solid Environments which is great. The base foundation of code the Linux Kernel is built upon? Guaranteed yes it’s better than Windows… it’s just stuff like lack of SLI impedes me from wanting to fully switch over.

I looked at your xorg.conf and yeah it’s pretty much the same as mine… I’ve tried Arch 64bit with 3.16.2 and NVIDIA driver 340.32… still nothing. Xubuntu 32bit and 64bit with 3.13.x or 3.16.2 and nvidia drivers 331.x and 340.x… still no luck with SLI (I agree, Unity was great in Ubuntu 12.04… but… Unity’s heading in a direction I don’t like much like Microsoft Windows).

Arguably, there haven’t been any SLI changes in the last few NVIDIA drivers for Linux, only fixes for detection problems with Multi-GPU cards like the GTX 690 or Titan Z… so running ye ol’ 331.x drivers should yield SLI performance about the same as 340.x, right?

So in the SteamOS thread I created, I used Xubuntu’s 3.13.0-32 kernel with the NVIDIA 331.89 drivers… below is Antergos 64-bit, kernel 3.16.2, with NVIDIA 343.13 BETA, modifying my xorg.conf to match yours with the Xinerama settings and SLI = On instead of Auto… I thought since “Auto” wasn’t enabling SLI, that was the cause of my problems, so I tried “On”… nope… SLI is still disabled…

So? Then I tried without the Xinerama extra parts except for Xinerama=0 and Stereo=0… whilst keeping CoolBits=4… I get SLI, but half my regular framerate again… plus… graphics corruption on my Cinnamon desktop environment. :P


Yes, I use my cellphone lol… HTC One M7 ftw! lol… the main reason being I tried tinkering with “glc” and others, but I’ve had no luck… most of them haven’t even been updated in a year or two… ekrboi, do you have a recommendation for an OpenGL fullscreen screenshot program? That’s another problem driving me nuts lol… stuff like the Unigine benchmarks disables universal hotkeys, so I can’t use xfce-screenshot and others like it. :\

So… that means with your config, even set to Auto, odds are SLI = Off… hence your decent performance. Can you look in your “NVIDIA X Server Settings” and check, for example, GPU 0… under “X Screens” it should say something like “Screen 0 (SLI)” if SLI is enabled… if there’s no “(SLI)”, SLI is disabled.

I think the only way currently to do multi-monitor with SLI is via --sli=mosaic… ftp://download.nvidia.com/XFree86/Linux-x86/340.32/README/sli.html

People testing it on GTX 780’s with triple monitors:

So… unless you’re the exception to the SLI Linux limitations, I don’t see how it’s possible you’re getting SLI in non-ID Tech 4 Engine-based games with multi-monitor.

Plus… after more Googling I even found this from 2012:


That’s Plagman… the same dude from the SteamOS thread, one of the heads of NVIDIA’s Linux division. If anyone’s going to potentially spearhead this fix, hopefully it will be him… but as far as when? From what he’s said in that SteamOS thread, it won’t be for quite a while… only when SteamOS starts expanding beyond basic Linux game support and streaming games from Windows, maybe then will we see a revamp of SLI in Linux.

Until that day? I guess the best option for me would be to sell both my GTX 680’s for hopefully $300-ish each, then either take a 10% performance hit and buy a GTX 780 Ti, which would run well under Linux, or wait and see if the GTX 980s truly surpass the 780 Ti in more than just overclocking ability… either way? Meh, a pain in my ass when all I need is a fixed driver.

As for the argument of “SLI Profiles” in Linux? Heck, even without a proper SLI profile in Windows 7 I can still achieve a 30% performance gain by just creating a profile and selecting “AFR2”. I only tend to get the main SLI performance boost, though, IF there’s a proper profile… Linux is the opposite… without a profile it’s a 50% performance hit… that, or you use “GLDoom3” in an application profile regardless of the game engine and achieve the speed of a single GPU… the second one sits there idle.

What program in Linux are you using to monitor your two GPUs so that you can see their load? May I please have the name? I haven’t actually looked into any, but I’d be interested in seeing what it does too.

It’s late and I am about to head off to bed. I’ll gather some info and get some screenshots tomorrow. I swear I’m not making this up lol. 3 monitors and both cards working!

The OSD that I use is called GLXOSD, and the source can be found on GitHub (nickguletskii/GLXOSD). GLXOSD is an extensible on-screen display (OSD)/overlay for OpenGL applications running on Linux with X11 which aims to provide similar functionality to the MSI Afterburner/RivaTuner OSD. It can show FPS, frame timings, temperatures and more in OpenGL games and applications. It can also be used to benchmark games, much like voglperf.

I believe he has a PPA set up for the *buntus. For Arch there is a PKGBUILD in the AUR for it, but it only compiles the 64-bit versions of the plugins, so it only works with 64-bit games. I ended up manually gathering the required 32-bit libs (for which there are no AUR packages) and compiled both the 64-bit and 32-bit plugins myself.

More tomorrow.

Sorry to disappoint you, ekrboi, but the NVIDIA drivers I’m using were downloaded directly from NVIDIA’s website.

I could try another desktop environment though, not XFCE cuz I hate it, but maybe KDE, or GNOME if there is a GNOME Classic mode for Ubuntu.

Are you sure compiling a different kernel for my system might make a difference SLI-wise?
After my Fedora laptop updated to 3.16, it was finally possible to change brightness normally and delete the silly cron job I had for changing the file with the brightness value, but something tells me SLI compatibility is way low on the kernel developers’ priority list.

Well, no PPA drivers, so that’s out. I don’t really think a kernel update would fix it; the kernel devs surely don’t care about SLI. I was thinking more along the lines of something else possibly conflicting that could be updated and inadvertently fixed in a newer kernel. Crazier things have happened to me with Linux. Plus I just figured it was worth a try; it’s easy to remove and go back to the Ubuntu 3.13 kernel if not, but like I said, it takes some system tinkering to get the NVIDIA drivers to compile with an Ubuntu mainline kernel on 14.04.

The PPA drivers being bad (for me anyway) and the IOMMU thing are the only issues I have run into with my cards and Linux.

*EDIT: This whole thing got me to actually look into the IOMMU options that I used. I had just found them in a post about a similar issue and they worked, so I have been using them. The link below outlines the AMD-specific boot options. I was able to boot my system, and everything is working as far as I can tell so far with only iommu=pt added to the boot options.



I have not forgotten, been a long day so it will probably be tomorrow.

Still buggy using the new driver(343.22) with or without IOMMU enabled :(

This only leaves distro-specific issues like the desktop environment or the kernel. Maybe I’m gonna install Arch on another partition and see if it works, but it wouldn’t make much sense for NVIDIA to make SLI work on most distros and leave out the most popular distro among Steam Linux users.


I’ve had good luck with Antergos 64-bit. It’s based off of Arch and uses a GUI installer with a few desktop environments to choose from, including Cinnamon and MATE I believe.


Otherwise? Manjaro is another popular distro based off of Arch.


You can always install Ubuntu with the MATE desktop environment… it’s a fork of GNOME 2.

Give it a go! It could be the Ubuntu experience you’ve been waiting for. :)

Yea, I have to eat some crow on this one =( ThE_MarD might be on to something after all. I’ve done more tests, and it seems my GPU utilization directly correlates with what resolution I am using for a game. If I run a game at 1920x1080 on my middle monitor only, then not much of my second GPU gets used. It’s only when running things at 5760x1080 that both GPUs seem to blaze along, which I assume is just the way “Base Mosaic” works. I can’t be bothered to go disconnecting monitors and crap to do further tests, but it seems something is not right.

For instance, CS:GO just released for Linux. If I run it at 1920x1080 on my middle monitor only and pull up nvidia-settings to look at GPU utilization, it shows the first card at 60-70% utilization and the second GPU at around 15%. If I scale to 5760x1080, both GPUs show 40%-ish, but the game seems to default to low-res textures at that resolution no matter my settings; otherwise it would be higher.

But all of that is neither here nor there; your problem is not SLI actually working in games, it’s SLI failing to work correctly at all. If the IOMMU thing didn’t fix it, I would honestly point the finger at Ubuntu’s Unity as my next guess.

For some reason neither Antergos nor Manjaro booted from the flash drive, and the loadkeys command didn’t work on Arch. Considering the prospect of having to guess where the keys were for the whole installation process, I opted to give up on the idea of installing Arch for now; maybe I’ll print a US keyboard layout on a sheet of paper and try installing it over the weekend.

For now I’ve installed GNOME on Ubuntu, and the problem is mitigated for some reason, but not nonexistent: sometimes the cursor starts flashing, or an icon or a piece of text does, occasionally a window; in fact, the text moved on its own and flashed several times while I was writing this post.

The same occurred occasionally in Civ 5 and a lot in CS:GO, but I haven’t had the time to test with and without SLI to figure out how much is NVIDIA’s fault and how much is Valve’s, since there were plenty of other bugs as well.

As far as SLI usefulness is concerned, the memory use appears to be 50-50, but the GPU utilization is considerably lower on the second card, especially in CS:GO.
Generally, the second card ran at 1/2-2/3 of the first card in CS:GO.
For Civ 5, the second card’s utilization was much better, sometimes even passing the first card by a couple of percentage points, but it was generally lower.
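If anyone wants to check per-GPU load from a terminal rather than the nvidia-settings GUI, a sketch (the GPUUtilization attribute is listed in the nvidia-settings documentation; requires a running X session with the NVIDIA driver):

```shell
# Query utilization for each GPU from the command line
nvidia-settings -q "[gpu:0]/GPUUtilization"
nvidia-settings -q "[gpu:1]/GPUUtilization"
```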


No worries mang… I don’t run multi-monitor, so I can’t even test that… but it would make sense if each card was rendering its own screen… but trying to get multiple cards to render in either alternate frame rendering or split frame rendering on a single screen? It just doesn’t work… so in my case, maximum SLI performance on a single screen isn’t working at all.

As I stated in my earlier post, I’ve tried Antergos, Linux Mint, Ubuntu, Xubuntu and even Fedora… all of them suffer the same SLI performance degradation fate. My guess is it’s more of an NVIDIA driver limitation than a Linux kernel issue, and it probably isn’t affected much by the desktop environment… the only thing the DE seems to affect in regards to the game is the framerate, via how much CPU and memory the DE uses. Running low-end ones like XFCE and LXDE seems to give a 2 to 7 frames per second higher average due to the reduced cost of running on your system, compared to Gnome 3 (the worst) followed by KDE… Cinnamon seems to be in the middle of the two groups… never really tried Open Box though so I don’t know where it fits lol… my guess? Probably between Cinnamon and Gnome 3 for sapping system resources.

So yeah… if using a single monitor? Disable SLI until there comes a time when the Linux driver is better able to compete for the multi-GPU gaming performance crown… as it sits right now, I think both AMD’s CrossFire and NVIDIA’s SLI don’t work properly on a single screen in Linux… meh, I’ll keep my eyes out for future Linux drivers that might resolve this issue for us all. :)

“never really tried Open Box though so I don’t know where it fits lol… my guess? Probably between Cinnamon and Gnome 3 for sapping system resources.”

I use XFCE for my normal desktop and then shut it down and xinit Openbox for gaming. Openbox (OB) is damn near as simple as they come. It’s NOT a DE; it’s a WM ONLY. When you boot OB by itself for the first time, you may think something is broken/wrong lol. Nope, that’s just OB ;) You will be looking at a black/dark greyish screen with NOTHING there. The only thing it has is that you can right-click anywhere on the desktop to get a “start menu”, which can be edited with OB’s menu.xml, and you can define OB’s behavior/keyboard shortcuts with its rc.xml… that’s basically it. It’s great for a gaming space because it uses practically no resources.


REALLY? Hmm… I’ll have to tinker around with OB in the future then, thanks! I’m not a fan of that right-click OB menu though; maybe I’m too much of a dinosaur haha, I love my dedicated menu taskbar button… I’m sure there are plugins to set that up though heh, but that will be a project for a later date… for now? Tbh I’m a little too bummed out with SLI on Linux to try to get it running 100% the way I’d like… my CA0132 sound card getting all screwed up in 64-bit, combined with poor SLI performance on my single monitor, makes me just say “screw it” and stick with Windows 7 64-bit… I do love Win 7 even more than good ol’ Win XP, but Win 7 might be the last version of Windows I purchase, depending on whether Win 9 is any good or does anything to optimize gaming / performance / the file system. NTFS is good… but it’s not ext4 good.

As for OB, it’s like a blank slate. You can turn it into what you want. Once you turn it into a “desktop”, it kinda defeats its purpose for me. If you want OB to auto-start a 3rd-party taskbar, Cairo dock, etc., you can do that, but you basically kill its “simplicity”. That’s why I keep it as simple as possible. XFCE is for my main “pretty” desktop, which uses 800 MB of RAM sitting idle. OB is for gaming; it does “nothing” and total RAM usage is under 400 MB idle.