Setting up two different drivers for two different graphics cards

I am adding a GTX 960 to a system running Linux Mint 17.3 that already has a GeForce 210. I’d like to use the 210 for my displays and the GTX 960 for CUDA computing (deep learning, to be specific, and perhaps some VR in the future). Since the Nouveau open-source drivers don’t play nice with the NVIDIA drivers, I’m trying to run both cards on NVIDIA drivers (340 for the 210; 352/361 for the GTX 960). However, when I set up the NVIDIA driver for the 210, the GTX reverts to the open-source Nouveau driver (not allowing me to run CUDA), and when I set up the NVIDIA driver for the GTX, the 210 reverts to Nouveau (which messes up my display, since Nouveau and the NVIDIA driver don’t get along). I’ve been using the built-in Driver Manager to try to get both NVIDIA drivers enabled at the same time. How can I get both nvidia-340 and nvidia-352/361 set up on my system, such that I can use my GeForce 210 for X while using my GTX 960 for CUDA processing?

I suppose, if I had to, I could run both my display and CUDA off the GTX 960, but I would prefer not to. Thanks.

Best I can tell, the GeForce 210 is a device with compute capability 1.2. Devices with compute capability < 2.0 are not supported by current NVIDIA display drivers and current NVIDIA CUDA drivers. You cannot use two different NVIDIA drivers on the same machine at the same time. So your only options seem to be:

(1) Use the latest driver package that still supports the GeForce 210 for both GPUs. My memory is hazy on whether the 340 driver family supports the GTX 960, but I believe the answer is “yes”. Note that if you stay with the 340 driver family, you will be restricted to older versions of CUDA.
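For option (1) on Mint 17.x, the legacy branch is available as a regular repository package. A minimal sketch, assuming the package is named `nvidia-340` (verify the exact name with `apt-cache search nvidia` or the Driver Manager before running):

```shell
# Sketch: put both GPUs on the same legacy driver branch.
# Assumption: the Mint/Ubuntu repository package is called "nvidia-340".
install_legacy_340() {
    sudo apt-get update
    sudo apt-get install -y nvidia-340
    # Reboot afterwards, then run `nvidia-smi` to confirm both GPUs
    # appear under the same 340.xx driver.
}
```

If the 340 branch does in fact support the GTX 960, `nvidia-smi` should then list the GeForce 210 and the GTX 960 under one driver.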

(2) Get a new card to drive the display; you could go as low as around $50 for a GeForce GT 720. Wait … I now see there is even a GeForce GT 710 for about $40.

(3) Use the GTX 960 for both compute and display.

Hi, can I use a GTX Titan X (5.1) and a GTX 1080 Ti (6.1) together? If yes, which driver should I install?

The driver for the GTX 1080 Ti will work for the GTX Titan X. Just select a driver for the GTX 1080 Ti from

Thank you txbob

@txbob here’s a more complex version of that question, but maybe workable:

  • I am a Titan XP / Titan V owner and Nvidia GPU Cloud (NGC) member.
  • I am planning to use NGC containers with my Titan cards for Deep Learning, Scientific Computing, and general HPC.
  • My planned setup was to have a GTX 1070Ti or 1080Ti as the display card for my workstation and have NGC containers running the Titan cards as headless compute engines.
  • My systems all meet the rather specific requirements of running Ubuntu Xenial LTS.
  • My understanding is that in both the Windows and Linux worlds, Titan cards support a headless compute driver that is disabled in the hardware/firmware of non-Titan cards; presumably this is what the containers use.
  • SO, my question: if the Titan cards and their special drivers are isolated in NGC containers, just sending data out of the container as output, can my system outside of the container use the regular video driver with a non-Titan card (e.g. GTX 1070 Ti) connected to a physical display?
  • If not, what's the recommended setup?


  - B.

I can’t speak to the interaction between NGC containers and the NVIDIA drivers, but unlike the Windows world, where there exist two fundamentally different kinds of drivers (the WDDM driver, which treats the GPU as a “VGA”, and the TCC driver, which treats the GPU as a “3D controller”), there is just one driver in the Linux world. You can use that driver with X (that is, with a GUI) or without.

The reason for this discrepancy is that the WDDM driver architecture on Windows, which gives maximum control over the GPU to the operating system (good for Microsoft), is a source of significant performance artifacts for compute applications (bad for NVIDIA). The TCC driver is a workaround for that, giving more control over the GPU to NVIDIA’s software and resulting in better performance with fewer artifacts. There are (so far) no equivalent performance issues on Linux, so the only annoyance there is that you have to work around the artificial hurdles the Linux folks put in the way of NVIDIA’s proprietary driver.

You don’t load any drivers in NGC containers. nvidia-docker takes care of pulling the necessary (user-mode) driver components from outside the container, into the container, at launch.

Just choose an appropriate driver for the Titan V (r387, currently); it will work with the GTX 1080 Ti.

(or alternatively just do an install of CUDA 9.1. The bundled driver with it will work with your Titan V and 1080Ti)

Configure X to use only the 1080 Ti:
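A minimal `xorg.conf` sketch for that step. The `BusID` below is an assumption; get the real value from `lspci | grep -i nvidia` (and note that X expects it in decimal):

```
# /etc/X11/xorg.conf (sketch): pin X to the 1080 Ti only, leaving the
# Titan V driver-loaded but headless for compute.
Section "Device"
    Identifier "DisplayGPU"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"    # hypothetical slot of the GTX 1080 Ti
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "DisplayGPU"
EndSection
```

With no Device section referencing the Titan V, X never touches it, but the single NVIDIA driver still exposes it to CUDA.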

The remainder of NGC setup for this case can be found here:

If you have NGC-specific questions, I suggest posting those in the NGC area/forum:


Thanks @njuffa, that’s good info.
Have you read the NGC container docs? Anyone can sign up for an account, AFAIK.
They definitely only refer to drivers by release number, and in the case of Titan they are specific to Ubuntu 16.04 LTS. They carefully enumerate the architectures/products that are supported, omitting anything like the GTX 10 series, and older documents conspicuously make a similar omission of Titan, though the newly released docs include Titan, both Pascal and Volta, in those lists.

I was definitely just making assumptions about there being the two types of drivers on both platforms - at the time I read about them in the docs I was only reading the Windows version.

So maybe there’s hope: if the discrimination between platforms is done at the container/API level, and both host and container run the same driver, then there’s no issue.

Well, anyway, while I was writing my Titan V arrived. I suppose an empirical approach is in order. But it would be good to hear from any NV mods or readers who have tried this recently.


Thanks @txbob, that is definitely a good and comprehensive answer. I daresay it may move a few extra Titan units. Thanks for the link to the NGC board; I was searching by question and it brought me here.
I appreciate the quick turnaround, too - very helpful.

  • B.

… and if you are running Ubuntu Xenial LTS with CUDA installed via apt, you will have found that this afternoon / evening the package manager is prompting you to install 9.1, just in time for your Titan V to arrive.

Such timing!

I’m trying to install a Quadro 4000 and a P104-100 in the same computer. Every time I install driver package version 390.65 for the P104-100, the driver for the Quadro stops working. If I reinstall the Quadro driver with package 372.95, the P104-100 stops working.
What can I do about that?

It’s going to be difficult to get those cards to work together in the same system. The Quadro 4000 is a Fermi device, and support for Fermi devices is disappearing from the latest CUDA toolkits (9.x) as well as some recent driver branches.

OTOH the P104-100 was only released at the end of 2017, so it will require a fairly recent driver.

The simplest solution would be to replace the Q4000 with a newer GPU.

Sorry to bother you guys. My workstation has a P5000 card, and it also has one more slot for another GPU card. I just want to know: if I use a P5000 and a GP100 in the same workstation on Ubuntu 16.04, will this work?

You should be able to do that.

Okay. A super big thank you to txbob.

I’m trying to set up an eGPU alongside my dGPU, and they don’t work at the same time.
I’ve got:
Windows 10
i7-3537U with HD 4000 iGPU
dGPU: GT 720M
eGPU: GTX 1050 Ti

They all work if installed solo.
Intel HD works 100% of the time.
I can install a driver solo, from Windows Update or by getting the correct one from NVIDIA, and it will work (but only if I install just one of the drivers).

If I install one of the NVIDIA drivers after installing the other NVIDIA driver, I get error 43 on the card whose driver I installed first.

I’ve tried DDU in safe mode, different drivers, and even INF-modded drivers, but I always end up at the same error 43.
Sometimes, installing the GT 720M driver after installing the 1050 Ti one puts Windows into a boot loop, and I have to enter safe mode and disable and/or uninstall one of the drivers so I can boot normally again.

Just to clarify, I don’t need the two NVIDIA cards to work at the same time, and I managed to make them work one at a time by switching the bus controller for the 720M on/off in Device Manager (the 720M driver must be installed before the 1050 Ti, and the bus must be off before installing the 1050 Ti, for this to work).

If the drivers work solo, why can’t they work when both cards are connected to the device? I would appreciate being able to switch between video cards without all this wizardry.

I know this works well on most 32-bit Windows, Windows 7, and Linux distributions.

The GT 720M is a Fermi device:

That device is no longer supported by recent drivers or recent CUDA toolkits. (on any operating system)

Installing a newer driver will result in error 43 on the Fermi device. (there would be an error on any OS)

Installing an old-enough driver that understands the Fermi device may not understand the newer Pascal device (1050ti), in which case the Pascal device would show an error 43. (there would be an error on any OS)

You cannot have 2 NVIDIA GPU drivers loaded and active at the same time, in any operating system.
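On Linux, one way to see this is that the kernel exposes at most one NVIDIA driver at a time, for all NVIDIA GPUs in the box. A small sketch, using only standard `/proc` paths:

```shell
#!/bin/sh
# Report which NVIDIA kernel driver (if any) is currently loaded.
# Whatever it prints applies to every NVIDIA GPU in the machine,
# because only one NVIDIA kernel driver can be active at a time.
nvidia_driver_status() {
    if [ -r /proc/driver/nvidia/version ]; then
        head -n 1 /proc/driver/nvidia/version    # proprietary driver loaded
    elif grep -q '^nouveau ' /proc/modules 2>/dev/null; then
        echo "nouveau (open-source) driver loaded"
    else
        echo "no NVIDIA kernel driver loaded"
    fi
}

nvidia_driver_status
```

On Windows the equivalent constraint shows up as error 43 on whichever card the single installed driver does not understand.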

If this were my system and I were desperate for this to work, I would search for a new-enough driver that recognized the 1050ti, but not so new that Fermi support had been dropped. The most recent drivers in the R384 or possibly R390 driver branch might fit that description. You would have to do some searching and testing. You would basically need to search for such a driver (R384 or R390, no later) that was published by NVIDIA after the 1050ti was released to the market.

However such drivers will not support the latest CUDA toolkits (e.g. 10.x)

And I’m not guaranteeing success, just outlining what I would try. A better option of course is to ditch that GT720m and get a newer base system, for the most flexibility and latest support.

And windows update does not know how to negotiate this minefield, and if you allow it to be active it may disrupt your best efforts. And yes, when backtracking drivers it is usually best to perform a full uninstall or clean sweep with a utility like ddu.

I suppose another option might be to just install a new driver, let it work when the egpu is plugged in, and when not, acknowledge that the GT720m is in an error state and use the igpu instead. There might even be a setting in your system BIOS to disable the dgpu.

Thank you for your reply; I tested the driver range you linked to me and they didn’t work.
I’ve now successfully edited a 720M driver and added the 1050 Ti line into it. I then installed the driver on the two devices, and the same thing still happens: error 43 on the first-installed device.
I’ve got a little .bat file that removes error 43, but only on the device my main screen is connected to (don’t know if that info tells you anything useful).
Tomorrow I will edit one driver from the range you suggested and give it another try… thanks once again for the info and effort

My suggestion would be to remove any drivers you have installed on your system now, and then install the driver associated with this CUDA toolkit installer:

That should install a 385.xx driver, and it should be compatible with both the GT 720M and the 1050ti.

(However you won’t be able to use CUDA 9.0 with GT 720M).
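A sketch of that clean-install sequence on a Debian/Ubuntu-style system, assuming a runfile-style CUDA installer (the path is whatever file you downloaded; the `--silent --driver` flags tell the runfile to install only the bundled driver, not the toolkit):

```shell
# Sketch: remove any packaged NVIDIA drivers, then install only the
# driver bundled with a CUDA runfile installer. Run from a text
# console with the display manager stopped.
clean_install_bundled_driver() {
    # $1 = path to the downloaded CUDA runfile installer
    sudo apt-get purge -y 'nvidia-*'    # clear out old packaged drivers
    sudo sh "$1" --silent --driver      # install the bundled driver only
}
# usage (not executed here): clean_install_bundled_driver ./cuda_installer.run
```

On Windows the analogous step is a DDU sweep followed by running the standalone driver installer that matches the CUDA version.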

My next guess would be to try the same process with this:

CUDA 8 will work with the GT720M and should also work with the 1050ti (with a suitable driver).

If that doesn’t work I’m out of ideas. I think editing driver INFs is not going to be a useful effort, but of course you’re welcome to try whatever you wish.