2 NVIDIA graphics cards of different generations

Hello,

I presently have an NVIDIA GeForce 8800 GT driving 2 monitors. I plan to install a second video card, a GeForce GTX 1050, in order to connect a third monitor dedicated to high-resolution photo editing.

My problem is that the GeForce 8800 GT uses the 340.104 driver while the GTX 1050 uses the 384.111 driver.
How can these two cards live together?

Thank you for any insight.

Regards.

François Patte

While solutions might exist, they are very difficult to implement or have undesired side effects. Nothing works out of the box.
Just buy some adapters for your monitors, hook them up to the 1050 and remove the 8800.

Thank you for this answer.

Do you mean that I can plug three monitors into this card with different resolutions: two monitors at 1280x1024 and the third at 4K?

Where can I find some documentation on how to do this?

Thank you.

François Patte

Pascal-based consumer video cards should support up to 4 independent outputs out of the box, so there is no need for your 8800 GT any longer.

As for setting them all up, you could simply run nvidia-settings and make it generate the config you need.
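
If it helps, a minimal sketch of that workflow, assuming the standard nvidia-xconfig and nvidia-settings tools that ship with the driver package, is:

sudo nvidia-xconfig    # writes a basic /etc/X11/xorg.conf for the detected NVIDIA GPU (needs root)
nvidia-settings        # arrange the monitors under "X Server Display Configuration", then use "Save to X Configuration File"

The saved xorg.conf then keeps the three-monitor layout across reboots.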

Consult the documentation of the card's vendor/manufacturer: different vendors ship different connectors and connector counts. Furthermore, consider upgrading to a 1050 Ti if you plan to run 4K plus additional monitors. Experience suggests you need around 350 MB of video memory per HD monitor; a single 4K monitor is roughly four times that, about 1.4 GB, and adding two other monitors brings you close to the 2 GB of a standard 1050. If you at some point replace your old monitors with higher-resolution ones, you would run out of video memory and graphics would slow down.
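
If you want to check the actual video-memory headroom once everything is connected, one way (assuming the nvidia-smi utility that ships with the driver) is:

nvidia-smi --query-gpu=memory.total,memory.used --format=csv    # prints total and currently used VRAM

If memory.used creeps close to memory.total, you are in the situation described above, and the 4 GB of a 1050 Ti would give you more room.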