Mixing A100 and RTX3090 in the same server


Hi all,

Currently, I have a couple of 32GB V100s in server A and a couple of A100s in server B, and both servers have a few empty PCIe slots. I am considering adding some RTX 3090 or 4090 cards to fill those slots.

I’m wondering if these combinations (V100+RTX, A100+RTX) will cause any issues when running deep-learning workloads. Do these cards work with the same NVIDIA driver and CUDA version?
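For reference, after installing the extra cards I was planning to run a quick sanity check like the sketch below (assuming PyTorch with CUDA support is installed) to confirm that all GPUs show up under the same driver and CUDA runtime, and to see each card's compute capability:

```python
# Minimal check that every GPU is visible under one driver / CUDA stack.
# Assumes a recent PyTorch build with CUDA support.
import torch

print("CUDA available:", torch.cuda.is_available())
print("CUDA runtime used by PyTorch:", torch.version.cuda)

for i in range(torch.cuda.device_count()):
    name = torch.cuda.get_device_name(i)
    major, minor = torch.cuda.get_device_capability(i)
    mem_gb = torch.cuda.get_device_properties(i).total_memory / 1024**3
    print(f"GPU {i}: {name}, compute capability {major}.{minor}, {mem_gb:.1f} GiB")
```

If every card is listed there (and in `nvidia-smi`), I assume the mixed setup is at least recognized, even if the cards differ in architecture and memory size.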

Thanks