Query regarding acceptable GPU for Modulus installation

Hi all,

I am a bit new to Modulus Sym. Is it possible to install the software on an A2000 or A3000 GPU? It is very crucial for my project.


It looks like both of those cards come with 12 GB of VRAM. You'll likely run into VRAM allocation issues, but it may be enough to get your feet wet and evaluate whether the tool works for you. I'm running the aneurysm sample (without a validator, to see what happens) right now and it's using 11.4 GB. I'm on an RTX 3090 with 24 GB of VRAM and still manage to run out myself; when I do, I reduce the number of sample points.


Hi @kanadsen01

Presently we do not officially test on A2000 or A3000 GPUs, but given the right CUDA version and driver, Modulus should work fine. As @npstrike correctly suggested, the struggle will typically be VRAM size (this is true of any deep learning). Be aware that many of the provided examples were developed on GPUs with much more VRAM, so you may need to adjust some of them.

Is this a deal breaker? No, not really. You can decrease the size of your model or your batch size to fit things onto the device, though this will impact convergence slightly.
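To make the batch-size adjustment concrete, here is a minimal sketch of the kind of back-off you might do by hand: halve the batch until its estimated footprint fits in a VRAM budget. The per-sample cost and the 2 GB reserve are hypothetical numbers for illustration, not measurements from Modulus.

```python
# Hedged sketch: pick the largest batch size (by halving) whose estimated
# activation footprint fits in a VRAM budget. All sizes are hypothetical.

def fit_batch_size(requested: int, bytes_per_sample: int, vram_budget_bytes: int) -> int:
    """Halve the requested batch size until its estimated footprint fits."""
    batch = requested
    while batch > 1 and batch * bytes_per_sample > vram_budget_bytes:
        batch //= 2
    return batch

# Example: 4096 sample points at ~4 MB each on a 12 GB card,
# keeping ~2 GB aside for weights, gradients, and optimizer state.
budget = (12 - 2) * 1024**3
print(fit_batch_size(4096, 4 * 1024**2, budget))  # 2048
```

In practice you would apply the reduced number to whatever batch-size setting your example exposes; the halving just keeps the search quick and the resulting sizes friendly to the GPU.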

A good rule of thumb: if PyTorch works, most of Modulus should work (at least on a single GPU).
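Following that rule of thumb, a quick PyTorch "smoke test" like the sketch below is a cheap way to sanity-check the card before installing Modulus: it reports the device name and total VRAM and runs one small matrix multiply on the GPU. It degrades gracefully if torch or a CUDA device is missing.

```python
# Hedged sketch: a minimal PyTorch smoke test for the GPU. If this runs,
# single-GPU Modulus training on the same card is at least plausible.
try:
    import torch
except ImportError:
    torch = None

def gpu_smoke_test() -> str:
    if torch is None:
        return "torch not installed"
    if not torch.cuda.is_available():
        return "no CUDA device visible"
    props = torch.cuda.get_device_properties(0)
    # Allocate a small tensor on the GPU and do one matmul to exercise it.
    x = torch.randn(1024, 1024, device="cuda")
    (x @ x).sum().item()
    gib = props.total_memory / 1024**3
    return f"{props.name}: {gib:.1f} GiB total VRAM"

print(gpu_smoke_test())
```

On a 12 GB A2000/A3000 this should report roughly 12 GiB; if the matmul itself fails, Modulus is unlikely to fare better.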


Thanks a lot for the info.