Importance sampling example: increasing batch size to 50,000 causes out-of-memory error

For the annular_ring_importance sampling example, I changed the interior batch_size to 50,000 and it returned an error allocating GPU memory (RTX 3080 laptop, 16 GB). I also tried on a V100 32 GB GPU and it also showed an out-of-memory warning. The Python file is attached.

Hello @nvhai, a batch size of 50,000 will not run with that amount of memory. For the importance-sampled constraint you are computing the Navier-Stokes momentum and continuity equations. These require several second-order gradient calls, each of which is very memory intensive. If you would like to run with larger batch sizes, you can use multiple GPUs. Note that the importance sampling functionality has been refactored in the 22.03 release and is now much simpler; you can find it in the ldc example.
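To see why these constraints are so memory hungry, here is a minimal, self-contained sketch (not Modulus code) of a PDE-style residual that needs second-order derivatives of a small network, written in JAX. The network `u` and the batch of interior points are illustrative stand-ins; the key point is that every second-order call (here the Hessian for a Laplacian) must keep the first-order graph alive, and this cost is multiplied across every point in the batch:

```python
import jax
import jax.numpy as jnp

# Hypothetical tiny scalar field u(x, y) standing in for one velocity component.
def u(params, xy):
    w1, b1, w2, b2 = params
    h = jnp.tanh(xy @ w1 + b1)
    return (h @ w2 + b2).squeeze()

def laplacian(params, xy):
    # Second-order derivatives d2u/dx2 + d2u/dy2 via the Hessian trace.
    # Computing this requires differentiating through the first-order
    # gradient graph, which is what makes large batches so expensive.
    hess = jax.hessian(lambda p: u(params, p))(xy)
    return jnp.trace(hess)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (jax.random.normal(k1, (2, 16)), jnp.zeros(16),
          jax.random.normal(k2, (16, 1)), jnp.zeros(1))

# A batch of interior sample points; memory for the second-order graph
# scales roughly linearly with this batch dimension.
batch = jax.random.normal(key, (8, 2))
res = jax.vmap(lambda p: laplacian(params, p))(batch)
print(res.shape)  # one residual per interior point
```

A Navier-Stokes momentum residual needs several such second-order terms per output component, so at batch_size = 50,000 the intermediate graphs easily exhaust a 16 GB (or even 32 GB) card; splitting the batch across multiple GPUs reduces the per-device footprint.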


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.