Resource exhausted: OOM when allocating tensor with shape[128,64,27,37]

I’m encountering this error:

    Resource exhausted: OOM when allocating tensor with shape[128,64,27,37]

I assumed I had run out of RAM, so I added 32 GB of swap space, but the error didn’t change. Does swap space not help with this error?

Some allocations require physical RAM and cannot be backed by virtual memory. GPU allocations in particular normally require physical memory: this error refers to device (GPU) memory, which cannot be paged out to swap, so adding swap space does not make the error go away. The best you can do is reduce the required memory footprint (for example, by lowering the batch size) or let non-CUDA applications be swapped out to free up physical RAM. Since you have already added swap, it is likely that pushing those non-CUDA apps into swap still won’t free enough memory.
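As a rough sanity check, you can estimate the footprint of a single tensor from the shape in the error message. This sketch assumes float32 elements (4 bytes each); note that one such tensor is small, and the OOM comes from many activations and parameters accumulating, but the same arithmetic shows why halving the first (batch) dimension halves the cost:

```python
from functools import reduce
import operator

# Shape from the error message: [batch, channels, height, width] (assumed layout)
shape = (128, 64, 27, 37)
bytes_per_element = 4  # float32 (assumption)

n_elements = reduce(operator.mul, shape, 1)
total_bytes = n_elements * bytes_per_element

print(n_elements)                    # 8183808 elements
print(round(total_bytes / 2**20, 1))  # ~31.2 MiB for this one tensor

# Halving the batch dimension halves the footprint of every batch-sized tensor.
half_batch = (64,) + shape[1:]
print(reduce(operator.mul, half_batch, 1) * bytes_per_element / total_bytes)  # 0.5
```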