Zram swap is behaving strangely

Hi guys,

I was running some experiments on an NVIDIA Jetson Xavier to find out how swapping affects performance (latency).

To this end, I wrote a GPU matrix multiplication application with a large matrix size to put pressure on memory.
When the Xavier hits its memory limit (32 GB), it does not use the swap space and simply kills the matrix multiplication application.
However, with other applications, such as PyTorch-based DNNs, the Xavier does use the swap space when I run multiple DNNs and hit the memory limit.
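
For reference, here is a minimal sketch of the kind of allocation the matrix multiplication app makes. The matrix size and the use of managed memory are illustrative assumptions, not the exact code:

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    const size_t N = 40000;                      // hypothetical size, large enough to pressure 32 GB
    const size_t bytes = N * N * sizeof(float);  // ~6.4 GB per matrix

    float *mats[3] = {nullptr, nullptr, nullptr};  // A, B, C
    for (int i = 0; i < 3; ++i) {
        cudaError_t err = cudaMallocManaged(&mats[i], bytes);
        if (err != cudaSuccess) {
            std::printf("allocation %d failed: %s\n", i, cudaGetErrorString(err));
            return 1;
        }
    }
    // ... launch the matrix multiplication kernel on mats[0], mats[1] -> mats[2] ...
    for (int i = 0; i < 3; ++i) cudaFree(mats[i]);
    return 0;
}
```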

I wonder what made this difference.

Thanks

Hi,

This is because the GPU cannot access swap memory.
Other frameworks, like PyTorch, may have CPU-based implementations (and CPU-side tensor allocations) that can utilize the swap buffer.
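
As a rough illustration (the buffer size below is arbitrary): a plain CPU allocation can be paged out to zram under memory pressure, but a CUDA allocation has to stay resident in the shared physical memory, so when that runs out you see an allocation failure or an OOM kill instead of swapping.

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <cuda_runtime.h>

int main() {
    const size_t bytes = 8UL << 30;  // 8 GB, arbitrary size for illustration

    // Host buffer: ordinary anonymous memory, so the kernel can compress these
    // pages into zram when the system comes under memory pressure.
    char *host_buf = static_cast<char *>(std::malloc(bytes));
    if (host_buf) std::memset(host_buf, 1, bytes);

    // Device buffer: has to stay resident in physical RAM; it cannot be backed
    // by swap, so exhausting memory results in a failed allocation (or an OOM
    // kill) rather than swapping.
    char *dev_buf = nullptr;
    cudaError_t err = cudaMalloc(&dev_buf, bytes);
    std::printf("cudaMalloc: %s\n", cudaGetErrorString(err));

    cudaFree(dev_buf);
    std::free(host_buf);
    return 0;
}
```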

Thanks.
