Can SegFormer‑B5 run on 16GB VRAM

Please provide the following information when requesting support.

• Hardware: GeForce RTX 5080
• Network Type: SegFormer‑B5
• TLT Version: N/A
• Training spec file: N/A
• How to reproduce the issue? No error.

Hi, I am testing SegFormer models from B0 to B2, and I already see a clear slowdown as the model size increases. Before I move on to B5, I want to confirm:

Will SegFormer‑B5 run normally on a GPU with 16GB VRAM, or is it likely to crash due to out of memory?

Do I need a GPU with more VRAM (e.g., 24GB or 48GB) for stable training?

Thanks!

Yes, a GPU with larger VRAM is needed. We suggest using 48GB or more.
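For context, a rough back-of-envelope sketch of the *static* training memory (fp32 weights + gradients + AdamW moment buffers) shows why model size alone does not explain the VRAM requirement. The parameter counts below are approximate figures from the SegFormer paper; activation memory, which typically dominates for high-resolution segmentation training, is deliberately not modeled here.

```python
# Rough estimate of static (non-activation) training memory.
# Parameter counts (millions) are approximate, from the SegFormer paper.
# Activations, which usually dominate for segmentation, are NOT included.
PARAMS_M = {"B0": 3.8, "B2": 27.5, "B5": 84.7}

def static_train_mem_gb(params_m, bytes_per_param=4, optimizer_states=2):
    """fp32 weights + gradients + two AdamW moment buffers per parameter."""
    n_params = params_m * 1e6
    total_bytes = n_params * bytes_per_param * (1 + 1 + optimizer_states)
    return total_bytes / 1e9

for name, p in PARAMS_M.items():
    print(f"SegFormer-{name}: ~{static_train_mem_gb(p):.2f} GB static")
```

Even for B5 this static footprint is only ~1.4 GB, so the out-of-memory risk on a 16GB card comes almost entirely from activations, which scale with crop size and batch size; that is why reducing batch size or input resolution helps far more than the parameter count suggests.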

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.