Hi all. I have trained a detection model on a 1080 Ti with PyTorch 1.0.1. GPU memory usage during inference is about 500 MB. However, when I run the same model on a 2080 Ti, the GPU memory usage increases to about 850 MB. What causes this, and how can I reduce the extra ~350 MB of memory on the 2080 Ti?
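For context, a minimal sketch (not from the original post) of how one could check whether the extra memory is held by PyTorch tensors or is per-device CUDA context / kernel overhead, by comparing PyTorch's allocator statistics against what nvidia-smi reports. The small `nn.Sequential` here is only a hypothetical stand-in for the detection model:

```python
import torch
import torch.nn as nn

device = torch.device("cuda:0")

# Hypothetical stand-in for the detection model.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1),
).to(device).eval()

dummy_input = torch.randn(1, 3, 512, 512, device=device)

with torch.no_grad():
    _ = model(dummy_input)

# Memory held by live tensors vs. cached by PyTorch's caching allocator.
# (memory_cached() is renamed memory_reserved() on newer PyTorch releases.)
print("allocated: %.1f MB" % (torch.cuda.memory_allocated(device) / 1024**2))
print("cached:    %.1f MB" % (torch.cuda.memory_cached(device) / 1024**2))
# Whatever nvidia-smi reports beyond these figures is mostly CUDA context
# and driver overhead, which can differ between GPU architectures.
```

If the allocated/cached numbers are roughly the same on both cards, the difference is likely context overhead rather than anything the model itself allocates.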