I am running AirSim with an ESPNet segmentation inference model on a GeForce RTX 2060 SUPER. It was working until CUDA started raising out-of-memory (OOM) errors. Although nvidia-smi shows free GPU memory, torch.cuda.mem_get_info() reports that at least 70% of the memory is in use (sometimes even above 90%). I have tried gc.collect(), but it isn't helping.
What could be causing torch.cuda.mem_get_info() to report the memory as filled? Please suggest some ways to free up GPU memory.
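For context, this is roughly what my memory check and cleanup look like (a sketch; the torch.cuda.empty_cache() call is included as one candidate cleanup step, not something that has fixed the issue):

```python
import gc
import torch

def report_gpu_memory():
    """Print the fraction of device memory in use, as the CUDA driver sees it."""
    if not torch.cuda.is_available():
        print("CUDA not available")
        return None
    free_bytes, total_bytes = torch.cuda.mem_get_info()
    used_fraction = 1 - free_bytes / total_bytes
    print(f"Device memory in use: {used_fraction:.0%}")
    return used_fraction

gc.collect()               # drop unreachable Python objects that may hold tensors
torch.cuda.empty_cache()   # release PyTorch's cached blocks back to the driver
report_gpu_memory()
```

Note that nvidia-smi and mem_get_info() can disagree because PyTorch's caching allocator keeps freed blocks reserved; empty_cache() returns those cached blocks to the driver but does not free tensors that are still referenced.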