Hi,
While my RL model is training, I notice that RAM usage steadily increases as the number of training epochs grows. Is there a reason why this happens? I am not sure what is being stored in RAM, and I was wondering whether there is a way to clear it from within the script every X epochs. I would like to train for more epochs, but I am currently limited by the available RAM.
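To illustrate the kind of periodic cleanup I have in mind, here is a rough sketch (the epoch count, the interval, and `train_one_epoch` are placeholders, not my actual code):

```python
import gc


def train_one_epoch() -> None:
    """Placeholder for my actual RL training step."""


NUM_EPOCHS = 100       # hypothetical values for illustration
CLEANUP_INTERVAL = 10  # clear memory every 10 epochs

for epoch in range(NUM_EPOCHS):
    train_one_epoch()

    # Is something like this the right way to reclaim RAM periodically?
    if (epoch + 1) % CLEANUP_INTERVAL == 0:
        gc.collect()  # force a full garbage-collection pass
```

I am not sure whether an explicit `gc.collect()` like this would actually help, or whether the growth means something in my loop is holding references that garbage collection cannot free.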
Thanks,