Isaac Gym using up RAM with increasing number of epochs

Hi,

While my RL model is training, I've noticed that RAM usage climbs steadily as the number of training epochs increases. Is there a reason why this happens? I'm not sure what is being stored in RAM, and I was wondering whether there's a way to clear it from within the script every X epochs. I'd like to train for more epochs, but I'm limited by the RAM.
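
For context, this is roughly what I have in mind — a minimal sketch that uses `psutil` to watch resident memory and releases cached allocations every X epochs. The hook name and the interval are placeholders, not anything from Isaac Gym itself:

```python
import gc

import psutil
import torch

process = psutil.Process()  # handle to the current training process


def log_and_trim_memory(epoch: int, clear_every: int = 50) -> None:
    """Log resident memory and periodically release cached allocations.

    This will not fix a genuine leak (e.g. tensors accumulated in a Python
    list, or losses stored with their autograd graphs attached), but it
    makes growth visible and frees what the allocators are merely caching.
    """
    rss_gib = process.memory_info().rss / 1024**3
    print(f"epoch {epoch}: RSS = {rss_gib:.2f} GiB")
    if epoch % clear_every == 0:
        gc.collect()  # reclaim unreachable Python objects
        if torch.cuda.is_available():
            torch.cuda.empty_cache()  # return cached GPU blocks to the driver
```

If RSS keeps climbing even with this, the usual culprit is something appended on every epoch (logged metrics, stored trajectories, or losses kept without `.item()` / `.detach()`), which no amount of cache clearing will release.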

Thanks,

I found that there is an option to resume training from an existing checkpoint, which works as a workaround: restarting the process in between releases all the accumulated RAM.
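
For anyone else hitting this, here is a rough sketch of the restart loop I mean. It assumes an IsaacGymEnvs-style `train.py` driven by Hydra overrides (`task=`, `checkpoint=`, `max_iterations=`, `headless=`); the task name, run directory, and chunk size are placeholders for your own setup:

```python
import glob
import os
import subprocess

TASK = "Ant"                 # placeholder task name
CHUNK_ITERS = 500            # train in short chunks so RAM is freed between runs
RUN_DIR = f"runs/{TASK}/nn"  # where the trainer writes .pth checkpoints

for _ in range(10):  # 10 chunks of 500 iterations each
    # Find the most recently written checkpoint, if any exist yet.
    checkpoints = sorted(glob.glob(os.path.join(RUN_DIR, "*.pth")),
                         key=os.path.getmtime)
    cmd = ["python", "train.py", f"task={TASK}",
           f"max_iterations={CHUNK_ITERS}", "headless=True"]
    if checkpoints:
        cmd.append(f"checkpoint={checkpoints[-1]}")  # resume from the latest
    # Each chunk runs in a fresh process, so the OS reclaims all RAM on exit.
    subprocess.run(cmd, check=True)
```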
