Memory usage on Jetson Nano during inference

Hi all,

I am currently running a detection-and-tracking program on my Jetson Nano 2GB. Detection uses the proposed ssd-inception-v2 model, which works great. Tracking uses a Kalman filter and a ByteTrack implementation.

My question is about memory usage. Idle, approximately 700 MB are used. When I run my program, this climbs to 1.8 GB, that is, almost full capacity.
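To put numbers on that idle vs. in-use gap, memory can be logged while the program runs. A minimal sketch, assuming a Linux `/proc` filesystem (as on the Jetson Nano, where CPU and GPU share the same physical RAM); `meminfo_kb` is a hypothetical helper name:

```python
def meminfo_kb():
    """Parse /proc/meminfo into a dict mapping field name -> kB."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            info[key] = int(value.split()[0])  # values are reported in kB
    return info

m = meminfo_kb()
used_kb = m["MemTotal"] - m["MemAvailable"]
print(f"used: {used_kb / 1024:.0f} MB of {m['MemTotal'] / 1024:.0f} MB")
```

Calling this periodically (e.g. once per second in a background thread) would show how much of the jump happens at model load versus during steady-state inference.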

I would like to add another element to my program (namely, tracking by re-identification), but I am afraid the memory will saturate.

  1. Can someone explain why there is such a discrepancy in memory usage between the “idle” and “in use” situations?

  2. Is there any way to limit this, so that I can extend my program?

  3. I have found this option for TensorFlow: `gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.4)`. But, if I am not mistaken, TensorRT is not TensorFlow, even though it is compatible with it. Would this option, or anything similar, still work?
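For context, the option quoted in point 3 is part of the TensorFlow 1.x session API; it caps what fraction of GPU memory that one process may allocate. A minimal sketch of how it is normally used (the import is guarded so the snippet stays illustrative on machines without TensorFlow, and it would have no effect on a model executed purely by TensorRT):

```python
# Hedged sketch: per_process_gpu_memory_fraction is a TF 1.x session option.
try:
    import tensorflow as tf  # assumes the TF 1.x-style API is available
    gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.4)
    session_config = tf.ConfigProto(gpu_options=gpu_options)
    status = "configured"  # a tf.Session(config=session_config) would honor the cap
except (ImportError, AttributeError):
    status = "unavailable"  # TensorFlow 1.x not present; sketch only
print(status)
```

If the engine runs under TensorRT rather than TensorFlow, the closest analogous knob (as far as I can tell) is the builder's workspace limit, e.g. `builder.max_workspace_size` in older TensorRT Python APIs or `IBuilderConfig` memory-pool limits in newer releases; whether that helps depends on where the 1.1 GB is actually going.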

Thanks in advance

I shall move this topic over to the Jetson Nano forums.

I’m closing this topic since there has been no update from you for some time, assuming the issue was resolved.
If you still need support, please open a new topic. Thanks

Sorry for the late response. Is this still an issue that needs support? Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.