I’m trying to run an object detection model (MobileNet SSD v2) on my Jetson Nano 2GB, following this blog link. Unfortunately, when I try to load the .pb file, memory fills up completely and the process hangs there. I also tried the TF OD API, and it gets stuck at the same step, loading the .pb file. Monitoring memory with jtop, I start with about 1.4 GB free, and after loading the .pb file I can see memory filling up slowly (over ~30 seconds). Any idea why this is happening? How much RAM is needed to run an SSD model on a Jetson Nano?
The sample is tested on the Nano 4GB.
To save memory, you could try the pure TensorRT API rather than TF-TRT.
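One way to sidestep TF-TRT is to build a standalone TensorRT engine and run it without TensorFlow in memory at all. A rough sketch of the workflow using the trtexec tool that ships with JetPack (file names and sizes here are placeholders, not from the original post):

```shell
# Sketch only: model paths are placeholders for your own export.
# Step 1 (not shown): export the TensorFlow SSD model to ONNX, e.g. with tf2onnx.
# Step 2: build a serialized TensorRT engine. FP16 roughly halves weight
# memory, which matters on the Nano 2GB.
/usr/src/tensorrt/bin/trtexec \
    --onnx=ssd_mobilenet_v2.onnx \
    --saveEngine=ssd_mobilenet_v2_fp16.engine \
    --fp16 \
    --workspace=256   # cap builder workspace (MiB) on low-RAM boards
```

At inference time you then deserialize the .engine file with the TensorRT runtime, so TensorFlow never has to be loaded, which is where most of the RAM was going.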