Getting stuck while running Jetson Nano TensorRT inference


While running inference code for a TensorRT detection model, the program gets stuck and doesn't throw any error.

Can anyone tell me what could be the reason?

Maybe it runs out of memory. Can you run htop/jtop in parallel and observe the Mem and Swap gauges?

Which program are you running, and can you post the log in addition to watching the memory usage as dkreutz suggests (tegrastats can also watch swap usage)?


Thanks for your response. The problem was that I was loading the bindings for every frame, which resulted in a memory error.

Was able to solve it!
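For anyone hitting the same issue: the pattern is to allocate your bindings/buffers once at startup and reuse them inside the per-frame loop, rather than re-allocating per frame. Below is a minimal, hedged sketch of that structure; the `Detector` class, shapes, and the NumPy arrays standing in for device allocations (which in a real TensorRT pipeline would be `cuda.mem_alloc` buffers and the bindings list) are all illustrative assumptions, not the actual code from this thread.

```python
import numpy as np

class Detector:
    """Allocate inference buffers once, then reuse them for every frame."""

    def __init__(self, input_shape=(1, 3, 300, 300), output_size=100):
        # Hypothetical stand-ins: in a real TensorRT pipeline these would be
        # device allocations plus the bindings list, created exactly once.
        self.host_input = np.empty(input_shape, dtype=np.float32)
        self.host_output = np.empty(output_size, dtype=np.float32)

    def infer(self, frame):
        # Copy the new frame into the pre-allocated input buffer ...
        np.copyto(self.host_input, frame)
        # ... run the engine (stubbed out here), writing into the
        # pre-allocated output buffer instead of allocating a new one.
        self.host_output[:] = 0.0
        return self.host_output

det = Detector()
for _ in range(1000):  # per-frame loop: no new allocations in here
    frame = np.zeros((1, 3, 300, 300), dtype=np.float32)
    out = det.infer(frame)
```

Moving the allocation out of the loop keeps memory usage flat no matter how many frames you process, which is exactly what the leak-per-frame version fails to do.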