Inference time not stable for Jetson Nano with TensorRT

I am running YOLOv3 on a Jetson Nano using TensorRT.
If I run inference continuously, as on a video stream, the inference time is stable at ~4.76 FPS (0.21s per image) and stays stable indefinitely.
However, when I insert a 5s delay between inferences, the inference time fluctuates a lot. For example:
1st: 0.32s
2nd: 0.26s
3rd: 0.34s
4th: 0.25s
5th: 0.31s
6th: 0.34s
Does anybody have ideas on how to fix this?
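For reference, the fluctuation can be measured with a simple timing loop like the one below. This is only a sketch: `infer()` is a hypothetical stand-in for the real TensorRT execution call, and the sleep durations are shortened so the script runs quickly.

```python
import time

def infer(image):
    # Hypothetical stand-in for the real TensorRT inference call
    # (on the device this would be context.execute_v2(...) or similar).
    time.sleep(0.02)
    return None

def measure(n_runs, idle_between=0.0):
    """Time n_runs inferences, optionally idling between them,
    to compare back-to-back vs. delayed inference latency."""
    timings = []
    for _ in range(n_runs):
        if idle_between:
            time.sleep(idle_between)  # simulate the gap between frames
        start = time.perf_counter()
        infer(None)
        timings.append(time.perf_counter() - start)
    return timings

continuous = measure(3)                    # back-to-back: stable on device
with_delay = measure(3, idle_between=0.2)  # idle gaps: fluctuates on device
```

On the Jetson the second list shows much higher variance than the first, matching the numbers above.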

Sorry for the late response. Have you identified the cause and resolved the problem?
Any results you can share?



Have you maximized the clocks first?

$ sudo nvpmodel -m 0
$ sudo jetson_clocks

Please note that the default clock mode is dynamic, which adjusts the frequency for power saving.
So inference may take much longer when the clock rate is low.
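If locking the clocks alone does not help, a common workaround (an assumption on my part, not something confirmed in this thread) is to run a throwaway warm-up inference after each idle period, so the timed call does not pay the clock ramp-up cost. A minimal sketch, where `infer()` is a hypothetical stand-in for the real TensorRT call:

```python
import time

def infer(image):
    # Hypothetical stand-in for the TensorRT execution call.
    time.sleep(0.02)
    return None

def timed_infer(image, warmup=True):
    """Run one discarded warm-up inference before the timed one,
    so the measured call happens at full clock speed."""
    if warmup:
        infer(image)  # throwaway run; result and timing discarded
    start = time.perf_counter()
    out = infer(image)
    return out, time.perf_counter() - start

_, latency = timed_infer(None)
```

The trade-off is that each frame costs two inferences, so this only makes sense when the idle gaps are long and latency consistency matters more than throughput.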


It is working well now. Thanks for your kind support.