Inference time fluctuates when using TensorRT on TX2

I measure the inference time of my network using the method from samples/sampleGoogleNet/sampleGoogleNet.cpp, with TIMING_ITERATIONS changed to 1. I get each frame from a video. However, the time fluctuates from frame to frame: sometimes it is 52 ms, sometimes 17 ms. Why is the timing so volatile?
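To be concrete, my per-frame timing looks roughly like the sketch below (simplified; readNextFrame, context, buffers and inputIndex are just placeholders for my video source, TensorRT execution context and device bindings, not the exact code):

#include <chrono>
#include <iostream>

// one measurement per frame, i.e. TIMING_ITERATIONS = 1
while (readNextFrame())                          // get the next frame from the video
{
    // ... preprocess the frame and copy it into buffers[inputIndex] ...

    auto t0 = std::chrono::high_resolution_clock::now();
    context->execute(1, buffers);                // synchronous inference, batch size 1
    auto t1 = std::chrono::high_resolution_clock::now();

    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
    std::cout << "frame inference time: " << ms << " ms" << std::endl;   // jumps between ~17 ms and ~52 ms
}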

Hi 373197201,

Have you maximized the TX2 performance?

sudo ./jetson_clocks.sh

Thanks

Yes, I have. I have also run the command nvpmodel -m 0.

Thanks
Bryan

Hi,

Here are some possible reasons:

1. Remember to lock the GPU frequency to its maximum.
Please remember to run these commands in order:

sudo nvpmodel -m 0
sudo ./jetson_clocks.sh

2. It may be related to I/O.
What is your input source? Is it a camera stream?

3. It’s recommended to use TIMING_ITERATIONS > 10 to get a more accurate result; see the sketch below.
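For example, something like the following (a rough sketch, not taken from the sample itself; context and buffers stand for your own execution context and device bindings) warms up first and then averages over many runs, timing only the execute() call and not the frame capture or decode:

#include <chrono>
#include <iostream>

const int WARMUP_ITERATIONS = 10;    // discard the first runs (lazy CUDA init, cold caches)
const int TIMING_ITERATIONS = 100;   // average over many runs for a stable number

for (int i = 0; i < WARMUP_ITERATIONS; ++i)
    context->execute(1, buffers);

auto t0 = std::chrono::high_resolution_clock::now();
for (int i = 0; i < TIMING_ITERATIONS; ++i)
    context->execute(1, buffers);    // only the inference is inside the timed region
auto t1 = std::chrono::high_resolution_clock::now();

double totalMs = std::chrono::duration<double, std::milli>(t1 - t0).count();
std::cout << "average inference time: " << totalMs / TIMING_ITERATIONS << " ms" << std::endl;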

Thanks.