How do I measure the inference time for a single frame? I'm using a TLT pre-trained SSD model.
Essentially, I want to print 2 timestamps. One, before nvinfer is executed. Another, after nvinfer is executed.
Please show me where to do so in this script.
• DeepStream Version = 5.0
• JetPack Version = 4.4
You could add timestamp prints before and after the call that logs "Infer context enqueue buffer failed" in libs/nvdsinfer:

```cpp
struct timeval start, stop;
gettimeofday(&start, NULL);
RETURN_NVINFER_ERROR(m_BackendContext->enqueueBuffer(backendBuffer,
        *m_InferStream, m_InputConsumedEvent.get()),
    "Infer context enqueue buffer failed");
gettimeofday(&stop, NULL);
printf("time of infer takes: %ld us\n",
    (stop.tv_sec - start.tv_sec) * 1000000 + (stop.tv_usec - start.tv_usec));
```

Then rebuild and install the library:

```shell
make -C libs/nvdsinfer/
# back up the original libnvds_infer.so under lib/ first
sudo cp libs/nvdsinfer/libnvds_infer.so lib/
```
Thanks, it worked!
Also, for future reference, to make the code snippet complete it needs `#include <sys/time.h>`.
Yes, this technique allows printing the inference time. Here is what I actually want to do: get this inference time inside my custom DeepStream application.
Is this the only place where this time can be calculated?
Is it possible to get it in nvinfer's src pad probe?
It may not be appropriate to treat the run time of `m_BackendContext->enqueueBuffer(backendBuffer, *m_InferStream, m_InputConsumedEvent.get())` as the inference time. Since inference runs in a different CUDA stream, the call is asynchronous: when it returns, inference may not have finished yet. You should use trtexec to get the inference time; you can find it under /usr/src/tensorrt/bin/.
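To illustrate the trtexec suggestion: a typical invocation looks like the following, assuming you have already serialized your model into a TensorRT engine file (`model.engine` is a placeholder name, not from this thread). trtexec runs the engine repeatedly and reports the average GPU compute latency per inference:

```shell
# Measure inference latency of a serialized engine (placeholder path).
/usr/src/tensorrt/bin/trtexec --loadEngine=model.engine --iterations=100
```

Because trtexec synchronizes on the CUDA stream before reading its timers, its numbers are not affected by the asynchronous-enqueue issue described above.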