I run deepstream-test5 application and want to get the infer time of gst-infer

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson AGX
• DeepStream Version: 7.0
• JetPack Version (valid for Jetson only): 6.0 GA
• TensorRT Version: 8.6
• NVIDIA GPU Driver Version (valid for GPU only): 540.3.0
• Issue Type (questions, new requirements, bugs): questions
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file content, the command line used, and other details for reproducing.)

I am running the deepstream-test5 application and want to measure the inference time of gst-nvinfer.
I only want the inference time of gst-nvinfer itself.
Could you help me with this?

Please refer to this FAQ for latency measurement; it will output the latency of the gst-nvinfer plugin.
The nvinfer plugin is open source, so you can also add logs to measure the inference time yourself.
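For reference, the latency measurement described in the FAQ is enabled through environment variables before launching the app; a typical invocation looks like the following (the config file path is just an example, substitute your own test5 config):

```shell
# Enable per-frame latency logging across the pipeline
export NVDS_ENABLE_LATENCY_MEASUREMENT=1
# Additionally enable per-component (per-plugin) latency, which includes nvinfer
export NVDS_ENABLE_COMPONENT_LATENCY_MEASUREMENT=1

# Example config path; replace with your actual test5 configuration file
./deepstream-test5-app -c configs/test5_config_file_src_infer.txt
```

With the component-level variable set, the app prints the latency of each plugin instance per frame, so the gst-nvinfer entry can be read directly from the log.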

That does not answer my question. I want to get the inference time of a frame in DeepStream.

I can't find any useful information in gst-nvinfer.

Sorry for the late reply! The low-level inference call is m_BackendContext->enqueueBuffer, invoked from NvDsInferContextImpl::queueInputBatch / queueInputBatchPreprocessed. You can add logs there to check.

There is no update from you for a period, assuming this is not an issue anymore. Hence we are closing this topic. If need further support, please open a new one. Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.