Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) Jetson Nano
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only) 4
• TensorRT Version 7.1.1
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs) question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name — for which plugin or which sample application — and the function description.)
I have enabled printing of the inference time, and it is working.
I have also added a counter for the number of frames processed, and found that with “deepstream-app -c” not all of the frames in the input source are processed.
For example, I gave a video input with 442 frames, but my logs were printed for only 229 frames.
I had set interval=0 in the config file that I provided.
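For reference, interval=0 in the [primary-gie] section of the deepstream-app config is the setting that should make nvinfer run inference on every frame (interval=N skips N frames between inference calls). A minimal sketch of the relevant section follows; the file name and gie-unique-id are placeholders, not taken from my actual config:

```ini
[primary-gie]
enable=1
# interval=0: infer on every frame; interval=N would skip N frames between inferences
interval=0
gie-unique-id=1
# placeholder path - replace with your actual nvinfer config file
config-file=config_infer_primary.txt
```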
However, when I run my custom detector parser with the deepstream-test3 sample source code as a reference, using the same input video source, all of the frames are processed.
Is there a way to make nvinfer process all of the frames in deepstream-app?