Frame drop issue

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson Tx2
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only) 4.5.1
• TensorRT Version 7.1.3
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi, I currently have a trained YOLOv4 object detection model (version 3.0) and I am able to deploy it in DeepStream (Python). I can do a live recording with the TX2 onboard camera (CSI) and save the inferenced output images, with reference to deepstream-imagedata-multistream-redaction.
Question: I realise a lot of frames are being dropped: the framerate I set was framerate = 30/1, but the average FPS is only about 6. Is there any way to solve this issue? I tried boosting the device with jetson_clocks etc. and setting the streammux batched-push-timeout to -1, but the average FPS still remains at ~6. Is there any way to prevent this frame dropping? Here are my code, configuration settings, weights, etc.:
yolov4_resnet18_epoch_096_fp16.etlt_b1_gpu0_fp16.engine (12.1 MB)
yolov4_labels.txt (7 Bytes)
pgie_yolov4_tlt_config.txt (2.1 KB)
deepstream_yolov4_drone_usb_v4.py (16.6 KB)
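
For context, here is a minimal sketch of the nvstreammux settings described above, assuming a pipeline built like the deepstream-imagedata-multistream-redaction sample (the element name and property values are illustrative, not taken from the attached script):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Batches frames from the source(s) before inference.
streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
streammux.set_property("width", 1280)
streammux.set_property("height", 720)
streammux.set_property("batch-size", 1)
# batched-push-timeout is in microseconds: -1 waits indefinitely for a full
# batch, while ~33000 corresponds to one frame interval at 30 fps.
streammux.set_property("batched-push-timeout", 33000)
```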

Hi,

Please note that we have a new DeepStream 6.0 release.
It’s recommended to upgrade to our latest software first.

Do you need to run inference on every single frame?
If not, would you mind trying to set the interval value so that inference runs periodically?

This can lower the GPU workload, and you can still get intermediate results from our tracker.

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinfer.html#gst-properties
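
For reference, a minimal sketch of setting the interval from Python, assuming the primary inference element is named pgie as in the Python sample apps:

```python
# nvinfer runs inference only on every (interval + 1)-th frame;
# the tracker fills in bounding boxes for the skipped frames.
pgie.set_property("interval", 4)  # e.g. infer on every 5th frame
```

Equivalently, an interval= line can be added under the [property] section of the nvinfer configuration file (pgie_yolov4_tlt_config.txt in your case).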

Thanks.

Hi @AastaLLL, I don't need to run inference on every frame; all I want is to get all 30 frames in one second, with 6 of them inferenced. I tried setting the interval value to 24 (because 30 - 6 = 24); however, I'm still getting 5 frames per second and, worse, only 1 in 25 frames is annotated, and only after about 3 seconds. Is there any other solution?

Hi,

It seems that there is no tracker component in your implementation.
You will need it to generate the intermediate bounding boxes.

Would you mind adding it and trying the pipeline again?

Also, the interval can be set to 5 (since 30 / 6 = 5).
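
As a rough sketch of where the tracker would go, assuming element and variable names from the Python sample apps (the library path below is the DeepStream 5.1 default on Jetson and may differ on your setup):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# nvtracker keeps object IDs and boxes updated on frames that skip inference.
tracker = Gst.ElementFactory.make("nvtracker", "tracker")
tracker.set_property("tracker-width", 640)
tracker.set_property("tracker-height", 384)
# Lightweight KLT tracker; libnvds_nvdcf.so is a more accurate alternative.
tracker.set_property(
    "ll-lib-file", "/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_mot_klt.so"
)

# Link it right after the inference element, e.g.:
#   streammux -> pgie -> tracker -> tiler/nvvidconv -> osd -> sink
pipeline.add(tracker)
pgie.link(tracker)
tracker.link(nvvidconv)  # pipeline, pgie, nvvidconv come from your existing app
```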
Thanks.

Hi, from my understanding, the tracker plugin assigns a specific ID to each detected object, am I right? So how does the tracker plugin help with the frame rate issue? Anyway, I managed to solve the issue by removing the portion where I save the predicted frames. After removing that part, my FPS increased from 6 to 23.
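
For anyone else who lands here: in the imagedata samples the expensive step is copying the frame out of NVMM memory and writing it with OpenCV inside the pad probe, so another option is to keep the saving code but only run it for a subset of frames. A minimal sketch, assuming a probe like the one in deepstream-imagedata-multistream-redaction (the save interval and filename are illustrative):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

import cv2
import numpy as np
import pyds

SAVE_EVERY_N_FRAMES = 30  # roughly one saved image per second at 30 fps

def tiler_sink_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        # ... iterate over object metadata / draw boxes as in the sample ...
        if frame_meta.frame_num % SAVE_EVERY_N_FRAMES == 0:
            # The copy out of NVMM memory and cv2.imwrite are the slow parts,
            # so only pay that cost for the frames you actually keep.
            frame = pyds.get_nvds_buf_surface(hash(gst_buffer), frame_meta.batch_id)
            frame_copy = np.array(frame, copy=True, order="C")
            frame_copy = cv2.cvtColor(frame_copy, cv2.COLOR_RGBA2BGRA)
            cv2.imwrite("frame_%d.jpg" % frame_meta.frame_num, frame_copy)
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
```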
