Jitter appears in frames after loading 5-6 PGIE models

Please provide complete information as applicable to your setup.

• Hardware Platform ---------> GPU
• DeepStream Version --------> 7.0
• TensorRT Version ------------> 8.9
• NVIDIA GPU Driver Version -----------> 545

@snehashish.debnath @rishika.rao

GPU ----> NVIDIA Quadro RTX 6000
Model ----> yolo_v4_tiny (FP16)
I am running 45 cameras at 25 FPS. When I run more than 5 models (PGIEs) with a tracker, I see jitter appearing in the frames. My question is: how can I overcome this?
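Roughly, my pipeline topology looks like the sketch below (simplified to one test source; element names are standard DeepStream, but the config paths are placeholders, and each nvinfer config must set its own gie-unique-id):

```python
# Simplified sketch of the topology: nvstreammux batches the camera streams,
# then several nvinfer (PGIE) instances run in series before nvtracker.
# One videotestsrc stands in for the 45 cameras; each real camera feeds
# mux.sink_%u via rtspsrc ! rtph264depay ! h264parse ! nvv4l2decoder.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)
pipeline = Gst.parse_launch(
    "nvstreammux name=mux batch-size=45 width=1280 height=720 "
    "batched-push-timeout=40000 ! "
    "nvinfer config-file-path=pgie_1_config.txt ! "   # placeholder config
    "nvinfer config-file-path=pgie_2_config.txt ! "   # placeholder config
    "nvtracker ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/"
    "libnvds_nvmultiobjecttracker.so ! fakesink "
    "videotestsrc is-live=true ! nvvideoconvert ! "
    "video/x-raw(memory:NVMM),format=NV12 ! mux.sink_0"
)
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```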

@Morganh @yingliu

Can you use “nvidia-smi dmon” to check whether the GPU consumption is OK when running the 5~6 PGIEs?
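If it helps, here is a rough sketch that watches the dec column of nvidia-smi dmon and flags decoder spikes (it assumes the default `-s u` column layout of gpu/sm/mem/enc/dec, which may differ across driver versions):

```python
# Rough sketch: sample utilization via "nvidia-smi dmon -s u" and flag
# hardware decoder spikes, which would point at a decode bottleneck.
import subprocess

proc = subprocess.Popen(
    ["nvidia-smi", "dmon", "-s", "u"],  # -s u: utilization columns
    stdout=subprocess.PIPE, text=True,
)
for line in proc.stdout:
    if line.startswith("#"):
        continue  # skip header rows
    fields = line.split()
    # assumed default "-s u" layout: gpu sm mem enc dec
    if len(fields) >= 5 and fields[4].isdigit():
        gpu, dec = fields[0], int(fields[4])
        if dec >= 90:
            print(f"GPU {gpu}: decoder at {dec}% - possible bottleneck")
```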

I ran nvidia-smi dmon and checked; the consumption is under control.
Could you tell me why this is happening?

Can you show us the log?

Can you describe what kind of jitter it is?

I am attaching the information ------>

dmon1.txt (4.4 MB)

From the “nvidia-smi” log, the hardware video decoder was sometimes overloaded. That may be the bottleneck.

The decoder hits 100% only in 2 or 3 instances, not throughout; the jitter is observed even when the decoder is at 60% or 70%.

Does the number of detections affect the decoder?

The so-called “jitter” looks like decoding corruption. For a compressed format like H.264, the corruption of a frame may propagate, since frames are coded as residuals against reference frames. You can also measure the UDP packet loss rate when the “jitter” happens.

This was not the case when 2-3 models were loaded; even then the decoder would occasionally reach 100%, but there was no jitter.

When I add 5 to 6 models and increase the inference interval, the amount of jitter is reduced.
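For reference, this is how I raise the interval (a minimal sketch; the config path is a placeholder, and the same effect can be had by setting interval=2 under [property] in the nvinfer config file):

```python
# Minimal sketch: the nvinfer "interval" property is the number of
# consecutive batches skipped between inferences, so interval=2 means
# inference runs on every 3rd frame, lowering GPU pressure.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pgie = Gst.ElementFactory.make("nvinfer", "pgie-0")
pgie.set_property("config-file-path", "pgie_1_config.txt")  # placeholder path
pgie.set_property("interval", 2)  # skip 2 batches between inferences
```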

Can you measure the UDP packet loss rate when the “jitter” happens?

How do I measure that?

Please search for it online. It is an Ethernet-related measurement.
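As a rough sketch, if you can mirror a camera's RTP stream to a spare UDP port, you can estimate loss from RTP sequence-number gaps (this assumes plain RTP over UDP; the port is a placeholder, and reordered packets are counted as loss here):

```python
# Rough sketch: count RTP sequence-number gaps on a mirrored UDP stream.
# The RTP sequence number lives in header bytes 2-3 (network byte order).
import socket
import struct

PORT = 5000  # placeholder: port the mirrored RTP stream is sent to
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

expected = None
received = lost = 0
while True:
    data, _ = sock.recvfrom(65536)
    if len(data) < 4:
        continue  # too short to be an RTP packet
    (seq,) = struct.unpack("!H", data[2:4])
    if expected is not None and seq != expected:
        lost += (seq - expected) & 0xFFFF  # wraps at 16 bits
    expected = (seq + 1) & 0xFFFF
    received += 1
    if received % 1000 == 0:
        rate = lost / (received + lost)
        print(f"received={received} lost={lost} loss_rate={rate:.4%}")
```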

Sure, but as far as I can tell, when I start the pipeline without any models there is no jitter; the moment I add multiple models, I face this issue.

The image you posted in the topic shows the video is corrupted while the bounding boxes are correct, so the problem happens before or during video decoding. Measuring the Ethernet packet loss rate will help you identify whether the input data to the video decoder is correct.
You can also try to dump the stream before the video decoder when the jitter happens.
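For example, a minimal sketch that tees the parsed H.264 elementary stream to a file before the decoder (the RTSP URI and output filename are placeholders):

```python
# Minimal sketch: tee the parsed H.264 elementary stream to a file before it
# reaches the decoder; the dump can then be inspected or replayed to check
# whether the corruption is already present in the incoming stream.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)
pipeline = Gst.parse_launch(
    "rtspsrc location=rtsp://CAMERA_URI ! rtph264depay ! h264parse ! "
    "tee name=t "
    "t. ! queue ! filesink location=dump.h264 "     # raw H.264 written to disk
    "t. ! queue ! nvv4l2decoder ! fakesink"         # decode branch continues
)
pipeline.set_state(Gst.State.PLAYING)
loop = GLib.MainLoop()
try:
    loop.run()
except KeyboardInterrupt:
    pipeline.set_state(Gst.State.NULL)
```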

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.