FPS limitation with new Nvstreammux in DeepStream 6.0

• Hardware Platform: Jetson Nano
• DeepStream Version: 6.0
• JetPack Version: 4.6
• TensorRT Version: 8.0.1
• Issue Type: question


The main topic here is the need to reduce the frame rate of DeepStream inference. We are developing a global application and need to share the Jetson Nano's resources with other areas/programs.

Our overall configuration:
USB camera: 640x480 @ 30 FPS
PeopleNet at FP16
Primary GIE with interval=0

Current maximum performance: approx. 11-12 FPS

Reading the documentation of the new nvstreammux, the configurable “overall max/min fps” parameters should limit the camera FPS of the stream to between 8 and 6 FPS, but we never see this limitation take effect. As we understand it, these parameters limit the FPS of the muxer, so by brute force they should reduce the overall throughput of the output: if the frames reaching inference are limited to 8 FPS and you run inference on every frame, you infer 8 frames per second, giving an overall performance of 8 FPS. Are we on the right track?
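For reference, the new-mux config file we are referring to is shaped like this (a sketch; key names as listed in the Gst-nvstreammux New (Beta) documentation, values illustrative of our 8/6 FPS attempt):

    [property]
    algorithm-type=1
    batch-size=1
    # overall throttle of the muxer output, as numerator/denominator pairs
    overall-max-fps-n=8
    overall-max-fps-d=1
    overall-min-fps-n=6
    overall-min-fps-d=1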

Here we share the nvstreammux configuration:


Nvstreammux configuration file:



DeepStream output:
max_fps_dur 8.33333e+06 min_fps_dur 2e+08
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_nvmultiobjecttracker.so
gstnvtracker: Batch processing is ON
gstnvtracker: Past frame output is ON
[NvMultiObjectTracker] Initialized
0:00:04.009883558 14495 0x15e3d8c0 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1900> [UID = 1]: deserialized trt engine from :/***/resnet34_peoplenet_pruned_2_3.etlt_b1_gpu0_fp16.engine
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x544x960
1 OUTPUT kFLOAT output_bbox/BiasAdd 12x34x60
2 OUTPUT kFLOAT output_cov/Sigmoid 3x34x60

0:00:04.010076426 14495 0x15e3d8c0 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2004> [UID = 1]: Use deserialized engine model: /resnet34_peoplenet_pruned_2_3.etlt_b1_gpu0_fp16.engine
0:00:04.023211655 14495 0x15e3d8c0 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/* sucessfully

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

** INFO: <bus_callback:194>: Pipeline ready

** INFO: <bus_callback:180>: Pipeline running

max_fps_dur 8.33333e+06 min_fps_dur 2e+08

**PERF: FPS 0 (Avg)
**PERF: 12.15 (11.88)

**PERF: 11.40 (11.54)
**PERF: 11.39 (11.48)

Are we missing some configuration? Does it only work with multiple streams? Or do we have a misunderstanding?

In any case, can we limit the FPS of DeepStream inference some other way? In some cases we use the “interval” parameter, but it gives performance stats that diverge from reality (e.g. PERF reports 29 FPS with interval=2 and a camera acquisition rate of 30 FPS, while the actual inference rate is 10 FPS).

Thanks in advance.
Best Regards

No. The FPS limitation is for multiple inputs with different frame rates. Please read the document Gst-nvstreammux New (Beta) — DeepStream 6.0 Release documentation.

It is not for your case.

Thank you.
The documentation is not fully clear. Looking at the deepstream-app diagrams, I can see that even when only one source is enabled the muxer is still there, but doing nothing.

Going back to the main question, what’s the path to follow to limit the FPS?

If you want to control the FPS, you may consider “videorate”. videorate (gstreamer.freedesktop.org)
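For example, a minimal sketch (assuming the camera is /dev/video0; videorate and its drop-only property are from the stock GStreamer plugin) that throttles a 30 FPS capture down to 10 FPS:

    gst-launch-1.0 v4l2src device=/dev/video0 ! \
        video/x-raw,width=640,height=480,framerate=30/1 ! \
        videorate drop-only=true ! video/x-raw,framerate=10/1 ! \
        fakesink

With drop-only=true, videorate only drops frames (it never duplicates), so it simply thins the stream to match the downstream caps.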

Please make sure you are familiar with GStreamer before you start with DeepStream.

Thanks, I'm not familiar with GStreamer, but I think I will need to start learning.

So, if I understand correctly, the only way to change that is to go in and touch the app code?

Best regards.

FPS is decided by all the components (including HW and SW). “interval” is controlled by the nvinfer plugin (Gst-nvinfer — DeepStream 6.0 Release documentation); it just skips inference on some frames but does not drop any of them, so the final FPS is not impacted.
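For example, in a deepstream-app configuration (a sketch using the standard deepstream-app group/key names):

    [primary-gie]
    enable=1
    # interval=2: infer on 1 of every 3 frames; the 2 skipped frames
    # still flow downstream, so the sink FPS stays at the capture rate
    interval=2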

Ok, thank you.

So, if it only skips and never drops, I think another way to limit the FPS is to configure the USB camera itself with a limited frame-rate setup.

I'll try to test that. I need to reduce the GPU load one way or another to make this project feasible.

Best regards.

“skip” will save some GPU load; it impacts FPS only when the GPU load exceeds the GPU's capability.


Another point I noticed in the source configuration is the possibility to drop frames: “#drop-frame-interval=5”.
But I don't see any difference when using that parameter; is it linked to a specific source type?

Best regards

“drop-frame-interval” is controlled by nvv4l2decoder (Gst-nvvideo4linux2 — DeepStream 6.0 Release documentation); it will change the FPS.
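For example (a sketch, assuming an H.264 elementary stream; the decoder is only present in the pipeline for encoded sources):

    gst-launch-1.0 filesrc location=sample.h264 ! h264parse ! \
        nvv4l2decoder drop-frame-interval=5 ! fakesink

With drop-frame-interval=5 the decoder outputs only every fifth decoded frame, so the downstream FPS drops accordingly.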

Thanks. I'll try to investigate this path further and share the conclusions.

Also, I made some tests limiting the source FPS, and it is interesting.
Using a USB camera and the command “v4l2-ctl --list-formats-ext”, you can see the available configurations for the camera source:

    Size: Discrete 640x480
        Interval: Discrete 0.017s (60.000 fps)
        Interval: Discrete 0.033s (30.000 fps)
        Interval: Discrete 0.067s (15.000 fps)
        Interval: Discrete 0.167s (6.000 fps)
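You can also pin the capture rate directly from the command line (a sketch, assuming the camera is /dev/video0; --set-parm is the v4l2-ctl option that sets the frames-per-second capture parameter):

    v4l2-ctl -d /dev/video0 --set-parm=15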

If I increase “interval=1” and configure the camera with “camera-fps-n=15”, I can lock the FPS of the global app to **PERF: 14.99 (15.17), and I assume the inference is then running at 7.5 FPS. With that I can release some load from the GPU.
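The relevant deepstream-app config pieces for this test look roughly like this (a sketch with the standard deepstream-app keys; other groups omitted):

    [source0]
    enable=1
    # 1=CameraV4L2
    type=1
    camera-width=640
    camera-height=480
    camera-fps-n=15
    camera-fps-d=1

    [primary-gie]
    enable=1
    # infer every other frame -> ~7.5 inferences/s at 15 FPS capture
    interval=1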

I hope to be able to merge both solutions and manage the FPS of the global application in more detail.

I very much appreciate the fast answers and your time.

Best regards.

Can you provide some extra info about how to use nvv4l2decoder in DeepStream?

When selecting source type “1” (1=CameraV4L2), I understood that nvv4l2decoder would be used.

Attached are the source parameters:

And when I run the application, I don't see any performance difference.

No. Source type 1 uses v4l2src: GStreamer Good Plugins 1.0 Plugins Reference Manual
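With source type 1 the capture element is v4l2src, which already delivers raw frames, so there is no nvv4l2decoder in the pipeline and drop-frame-interval has no effect. Roughly, the capture chain looks like this (a sketch, not the exact deepstream-app internals):

    gst-launch-1.0 v4l2src device=/dev/video0 ! \
        video/x-raw,width=640,height=480,framerate=30/1 ! \
        nvvideoconvert ! 'video/x-raw(memory:NVMM)' ! fakesink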

Please learn GStreamer knowledge and coding skills before you start with DeepStream. There are a lot of GStreamer resources on the internet.

It has nothing to do with DeepStream. This is the DeepStream forum.

Thanks Fiona, that's clear.

In the DeepStream manual I can't find that statement, only the opposite:

" DeepStream abstracts these libraries in DeepStream plugins, making it easy for developers to build video analytic pipelines without having to learn all the individual libraries."

Best regards, you can close the topic.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.