How can I see the fps of the original RTSP URI video

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): dGPU A40.
• DeepStream Version: 6.4-triton-multiarch.
• TensorRT Version: 8.6.1.6.
• NVIDIA GPU Driver Version (valid for GPU only): 535.146.02.
• Issue Type (questions, new requirements, bugs): questions.
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

How can I see the fps of the original RTSP URI video, like the fps printed in the console log when DeepStream starts?

You can use ffplay, VLC, or another player to get the original fps. You can also use the following command line to test the original fps. Please refer to this topic.

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! decodebin ! fpsdisplaysink video-sink=fakesink text-overlay=false sync=false -v | grep current

Thank you! I’m a newbie in DeepStream, so how can this help me with the deepstream_test3 Python code? How can I do it?

Do you want to get the RTSP fps information, or measure the actual fps?

Measure the actual fps of the origin source.

The test3 code already supports measuring the actual inference fps. If you only need the origin RTSP fps, please refer to my first comment.

Is it possible to merge measuring the actual inference fps and the origin RTSP fps in deepstream_test3? If yes, please help me to do this, thanks.

Measuring the actual inference fps is ready-made code; please refer to my last comment.
As for measuring the origin RTSP fps, you can add a probe function on uridecodebin and count the frame number per second.

Any hints on how to do that with the nvurisrcbin plugin, thanks.

You can add a probe function on nvurisrcbin’s src and count the frame number per second.

How can I access the frames through the probe function on nvurisrcbin’s src?

Please refer to this code. First create a variable to save the frame number; each time the probe function is triggered, add 1 to this value, then count the frame number per second.
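A minimal sketch of the counting approach described above. The counter itself is plain Python; the lines that attach it to a pad assume the GStreamer Python bindings (gi.repository.Gst) used by deepstream_test_3.py and are shown as comments, with illustrative names:

```python
import time

class FpsCounter:
    """Count buffers seen by a pad probe and report fps about once a second."""

    def __init__(self):
        self.frame_count = 0
        self.start = time.time()

    def tick(self):
        """Record one frame; return the fps once a full second has elapsed, else None."""
        self.frame_count += 1
        elapsed = time.time() - self.start
        if elapsed >= 1.0:
            fps = self.frame_count / elapsed
            self.frame_count = 0
            self.start = time.time()
            return fps
        return None

# Probe callback, attached to nvurisrcbin's src pad (Gst lines commented out):
#   counter = FpsCounter()
#   pad.add_probe(Gst.PadProbeType.BUFFER, on_buffer, counter)
#
# def on_buffer(pad, info, counter):
#     fps = counter.tick()
#     if fps is not None:
#         print(f"origin fps: {fps:.1f}")
#     return Gst.PadProbeReturn.OK
```

Every buffer that passes the probed pad calls `tick()` once, so the printed value is the frame rate actually flowing through that pad.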

What is the meaning of nvurisrcbin’s src: the src pad, or the first element in the bin?

Please refer to this code: deepstream-test1.

Can you list the elements inside nvurisrcbin? Thanks.

nvurisrcbin is not open source; you can dump the media pipeline to check all its plugins by this method.
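For reference, a sketch of the standard GStreamer pipeline-dump mechanism: set GST_DEBUG_DUMP_DOT_DIR before GStreamer initializes, then write the graph as a Graphviz .dot file. The live Gst calls are commented out, and the directory and file names are illustrative:

```python
import os

# GST_DEBUG_DUMP_DOT_DIR must be set before Gst.init() is called,
# otherwise GStreamer ignores dot-file requests.
os.environ["GST_DEBUG_DUMP_DOT_DIR"] = "/tmp"

# Once the pipeline reaches PLAYING, dump it ("pipeline" is the
# application's Gst.Pipeline object):
#   Gst.debug_bin_to_dot_file(pipeline, Gst.DebugGraphDetails.ALL, "pipeline")
# Then render the graph with Graphviz:
#   dot -Tpng /tmp/pipeline.dot -o pipeline.png
```

The rendered image shows every element inside nvurisrcbin and how they are linked.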

When I calculate the fps on nvurisrcbin’s src, I get the same fps as at the end of the pipeline, so how can I add a probe function on nvurisrcbin’s sink to measure the origin RTSP fps?

Using “nvurisrcbin + nvinfer + fakesink”, you can’t get the original fps, because inference performance affects the fps of the whole pipeline. To take an extreme example: if the fps of the RTSP source is 30 but inference performance is poor, the fps of the pipeline may be only 10. You can get the origin fps with a “nvurisrcbin + fakesink” pipeline.
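A sketch of such an inference-free measurement pipeline. The nvurisrcbin property and the RTSP URL are assumptions following the thread, not verified against a specific DeepStream release, and the launch code is commented out since it needs a running RTSP server:

```python
# Measurement pipeline with no inference stage, so decoding is the
# only work between the source and the counting probe.
MEASURE_PIPELINE = (
    "nvurisrcbin uri=rtsp://127.0.0.1:8554/test ! "
    "fakesink name=measure_sink sync=false"
)

# With the GStreamer Python bindings this would be launched as:
#   from gi.repository import Gst
#   Gst.init(None)
#   pipeline = Gst.parse_launch(MEASURE_PIPELINE)
#   sink = pipeline.get_by_name("measure_sink")
#   sink.get_static_pad("sink").add_probe(
#       Gst.PadProbeType.BUFFER, on_buffer, counter)
#   pipeline.set_state(Gst.State.PLAYING)
```

Because nothing downstream throttles the source, the fps counted at the fakesink reflects the origin RTSP stream rate.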

What if I enable branching after nvurisrcbin using the tee plugin, putting the whole pipeline in one branch and a fakesink with a probe function to measure the origin fps in the other branch? Would measuring the origin fps be affected by the other branch?

Sorry for the late reply! tee does not copy the data; it pushes the same buffer to each branch in turn, so without a queue on each branch a slow branch back-pressures the whole pipeline, including the measurement branch. Adding tee therefore does not help to measure the original fps.