I have a simple inference pipeline for object detection on a Jetson Nano. The detection fps is low, which is not a problem by itself. However, I need to display the video at its original speed (with the results as an overlay). I thought about putting a tee element after the source element so I get two branches: one for processing and the other for visualization.
However, the visualization branch still runs at the fps of the processing branch. I did put a `queue` in each branch so as to have different threads (as explained in the GStreamer documentation), but it behaves as if the latency of inference were affecting the displayed video.
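For reference, the tee layout described above can be sketched like this (element names are placeholders; a slowed-down fakesink branch stands in for the real inference branch). One detail worth noting: if the queue in the slow branch fills up, it back-pressures the tee and stalls the display branch too, so making that queue leaky lets the display branch keep running at the source rate:

```shell
# Sketch only: identity sleep-time=500000 (microseconds) simulates a slow
# (~2 fps) inference branch; the leaky queue drops old buffers instead of
# blocking the tee, so the display branch stays at 30 fps.
gst-launch-1.0 videotestsrc is-live=true ! video/x-raw,framerate=30/1 ! tee name=t \
  t. ! queue max-size-buffers=1 leaky=downstream ! identity sleep-time=500000 ! fakesink \
  t. ! queue ! videoconvert ! autovideosink
```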
Thank you very much! Based on your answer I now get different processing and visualization fps on the two branches, and I can see the original video faster than the processing fps.
Now I am trying to overlay the two results (I need real-time video with an overlay that is generated at a lower fps). I am using a pipeline with a videomixer in charge of the overlay. The result is a low-fps video. That is, if I plug the branches into the videomixer, everything is slowed down (even though videomixer is supposed to generate output at the fps of the fastest stream).
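The behavior can be reproduced with a sketch like the following, where two videotestsrc branches at different rates stand in for the real branches (this is an illustration, not the actual pipeline). Because videomixer only pushes a composed frame once every sink pad has a buffer, the output rate collapses toward the slower branch:

```shell
# Sketch: the 5 fps branch throttles the composed output even though the
# other branch produces 30 fps.
gst-launch-1.0 videomixer name=mix ! videoconvert ! autovideosink \
  videotestsrc is-live=true ! video/x-raw,framerate=30/1 ! queue ! mix.sink_0 \
  videotestsrc is-live=true ! video/x-raw,framerate=5/1 ! queue ! mix.sink_1
```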
The problem is that videomixer will only output a buffer when every sink pad has a buffer ready. You can solve this by using videorate to artificially increase the frame rate of the slower branch, with something like this:
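The pipeline from this reply did not survive in the thread; judging from the description that follows (videotestsrc negotiating 5 fps, perf reporting 30 fps before the fakesink), it was presumably along these lines:

```shell
# Reconstruction: videorate duplicates the 5 fps buffers up to the
# 30 fps rate fixed by the downstream caps filter.
gst-launch-1.0 videotestsrc ! video/x-raw,framerate=5/1 ! videorate ! \
  video/x-raw,framerate=30/1 ! perf ! fakesink
```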
In that pipeline, even though videotestsrc negotiates 5 fps, the framerate measured before the fakesink is 30 fps. videorate achieves this by duplicating input buffers.
Note: I’m using one of our elements (perf) to measure the framerate. You can follow the instructions on this repo to install it if you are interested:
I am getting strange behavior. The output is the video at 2 fps with the overlay boxes, but the boxes are delayed in time! How is this possible? I expected nvdsosd to draw on the same buffer that nvinfer pushes. Why, then, are the bounding boxes and the frame on which nvdsosd draws shifted in time?
I want to use nvcompositor to overlay two transparent videos. One path of the pipeline includes nvinfer; the other is just another video feed.
The problem is that I am not getting smooth video. On every frame where inference runs, the video stalls and then continues. I want a smooth video with an overlay on top.
Notice that I added queues to create separate threads and set sync=false on the nvoverlaysink, but still no luck.
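For context, the shape of the pipeline being described is roughly the following. This is only a sketch of the described topology: the file name, the nvinfer config path, the caps, and the nvstreammux dimensions are all placeholders, not the poster's actual pipeline:

```shell
# Sketch: one decoded feed goes straight to nvcompositor, the other
# passes through nvstreammux -> nvinfer -> nvdsosd before compositing.
gst-launch-1.0 nvcompositor name=comp ! nvoverlaysink sync=false \
  filesrc location=video.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! tee name=t \
  t. ! queue ! comp.sink_0 \
  t. ! queue ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! \
    nvinfer config-file-path=config_infer.txt ! nvvideoconvert ! nvdsosd ! comp.sink_1
```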
Hi,
Please run deepstream-test3 as suggested in #6
You should use nvmultistreamtiler instead of nvcompositor. nvmultistreamtiler is implemented for DeepStream SDK use cases.
Also, for multiple sources you need to configure the inference interval accordingly. In the reference config file source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt, interval=4 is set.
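For reference, interval lives in the [primary-gie] section of the deepstream-app config file; interval=4 means 4 frames are skipped between inferences, i.e. inference runs on every 5th frame. An excerpt in the style of the reference config (surrounding keys are illustrative):

```ini
[primary-gie]
enable=1
# Skip 4 frames between inferences: nvinfer runs on every 5th frame,
# which keeps the display smooth on Jetson Nano.
interval=4
```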