Videorate affecting framerate of stream on other thread

I’m developing an application with the GStreamer Python bindings that captures video from an IMX219 camera. I need to do three things at once: stream the video as H.264/H.265, save video to a file, and save one image per second as a JPEG. As I’ve built things up they have worked really well, except when I added the videorate element to save JPEGs to disk. I’m still learning GStreamer, but it seems that videorate affects the whole pipeline upstream of the tee: the UDP stream becomes really choppy, dropping to around 1 fps, and the whole stream is messed up. I have isolated the behavior in the pipeline below, which saves a JPEG at 1 fps and outputs to autovideosink at 30 fps. I figure I’m missing some critical component that would let me keep full 30 fps video while saving JPEGs at 1 fps. What am I doing wrong? Thank you!

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! "video/x-raw(memory:NVMM),width=1280,height=720" \
! nvvidconv ! "video/x-raw(memory:NVMM),width=1280,height=720" ! tee name=t ! queue ! videorate \
! "video/x-raw(memory:NVMM),framerate=30/1" ! autovideosink t. ! queue ! videorate \
! "video/x-raw(memory:NVMM),framerate=1/1" ! nvvidconv \
! "video/x-raw(memory:NVMM),width=1280,height=720" ! nvjpegenc ! multifilesink location="image.jpg"

Hi,
It looks like the videorate plugin does not drop frames properly in this setup. The videorate element has several properties, and we would need other users to share suggestions about how to set them.

Another solution is to use the appsink plugin. You may refer to this sample:
Starvation (?) of gstreamer threads - #11 by DaneLLL
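
For reference, a minimal sketch of the appsink approach in Python (this is an illustration, not the code from the linked sample; the caps, element names, and callback are assumptions for a Jetson capture pipeline like the one above):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Capture pipeline ending in an appsink. nvvidconv copies the frames out of
# NVMM into system memory so they can be pulled from Python.
pipeline = Gst.parse_launch(
    "nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1280,height=720 ! "
    "nvvidconv ! video/x-raw,format=BGRx ! "
    "appsink name=sink emit-signals=true max-buffers=1 drop=true"
)

def on_new_sample(sink):
    # Pull the sample and decide here what to do with the buffer
    # (e.g. encode it, or forward only every Nth frame).
    sample = sink.emit("pull-sample")
    if sample is None:
        return Gst.FlowReturn.ERROR
    return Gst.FlowReturn.OK

sink = pipeline.get_by_name("sink")
sink.connect("new-sample", on_new_sample)

pipeline.set_state(Gst.State.PLAYING)
loop = GLib.MainLoop()
try:
    loop.run()
finally:
    pipeline.set_state(Gst.State.NULL)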

I’ll take a closer look at the videorate props to see what I might be missing. Does anyone else have any insight into how I could get two separate streams at different frame rates?

One thing I have observed is that if I move one of the branches out of NVMM memory using nvvidconv with caps "video/x-raw,width=1280,height=720", the framerates are all correct. This leads me to believe there is a memory copy happening there that shields the other branch from the videorate element running on the other thread. Here is an example:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! "video/x-raw(memory:NVMM),width=1280,height=720" ! tee name=t \
! queue ! autovideosink t. ! queue ! nvvidconv ! "video/x-raw,width=1280,height=720" \
! videorate ! "video/x-raw,framerate=1/1" ! fakesink

Removing (memory:NVMM) from the caps around nvvidconv and videorate fixes things, and adding it back breaks them again. I would prefer not to rely on this, as it seems like behavior that could cause performance problems, and I want to keep everything in NVMM memory to take advantage of the NX hardware and codecs.

I spent some time getting an appsink/appsrc solution working and was able to get the frame rate changes I needed. It took a while because the Python bindings for linking two pipelines through appsink/appsrc are finicky. Also, I was never able to get a videorate element working on the pipeline that receives frames from appsrc; for some reason, adding videorate would make that pipeline stop completely. What I did instead was have the new-sample callback skip pushing buffers to the appsrc element until 30 frames have elapsed, so only one frame per second is forwarded. This is working really well and gives the desired functionality with no discernible difference in CPU usage.
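
To illustrate the idea, here is a simplified sketch of that callback (not the attached file itself; the 30-frame count assumes a 30 fps source, and the appsrc handle is assumed to be passed in as user data):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

FRAME_SKIP = 30  # forward 1 of every 30 buffers -> ~1 fps from a 30 fps source
frame_count = 0

def on_new_sample(sink, appsrc):
    # new-sample callback on the capture pipeline's appsink. Instead of a
    # videorate element downstream of appsrc, buffers are simply dropped
    # here until FRAME_SKIP frames have elapsed.
    global frame_count
    sample = sink.emit("pull-sample")
    if sample is None:
        return Gst.FlowReturn.ERROR
    frame_count += 1
    if frame_count % FRAME_SKIP == 0:
        # Forward this buffer into the second pipeline.
        appsrc.emit("push-buffer", sample.get_buffer())
    return Gst.FlowReturn.OK

# Connected with the second pipeline's appsrc as user data:
# appsink.connect("new-sample", on_new_sample, jpeg_appsrc)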

I’ll include some example source code showing how this works.

src_sink_class.py (4.7 KB)
