GStreamer pipeline with accelerated elements to get RGB frames

I’m using the following simple pipeline to get RGB frames and the RTP timestamp:

rtspsrc location=RTSP_URL ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videorate ! video/x-raw,format=RGB,framerate=5/1 ! videoconvert ! autovideosink

This works, so using the Python bindings I set up a class with two probes: one on the rtph264depay element to get the RTP timestamp and one on the last videoconvert element to get RGB frames. I use a callback for the RGB frames like this:

def _frame_handler(instance: "GStreamerProbeStream", pad: Gst.Pad, info: Gst.PadProbeInfo, userdata: Any) -> Gst.PadProbeReturn:
    buffer = info.get_buffer()
    caps = pad.get_current_caps()
    success, map_info = buffer.map(Gst.MapFlags.READ)
    if not success:
        return Gst.PadProbeReturn.DROP
    try:
        structure = caps.get_structure(0)
        # Copy the pixels out of the mapped region so the array stays valid
        # after the buffer is unmapped.
        rgb_frame = np.ndarray(
            shape=(
                structure.get_value("height"),
                structure.get_value("width"),
                3,
            ),
            dtype=np.uint8,
            buffer=map_info.data,
        ).copy()
    finally:
        buffer.unmap(map_info)
    instance.frame = rgb_frame
    return Gst.PadProbeReturn.OK
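
For context, this is roughly how I attach the two probes and start the pipeline with the _frame_handler above; the element names (depay, conv), the stripped-down class, and the timestamp handler stub are only a sketch of my actual setup, not the exact code:

import functools
from typing import Any

import numpy as np
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

PIPELINE = (
    "rtspsrc location=RTSP_URL ! rtph264depay name=depay ! h264parse ! "
    "avdec_h264 ! videoconvert ! videorate ! "
    "video/x-raw,format=RGB,framerate=5/1 ! videoconvert name=conv ! fakesink"
)

def _timestamp_handler(pad: Gst.Pad, info: Gst.PadProbeInfo, userdata: Any) -> Gst.PadProbeReturn:
    # Placeholder: in the real class this is where the RTP timestamp is read.
    return Gst.PadProbeReturn.OK

class GStreamerProbeStream:
    def __init__(self) -> None:
        self.frame = None
        self.pipeline = Gst.parse_launch(PIPELINE)
        # RGB frames: buffer probe on the src pad of the last videoconvert
        conv_pad = self.pipeline.get_by_name("conv").get_static_pad("src")
        conv_pad.add_probe(
            Gst.PadProbeType.BUFFER, functools.partial(_frame_handler, self), None
        )
        # RTP timestamps: buffer probe on the src pad of the depayloader
        depay_pad = self.pipeline.get_by_name("depay").get_static_pad("src")
        depay_pad.add_probe(Gst.PadProbeType.BUFFER, _timestamp_handler, None)
        self.pipeline.set_state(Gst.State.PLAYING)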

I replace the autovideosink with a fakesink and use an async read on my self.frame property to stream the frames to my next application. It runs smoothly, perhaps better than I expected.
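
The async read on the consumer side is basically just polling self.frame; a simplified asyncio sketch of what I do (not my exact code, and frame_source is an illustrative name):

import asyncio

async def frame_source(stream: "GStreamerProbeStream", fps: float = 5.0):
    # Yield the latest frame stored by the probe callback at roughly the
    # pipeline frame rate; the consumer iterates with "async for".
    period = 1.0 / fps
    while True:
        frame = stream.frame
        if frame is not None:
            yield frame
        await asyncio.sleep(period)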

When I try to port this code to a Jetson TX2, I use similar logic with the following pipeline:

rtspsrc location=RTSP_URL ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! videorate ! 'video/x-raw(memory:NVMM),format=RGBA,framerate=5/1' ! nv3dsink

This pipeline also works, so I try to use the same logic in Python again, but the above callback does not work: the map_info.data object is too small to construct a frame! I keep reading on these forums that NVMM is a GPU memory format, so the mapped data is not a plain byte array from which a complete frame can be built. So I change my pipeline and omit (memory:NVMM):

rtspsrc location=RTSP_URL ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! videorate ! 'video/x-raw,format=RGBA,framerate=5/1' ! nv3dsink

Now I get RGBA frames and I change my callback like this:

...
rgb_frame = np.ndarray(
    shape=(
        caps.get_structure(0).get_value("height"),
        caps.get_structure(0).get_value("width"),
        4,
    ),
    dtype=np.uint8,
    buffer=map_info.data,
)[:, :, :3].copy()  # drop the alpha channel; copy so the data survives the unmap
...

If I save this as an image, I get a correctly reconstructed frame from my stream.

The problem is that the video is way off: there is a delay of around 30 seconds that keeps growing, and my swap memory fills up at the same time, making the Jetson slower and slower until it crashes completely.
What is going on here? I’m wondering if I’ve set something up wrong, but the setup is pretty straightforward. Does anybody have any hint on how to proceed with this?

Hi,
Please check if you can run the pipeline from this post:
Doesn't work nvv4l2decoder for decoding RTSP in gstreamer + opencv - #3 by DaneLLL

To get BGR format in an appsink, use a pipeline like:

rtspsrc location=rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink
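
On the Python side, a minimal sketch for pulling the BGR frames out of that appsink could look like this; the appsink name sink, emit-signals=true, and max-buffers/drop are additions of this sketch, not part of the pipeline above:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib
import numpy as np

Gst.init(None)

PIPELINE = (
    "rtspsrc location=rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov ! "
    "rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! "
    "video/x-raw,format=(string)BGRx ! videoconvert ! video/x-raw,format=BGR ! "
    "appsink name=sink emit-signals=true max-buffers=1 drop=true"
)

def on_new_sample(sink):
    sample = sink.emit("pull-sample")
    buf = sample.get_buffer()
    structure = sample.get_caps().get_structure(0)
    ok, map_info = buf.map(Gst.MapFlags.READ)
    if not ok:
        return Gst.FlowReturn.ERROR
    try:
        # Copy the BGR pixels out before the buffer is unmapped.
        # (Assumes the row stride is width * 3; padded strides would
        # need handling via GstVideoInfo.)
        frame = np.ndarray(
            (structure.get_value("height"), structure.get_value("width"), 3),
            dtype=np.uint8,
            buffer=map_info.data,
        ).copy()
        print("got frame", frame.shape)
    finally:
        buf.unmap(map_info)
    return Gst.FlowReturn.OK

pipeline = Gst.parse_launch(PIPELINE)
pipeline.get_by_name("sink").connect("new-sample", on_new_sample)
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()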

Hi,

I actually want RGB rather than BGR; however, your example helped me get RGB frames instead of RGBA, so thanks for that. Nevertheless, my swap is still being eaten up extremely fast.

Did you recommend the appsink specifically for some reason? I’m using probes and a fakesink in my code; would this impact the performance of the application on the Jetson?

Hi,
Not sure, but it sounds like the memory is mapped but not unmapped. Please check whether the GstBuffer or the np array has to be unmapped in the probe function.
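
As an illustration of that advice, a helper like the following always unmaps the buffer, even if the numpy handling raises; this is a sketch, read_rgb is an illustrative name, and it assumes a tightly packed RGB buffer:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import numpy as np

def read_rgb(buffer: Gst.Buffer, width: int, height: int):
    # Map the buffer, copy the pixels out, and guarantee the unmap in a
    # finally block so the mapping can never leak.
    ok, map_info = buffer.map(Gst.MapFlags.READ)
    if not ok:
        return None
    try:
        pixels = np.frombuffer(map_info.data, dtype=np.uint8)[: height * width * 3]
        return pixels.reshape(height, width, 3).copy()
    finally:
        buffer.unmap(map_info)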


There is no memory leak in the class implementation itself. The leak seems to appear downstream of the GStreamer frame consumption. You can consider this topic closed.
