Hardware decoding, newer versions of GStreamer, and displaying in Qt

Please provide complete information as applicable to your setup.

• Jetson Orin Nano 8GB dev kit
• DeepStream Version: minimum after SDK installation (will update)
• JetPack Version: 6.2.1
• Issue Type: question

I’m going to keep this short and get right to the issue.
I’m trying to take an HLS video stream, hardware-decode it, and then display it within a Qt app.

I got the pipeline working with a public stream:
gst-launch-1.0 souphttpsrc location="https://cdn-004.whatsupcams.com/hls/hr_dubrovnik07.m3u8" ! hlsdemux ! tsdemux ! queue ! h264parse ! nvv4l2decoder ! nvvidconv ! nveglglessink

My problem is displaying the stream within Qt without the frames “leaving” GPU memory. I saw the GStreamer element qml6glsink mentioned, but I cannot use it because I’m stuck with GStreamer 1.20.3 in order to use the NVIDIA elements that enable hardware decoding.

My question is, what can I do to display my hardware accelerated stream within Qt?
As I see it I have a couple of options:

  • Build a newer version of GStreamer (will NVIDIA elements still work?)
  • Use nveglglessink + VideoOverlay
  • Use software decoding

Just to note, I am very new to the Jetson environment and to video signal processing, so if you have any resources on how to achieve my goal, I would appreciate it 🙏

Hi

I don’t recommend upgrading GStreamer versions unless you also rebuild NVIDIA’s elements (most of them now have their source code in the L4T sources). Otherwise all your nv* elements will probably fail during plugin discovery.
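A quick way to check whether the NVIDIA elements survived an upgrade is gst-inspect-1.0. A sketch (the registry file name is an assumption; on an aarch64 Jetson it is typically registry.aarch64.bin):

```shell
# Check that the NVIDIA decoder element is still discovered
gst-inspect-1.0 nvv4l2decoder

# If discovery fails after an upgrade, clear the plugin registry
# cache so GStreamer re-scans all plugins, then retry
rm -f ~/.cache/gstreamer-1.0/registry.*.bin
gst-inspect-1.0 nvv4l2decoder
```

If the second inspect still fails, the plugin binaries genuinely do not load against the new GStreamer version and need to be rebuilt.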

The quickest way to get what you want while keeping GStreamer 1.20 is to embed nvoverlaysink into a Qt native window. Your pipeline would look something like this:

uridecodebin3 uri="https://..." ! nvoverlaysink

Uridecodebin should select the hardware decoder by default according to its rank, along with any other needed elements like converters and depayloaders. You can check exactly what gets selected by exporting the pipeline graph.

You can use this pipeline in Qt via the GstVideoOverlay interface, setting the window with gst_video_overlay_set_window_handle. This should display frames in your Qt window without them leaving GPU memory.


We also have a product that overlays a Qt GUI (from a QML) over an existing GStreamer stream that you can then display or save without additional memory copies and without leaving the pipeline. Here is the link if you want to check it out:


Hi,
We don’t work on Qt, so we are not sure whether there is an overlay you can make use of. This would need other users to check and share their experience. If you would like to manually upgrade the GStreamer version, you can refer to
Accelerated GStreamer — NVIDIA Jetson Linux Developer Guide

Do you happen to know what the usual way of working is for my scenario?

As far as I understand your suggestion, it is to render the video stream in a separate borderless window and then overlay it at the desired position in the GUI?

Somebody before me must have had the problem of embedding a hardware-accelerated video stream into Qt. I’m wondering what the best approach is here: upgrade GStreamer and hope for the best, write my own element to give Qt access to GPU memory, or something else entirely.

You are right, that’s a pretty common problem on Jetson; that’s why we developed an element to solve it, since a lot of clients needed a similar interaction between GStreamer and a Qt GUI. Let me clarify what I meant, because I think there was a misunderstanding. There are two optimized ways to solve this issue:

  1. Render GStreamer buffers in a Qt widget or window from Qt. This would be Qt native and more versatile.
  2. Render the Qt GUI as an overlay over the original buffer without leaving GStreamer. This would be GStreamer native and more efficient.

For my GstVideoOverlay suggestion, it is not necessary to create a separate window to render GStreamer into. The typical pattern is:

  • In Qt, create a window or widget where you want to render the buffers
  • Get its native window handle (winId())
  • Pass it to GStreamer with gst_video_overlay_set_window_handle
  • Use an accelerated sink to render, like nvoverlaysink, nveglglessink or nvdrmvideosink
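The steps above can be sketched in Qt/C++ roughly as follows. This is a minimal sketch, not a tested implementation: it assumes GStreamer 1.20 on Jetson, an X11 window system, and that nveglglessink (with nvegltransform in front of it, as is common for NVMM buffers) accepts the window handle; error handling and the bus sync handler for the "prepare-window-handle" message, which is the more robust way to set the handle, are omitted:

```cpp
#include <QApplication>
#include <QWidget>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);
    QApplication app(argc, argv);

    // The widget whose native window will receive the video frames
    QWidget window;
    window.resize(1280, 720);
    window.show();

    // Hardware-decoded HLS pipeline ending in an accelerated sink
    GstElement *pipeline = gst_parse_launch(
        "uridecodebin3 uri=https://... ! nvegltransform ! "
        "nveglglessink name=sink", nullptr);
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");

    // Hand the Qt widget's native handle to the sink so it
    // renders directly into the widget instead of its own window
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink),
                                        window.winId());

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    int ret = app.exec();

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(sink);
    gst_object_unref(pipeline);
    return ret;
}
```

Setting the handle before the pipeline reaches PLAYING, as done here, usually works; listening for the "prepare-window-handle" bus message is the pattern the GstVideoOverlay documentation recommends when it does not.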

However, most of our clients do it the other way around with qtoverlay, because it is more optimized that way and you can still pipe the composited buffer (video plus GUI) to something else within GStreamer (recording, RTSP, WebRTC, …). The trade-off with qtoverlay is that interaction with the GUI is more restricted.


I managed to achieve what I wanted. Thank you for your help.
