Would using GStreamer or tegra_multimedia_api to decode video be more efficient and increase throughput?


I have used GStreamer like this:

rtspsrc -> decodebin -> nvvidconv -> nvvideosink -> EGLImage

and now I want to use the multimedia API to improve performance.
My proposed solution is as follows:

rtspsrc -> rtph264depay -> appsink ->

and then use the multimedia API to decode the video as in the sample 00_video_decode -> fd to EGLImage. Is this solution feasible? Or is there any other solution to replace GStreamer?
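A rough sketch of that split, assuming the tegra_multimedia_api headers and classes from the samples tree; the callback wiring, buffer counts, and chunk size here are illustrative, not taken from a shipped sample, and error handling, capture-plane setup, and EGLImage export should follow 00_video_decode:

```cpp
// Sketch: rtspsrc ! rtph264depay ! h264parse ! appsink feeding NvVideoDecoder.
#include <cstring>
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include "NvVideoDecoder.h"   // tegra_multimedia_api

static NvVideoDecoder *dec;

static GstFlowReturn feed_decoder(GstAppSink *sink, gpointer)
{
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *gstbuf = gst_sample_get_buffer(sample);
    GstMapInfo map;
    gst_buffer_map(gstbuf, &map, GST_MAP_READ);

    // After the initial buffers have been primed with getNthBuffer() as in
    // 00_video_decode, dequeue an empty encoded-stream buffer, fill it with
    // the depayloaded H.264 data, and queue it back to the decoder.
    struct v4l2_buffer v4l2_buf = {};
    struct v4l2_plane planes[MAX_PLANES] = {};
    NvBuffer *nvbuf = NULL;
    v4l2_buf.m.planes = planes;
    if (dec->output_plane.dqBuffer(v4l2_buf, &nvbuf, NULL, -1) == 0) {
        memcpy(nvbuf->planes[0].data, map.data, map.size);
        nvbuf->planes[0].bytesused = map.size;
        v4l2_buf.m.planes[0].bytesused = map.size;
        dec->output_plane.qBuffer(v4l2_buf, NULL);
    }

    gst_buffer_unmap(gstbuf, &map);
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}

int main(int argc, char **argv)
{
    gst_init(&argc, &argv);

    dec = NvVideoDecoder::createVideoDecoder("dec0");
    dec->setOutputPlaneFormat(V4L2_PIX_FMT_H264, 4 << 20);
    dec->output_plane.setupPlane(V4L2_MEMORY_MMAP, 10, true, false);
    dec->output_plane.setStreamStatus(true);

    // Build "rtspsrc location=... ! rtph264depay ! h264parse ! appsink",
    // attach feed_decoder via the appsink "new-sample" signal
    // (emit-signals=true), and run a capture-plane thread as in
    // 00_video_decode to obtain the decoded fd for EGLImage import.
    return 0;
}
```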


Hi ClancyLian,
You can use NvBuffer APIs to get fd in appsink.

/**
 * This method must be used to extract the dmabuf_fd of the hardware buffer.
 * @param[in] nvbuf Specifies the `hw_buffer`.
 * @param[out] dmabuf_fd Returns the DMABUF FD of `hw_buffer`.
 * @returns 0 for success, -1 for failure.
 */
int ExtractFdFromNvBuffer (void *nvbuf, int *dmabuf_fd);

/**
 * This method must be used to release a dmabuf_fd
 * obtained using the ExtractFdFromNvBuffer API.
 * @param[in] dmabuf_fd Specifies the `dmabuf_fd` to release.
 * @returns 0 for success, -1 for failure.
 */
int NvReleaseFd (int dmabuf_fd);
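In the appsink callback, the mapped data of an NVMM buffer can be passed to ExtractFdFromNvBuffer(). A minimal sketch, assuming the appsink caps are set to `video/x-raw(memory:NVMM)` upstream; anything beyond the documented nvbuf_utils and GStreamer calls is illustrative:

```cpp
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include "nvbuf_utils.h"

// Callback for the appsink "new-sample" signal; the upstream pipeline is e.g.
// rtspsrc ! decodebin ! nvvidconv ! video/x-raw(memory:NVMM) ! appsink
static GstFlowReturn on_new_sample(GstAppSink *sink, gpointer)
{
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buf = gst_sample_get_buffer(sample);
    GstMapInfo map;
    gst_buffer_map(buf, &map, GST_MAP_READ);

    int dmabuf_fd = -1;
    if (ExtractFdFromNvBuffer((void *)map.data, &dmabuf_fd) == 0) {
        // dmabuf_fd now refers to the decoded hardware buffer; it can be
        // imported as an EGLImage (e.g. via NvEGLImageFromFd) for GPU access.
        NvReleaseFd(dmabuf_fd);  // on r32.x this call must be dropped (see below)
    }

    gst_buffer_unmap(buf, &map);
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
```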

Please refer to attached test3.cpp

$ export MMAPI_INCLUDE=/home/nvidia/tegra_multimedia_api/include
$ export MMAPI_CLASS=/home/nvidia/tegra_multimedia_api/samples/common/classes
$ export USR_LIB=/usr/lib/aarch64-linux-gnu
$ g++ -Wall -std=c++11  test3.cpp -o test3 $(pkg-config --cflags --libs gstreamer-app-1.0) -ldl -I$MMAPI_INCLUDE /usr/lib/aarch64-linux-gnu/tegra/ $MMAPI_CLASS/NvEglRenderer.o $MMAPI_CLASS/NvElement.o $MMAPI_CLASS/NvElementProfiler.o $MMAPI_CLASS/NvLogging.o $USR_LIB/ $USR_LIB/ $USR_LIB/
$ export DISPLAY=:0
$ ./test3

Test video is attached (1.47 KB)

Hi, DaneLLL,

You may have misunderstood my question. GStreamer is a high-level API and the multimedia API is a lower-level one. With GStreamer I used decodebin to decode my video, and the demo you gave me also uses decodebin. My idea is to use GStreamer only to get the encoded data, and then use NvVideoDecoder to decode the video. Would that be more efficient?

Your demo is also a solution, but I want to improve my throughput, or make decoding more efficient.


Hi ClancyLian,
The decodebin is equivalent to ‘qtdemux ! h264parse ! omxh264dec’. omxh264dec uses the same HW engine as NvVideoDecoder. Your proposal is also good to go.
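For reference, the same expansion can be written out explicitly with gst-launch-1.0; the file name and RTSP URL below are placeholders:

```shell
# decodebin on an MP4 file expands to roughly:
$ gst-launch-1.0 filesrc location=sample.mp4 ! qtdemux ! h264parse ! omxh264dec ! nvoverlaysink

# the explicit equivalent for the RTSP case in this thread:
$ gst-launch-1.0 rtspsrc location=rtsp://<server>/<stream> ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! nvoverlaysink
```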

Hi DaneLLL,

The link for the attachment doesn’t work. Any way to put it back up?


I can download and unzip it. Perhaps your browser is blocking the download?

I tried several browsers. It works today in the browser I originally tried it in.


I compiled “test3.cpp” on a Jetson TX2 and it works well,
but it doesn’t work on a Jetson Nano.
I get the following error message:

nvbuf_utils: dmabuf_fd 1125 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…

I think the buffer release via “NvReleaseFd(dmabuf_fd)” is failing.
Does anyone know a fix for this error?


Hi fujigen884,
On r32.1, you do not need to call NvReleaseFd(). Please remove the line and try again.
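If the same source must build both on the r28-era BSP (TX2, where the thread reports the release call working) and on r32 (Nano), one option is to guard the call; the macro name here is illustrative and would be defined by your build system:

```cpp
// nvbuf_utils on L4T r32.x tracks the fd mapping internally; calling
// NvReleaseFd() there produces "dmabuf_fd ... mapped entry NOT found".
#ifdef L4T_R28   // illustrative: define only for r28-based releases
    NvReleaseFd(dmabuf_fd);
#endif
```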

Hi DaneLLL,

I removed “NvReleaseFd()” and it works well now!
Thanks for your advice!!


Even logged in I cannot download the test file provided; I get an access-denied error. Is there a way to download the sample code?

Best regards

We are checking why the link is broken.

There is one more sample:

Please take a look.

DaneLLL, thanks for the other sample — I was able to download it.

In case it helps, when I try to download, I am redirected to here and get the error message:

    <Message>Access Denied</Message>

Edit: I tried to download using Chromium, Chromium incognito, and Firefox. All three gave me the same error, but with different RequestId and HostId values.

Hi D3 team,
We are checking why the link is broken. Will update.

For r32 releases, please refer to the sample and manually change tegra_multimedia_api to jetson_multimedia_api.

Hi all, any news about the possibility to download?
Oleg K.