ExtractFdFromNvBuffer fails with NVMAP_IOC_GET_FD failed: Invalid argument

Hello, we are trying to get the NVMM memory buffer from an AppSink in a GStreamer pipeline. We saw in the Jetson forums that a suggested method is to call ExtractFdFromNvBuffer on the gst-mapped buffer to get the NVMM file descriptor. We did so, but the function fails with the following error:
NVMAP_IOC_GET_FD failed: Invalid argument

The return value is 0 (it should be -1 on failure, according to the docs) and the file descriptor is set to -24. These are the caps of the samples received by the AppSink:

video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1920, height=(int)1080, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1

The mapped gst buffer has the "size" field set to 64 and the "data" field set to what seems to be a pointer.

The environment is an Orin AGX Dev Kit in headless mode. The same issue persists with a monitor plugged in.
The JetPack version is 5.0.2.

Can you help us with this issue?
Thanks!

I’ll paste the code, which is a sample taken from the forums, with additional logs added:

#include <cstdlib>
#include <unistd.h>          // for sleep()
#include <gst/gst.h>
#include <gst/gstinfo.h>
#include <gst/app/gstappsink.h>
#include <glib-unix.h>
#include <dlfcn.h>

#include <iostream>
#include <sstream>
#include <thread>

#include "NvEglRenderer.h"
#include "nvbuf_utils.h"

using namespace std;

#define USE(x) ((void)(x))

static GstPipeline* gst_pipeline = nullptr;
static string launch_string;
static int frame_count = 0;
static int sleep_count = 0;
static int eos = 0;
static NvEglRenderer* renderer;

static void appsink_eos(GstAppSink* appsink, gpointer user_data)
{
    printf("app sink receive eos\n");
    eos = 1;
    //    g_main_loop_quit (hpipe->loop);
}

static GstFlowReturn new_buffer(GstAppSink* appsink, gpointer user_data)
{
    GstSample* sample = NULL;

    g_signal_emit_by_name(appsink, "pull-sample", &sample, NULL);

    if (sample)
    {
        GstBuffer* buffer = NULL;
        GstCaps* caps = NULL;
        GstMapInfo map = { 0 };
        int dmabuf_fd = 0;

        caps = gst_sample_get_caps(sample);
        if (!caps)
        {
            printf("could not get snapshot format\n");
        }
        gst_caps_get_structure(caps, 0);
        buffer = gst_sample_get_buffer(sample);
        gst_buffer_map(buffer, &map, GST_MAP_READ);

        std::cout << "New sample received" << std::endl;

        int res = ExtractFdFromNvBuffer((void*)map.data, &dmabuf_fd);
        
        std::cout << "Sample Caps: " << gst_caps_to_string(caps) << std::endl;
        std::cout << "Buffer mapping, 'size': " << map.size << " 'data': " << (void*)map.data << " 'flags': " << map.flags << " 'maxsize': " << map.maxsize << std::endl;
        std::cout << "ExtractFdFromNvBuffer res:" << res << " dmabuf_fd: " << dmabuf_fd << std::endl;
        std::cout << std::endl;

        //renderer->render(dmabuf_fd);

        frame_count++;

        gst_buffer_unmap(buffer, &map);

        gst_sample_unref(sample);
    }
    else
    {
        g_print("could not make snapshot\n");
    }

    return GST_FLOW_OK;
}

int main(int argc, char** argv) {
    USE(argc);
    USE(argv);

    gst_init(&argc, &argv);

    GMainLoop* main_loop;
    main_loop = g_main_loop_new(NULL, FALSE);
    ostringstream launch_stream;
    GstAppSinkCallbacks callbacks = { appsink_eos, NULL, new_buffer };

    launch_stream  << "filesrc location=/home/va/test.mp4 ! decodebin ! appsink name=mysink ";

    launch_string = launch_stream.str();

    g_print("Using launch string: %s\n", launch_string.c_str());

    GError* error = nullptr;
    gst_pipeline = (GstPipeline*)gst_parse_launch(launch_string.c_str(), &error);

    if (gst_pipeline == nullptr) {
        g_print("Failed to parse launch: %s\n", error->message);
        return -1;
    }
    if (error) g_error_free(error);

    GstElement* appsink_ = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysink");
    gst_app_sink_set_callbacks(GST_APP_SINK(appsink_), &callbacks, NULL, NULL);

    //renderer = NvEglRenderer::createEglRenderer("renderer0",w, h, 0, 0);
    //renderer->setFPS(24);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING);

    while (eos == 0) {
        sleep(1);
        sleep_count++;
    }
    //sleep(90);
    //g_main_loop_run (main_loop);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(gst_pipeline));
    g_main_loop_unref(main_loop);

    //delete renderer;

    g_print("going to exit, decode %d frames in %d seconds \n", frame_count, sleep_count);
    return 0;
}

and those are the logs printed:

Using launch string: filesrc location=/home/va/test.mp4 ! decodebin ! appsink name=mysink
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
New sample received
NVMAP_IOC_GET_FD failed: Bad file descriptor
Sample Caps: video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1920, height=(int)1080, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1
Buffer mapping, 'size': 64 'data': 0xffff8015feb0 'flags': 1 'maxsize': 2088960
ExtractFdFromNvBuffer res:0 dmabuf_fd: -24

New sample received
NVMAP_IOC_GET_FD failed: Invalid argument
Sample Caps: video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1920, height=(int)1080, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1
Buffer mapping, 'size': 64 'data': 0xffff80169ad0 'flags': 1 'maxsize': 2088960
ExtractFdFromNvBuffer res:0 dmabuf_fd: -24

(The same block of output repeats for every subsequent frame: "New sample received", "NVMAP_IOC_GET_FD failed: Invalid argument" (occasionally "Bad file descriptor"), the same sample caps, a mapped 'size' of 64, and "ExtractFdFromNvBuffer res:0 dmabuf_fd: -24".)

We solved the issue by looking at the source code of nvvidconv. There you can see that you have to cast the pointer returned by gst_buffer_map to an NvBufSurface*. Through this pointer you can access the memory and its properties.
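For reference, here is a minimal sketch of how the new_buffer() callback from the sample above can be adapted. Treat it only as a sketch, under the assumption that the nvbufsurface.h header shipped with JetPack 5.x is available; error handling is kept to a minimum:

#include <gst/app/gstappsink.h>
#include <iostream>
#include "nvbufsurface.h"   // NvBufSurface / NvBufSurfaceParams (JetPack 5.x)

static GstFlowReturn new_buffer(GstAppSink* appsink, gpointer user_data)
{
    GstSample* sample = gst_app_sink_pull_sample(appsink);
    if (!sample)
        return GST_FLOW_OK;

    GstBuffer* buffer = gst_sample_get_buffer(sample);
    GstMapInfo map = { 0 };

    if (gst_buffer_map(buffer, &map, GST_MAP_READ))
    {
        // On JetPack 5.x the mapped NVMM data is an NvBufSurface, not a raw
        // NvBuffer, so cast it instead of calling ExtractFdFromNvBuffer().
        NvBufSurface* surf = (NvBufSurface*)map.data;
        int dmabuf_fd = (int)surf->surfaceList[0].bufferDesc;

        std::cout << "dmabuf_fd: " << dmabuf_fd << std::endl;

        gst_buffer_unmap(buffer, &map);
    }

    gst_sample_unref(sample);
    return GST_FLOW_OK;
}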
Hope this helps someone in the future :)

Glad to know the issue is fixed, thanks for sharing!

Hello @dl1 ,

Could you please share more information?

This seems like the same problem jetson-utils has now.

I tried to change the jetson-utils code to use NVMM, and I also get this error:

NVMAP_IOC_GET_FD failed: Invalid argument

Hello,
What are you trying to do? If you share a bit of the source code, I'll try to help.

To give you more information: we found that if you want to access the NVMM memory surface decoded by a GStreamer pipeline, you have to change the code from this:

GstMapInfo info= { 0 };
int dmabuf_fd = 0;

gst_buffer_map(buffer, &info, GST_MAP_READ);

int res = ExtractFdFromNvBuffer((void*)info.data, &dmabuf_fd);
//the FD for the DMABUF_ID is in dmabuf_fd

to this:

GstMapInfo info = {0};

gst_buffer_map(buffer, &info, GST_MAP_READ);

NvBufSurface* surf = (NvBufSurface*)info.data;

//do whatever you want with surf.
//the FD for the DMABUF_ID is in surf->surfaceList[0].bufferDesc
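
As an additional, hedged sketch (field names taken from the NvBufSurface API in nvbufsurface.h; the helper name is mine), this is roughly how you can read the basic properties of the first surface once you have the mapped buffer:

#include <gst/gst.h>
#include <iostream>
#include "nvbufsurface.h"

// Hypothetical helper: print the geometry/format of the first surface in a
// mapped NVMM buffer.
static void print_surface_info(const GstMapInfo& info)
{
    NvBufSurface* surf = (NvBufSurface*)info.data;
    NvBufSurfaceParams& p = surf->surfaceList[0];

    std::cout << "width: "        << p.width
              << " height: "      << p.height
              << " pitch: "       << p.pitch
              << " colorFormat: " << p.colorFormat              // e.g. NVBUF_COLOR_FORMAT_NV12
              << " planes: "      << p.planeParams.num_planes
              << " dmabuf fd: "   << (int)p.bufferDesc << std::endl;
}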

I briefly checked the jetson-utils repo and found some lines that need to be updated with the above changes. You can find them at jetson-utils/gstBufferManager.cpp at 35593c5ad2c1c62f6f0166d6581945fb58fd1f7b · dusty-nv/jetson-utils · GitHub
Here the NVMM buffer is accessed the old way. I suppose things have changed with Orin and the new JetPack releases.
Since I don't know jetson-utils, I don't know whether the code needs to be updated anywhere else.
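
If the code has to keep working on both JetPack 4.x and 5.x, one option is a build-time switch; this is only a sketch, and the USE_NVBUFSURFACE flag here is hypothetical (jetson-utils may well handle it differently):

// Requires nvbufsurface.h on JetPack 5.x and nvbuf_utils.h on JetPack 4.x.
// USE_NVBUFSURFACE is a hypothetical build flag, not something jetson-utils defines.
int dmabuf_fd = -1;
#ifdef USE_NVBUFSURFACE
    // JetPack 5.x: the mapped NVMM data is an NvBufSurface.
    NvBufSurface* surf = (NvBufSurface*)map.data;
    dmabuf_fd = (int)surf->surfaceList[0].bufferDesc;
#else
    // JetPack 4.x: legacy nvbuf_utils path.
    if (ExtractFdFromNvBuffer((void*)map.data, &dmabuf_fd) != 0)
        printf("ExtractFdFromNvBuffer() failed\n");
#endif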

Let me know if it helps.

Thanks for your reply.

Exactly this part.

How do I get the FD from the NvBufSurface?
Just by using surf->surfaceList[0].bufferDesc?

Yeah, the FD is in the "bufferDesc" field of the NvBufSurfaceParams struct. Keep in mind that there could be more than one element in the surfaceList array; the exact number is in the "numFilled" field of the NvBufSurface. I think that if the NvBufSurface comes from the Jetson decoder you can assume it contains an NV12 image, so the surfaceList array should always have one element.
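
As a sketch of the general (batched) case, iterating up to numFilled looks roughly like this (again assuming nvbufsurface.h from JetPack 5.x):

NvBufSurface* surf = (NvBufSurface*)info.data;

// In the general case the surface can hold a batch of frames; numFilled
// says how many entries of surfaceList are actually valid.
for (uint32_t i = 0; i < surf->numFilled; ++i)
{
    int fd = (int)surf->surfaceList[i].bufferDesc;
    std::cout << "surface " << i << " dmabuf fd: " << fd << std::endl;
}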

I’ll link the doc page for NvBufSurface:
https://docs.nvidia.com/metropolis/deepstream/sdk-api/structNvBufSurface.html

Thanks for your help!

I have edited the jetson-utils code, and NVMM works now.
Here is the change point:

Thanks everyone, I will test out this patch for jetson-utils and then merge it. Appreciate it.

@dusty_nv Hi,
Here are some other test results for your information:

If you do not add sync=0 to the appsink, it is very likely to grab duplicated images when the input is an RTSP stream. Maybe because the RTSP stream does not arrive at exactly the expected time points?

ss << "appsink name=mysink";

I have tested with 2k/4k/8k streams, and all of them may grab duplicated frames. Larger sizes seem to produce duplicates more often.

And if I change it to ss << "appsink name=mysink sync=0";, the duplicated frames disappear or are reduced (2k: disappeared, 8k: reduced).
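
For context, this is roughly the kind of appsink configuration involved; the surrounding elements are taken from the pipelines earlier in this thread, and the extra drop/max-buffers properties are a common pattern for live sources rather than part of the jetson-utils change itself:

std::ostringstream ss;
ss << "rtspsrc protocols=tcp location=rtsp://192.168.2.1/live_stream latency=0 ! ";
ss << "rtph265depay ! h265parse ! nvv4l2decoder enable-max-performance=1 ! ";
// sync=false: do not wait on the pipeline clock before handing the frame to the app.
// drop=true / max-buffers=1: prefer dropping stale frames over queueing them.
ss << "appsink name=mysink sync=false drop=true max-buffers=1";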

Thanks @Up2U, I’ve made a note to try this out as well - certainly I want to avoid duplication of frames and any extra processing. I’m curious if this could also have been happening with other video sources as well (like CSI or V4L2 cameras). Regardless, I’m glad you noticed!

Hi @dusty_nv ,

I have done more tests with RTSP streams.
I recorded some video clips from the monitor and played them back frame by frame in a PC media player to check for duplicated frames.

  1. 2k/4k/8k streams, using a PC (FFMPEG demux + nvcodec + OpenGL display): no duplicated frames.

  2. The same streams (2k), using Jetson Orin with gst-launch-1.0 or video-viewer, with or without sync=0: there are duplicated frames. With sync=0 there are fewer duplicated frames.

gst-launch-1.0 rtspsrc protocols=tcp location=rtsp://192.168.2.1/live_stream latency=0 ! rtph265depay ! h265parse ! nvv4l2decoder enable-max-performance=1 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink

gst-launch-1.0 rtspsrc protocols=tcp location=rtsp://192.168.2.1/live_stream latency=0 ! rtph265depay ! h265parse ! nvv4l2decoder enable-max-performance=1 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink sync=0

  3. The same streams (8k), using Jetson Orin, without sync=0: frame drops occur severely.

With sync=0, frame drops are reduced, but there are still duplicated frames.

And I found that even when the source is a local file, there are duplicated frames too if sync=0 is not set.

The sample is the one @DaneLLL provided last week (Question about nvv4l2decoder element - #5 by DaneLLL):

gst-launch-1.0 filesrc location= /opt/nvidia/deepstream/deepstream-6.1/samples/streams/sample_1080p_h265.mp4 ! qtdemux ! queue ! h265parse ! nvv4l2decoder ! nv3dsink -e

with sync=0: (recorded clip in the original post)

without sync=0: (recorded clip in the original post)
