How do I pass GStreamer an NvBuffer as a source for omxh265enc?

I’m currently pulling frames off a camera using Argus and have created an NvBuffer with the code below:

    EGLStream::NV::IImageNativeBuffer* const iNativeBuffer =
            Argus::interface_cast<EGLStream::NV::IImageNativeBuffer>(image);
    const int fd = iNativeBuffer->createNvBuffer(
            _iCameraSensorMode->getResolution(),
            NvBufferColorFormat_NV12,
            NvBufferLayout_Pitch);

Given the file descriptor of the NvBuffer, how do I pass that data into GStreamer as the source for my RTSP pipeline? I have set up the source caps to accept NVMM NV12 images like so:

    /* set the caps on the source */
    GstCaps *caps = gst_caps_new_simple(
            "video/x-raw(memory:NVMM)",
            "format", G_TYPE_STRING, "NV12",
            "width", G_TYPE_INT, _config_width,
            "height", G_TYPE_INT, _config_height,
            "framerate", GST_TYPE_FRACTION, _config_fps, 1,
            NULL);

Currently I push data into GStreamer by allocating a buffer of the image's byte size and copying into it:

    buffer = gst_buffer_new_allocate(NULL, dataSize, NULL);
    /* map for writing, since the frame data is copied into the new buffer */
    if (gst_buffer_map(buffer, &map, GST_MAP_WRITE))
    {
        memcpy((guchar *) map.data, (guchar *) data, dataSize);
        gst_buffer_unmap(buffer, &map);

        gst_frame_buffer_info_meta = (GstFrameBufferInfoMeta *) gst_buffer_add_meta(buffer,
                                                                                    GST_BUFFER_INFO_META_INFO,
                                                                                    NULL);
        // Add my metadata

        static GstClockTime timestamp = 0;

        GST_BUFFER_PTS (buffer) = timestamp;
        GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int(1, GST_SECOND, 30);
        timestamp += GST_BUFFER_DURATION (buffer);

        g_signal_emit_by_name((GstAppSrc *) app.appsrc, "push-buffer", buffer, &ret);
    }

But I want to change this so that my GStreamer source accepts the NvBuffer directly and passes it straight into omxh265enc.

Hi,
You can use the appsrc plugin and wrap the NvBuffer into an NVMM buffer there. The omx plugins are deprecated; please use nvv4l2h265enc instead.
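For reference, here is a minimal sketch (not from the original reply) of an appsrc-to-nvv4l2h265enc pipeline built with gst_parse_launch(). The element chain, the "mysrc"/"pay0" names, and the 1920x1080@30 caps are assumptions chosen to resemble a typical RTSP payloading setup:

    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        /* appsrc advertises NVMM NV12 caps so nvv4l2h265enc can negotiate
         * hardware buffers directly; h265parse/rtph265pay are typical for RTSP. */
        GError *error = NULL;
        GstElement *pipeline = gst_parse_launch(
                "appsrc name=mysrc ! "
                "video/x-raw(memory:NVMM),format=NV12,width=1920,height=1080,framerate=30/1 ! "
                "nvv4l2h265enc ! h265parse ! rtph265pay name=pay0 pt=96",
                &error);
        if (!pipeline) {
            g_printerr("Failed to build pipeline: %s\n", error->message);
            g_clear_error(&error);
            return -1;
        }

        GstElement *appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "mysrc");
        g_object_set(G_OBJECT(appsrc),
                     "is-live", TRUE,
                     "format", GST_FORMAT_TIME,
                     NULL);

        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        /* ... push buffers from the Argus consumer thread, then shut down ... */
        gst_object_unref(appsrc);
        gst_object_unref(pipeline);
        return 0;
    }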

If you use a JetPack 4 release, please refer to this sample for using appsrc:
Opencv gpu mat into GStreamer without downloading to cpu - #15 by DaneLLL
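
For readers without the sample at hand, the following is a rough sketch of the approach it takes, assuming JetPack 4's nvbuf_utils.h API; the wrap_nvbuffer()/WrappedNvBuffer/notify_to_destroy names are illustrative and this is not the exact sample code. The idea is to wrap the NvBuffer's hardware descriptor in a GstBuffer instead of copying pixel data:

    #include <gst/gst.h>
    #include <nvbuf_utils.h>
    #include <string.h>

    typedef struct {
        int dmabuf_fd;     /* NvBuffer dmabuf handle, e.g. from createNvBuffer() */
        gpointer wrapped;  /* copy of the NvBuffer descriptor owned by the GstBuffer */
    } WrappedNvBuffer;

    /* Runs when the GstBuffer is released downstream. */
    static void notify_to_destroy(gpointer user_data)
    {
        WrappedNvBuffer *w = (WrappedNvBuffer *) user_data;
        NvBufferDestroy(w->dmabuf_fd);  /* skip this if you recycle the NvBuffer */
        g_free(w->wrapped);
        g_free(w);
    }

    /* Wrap an existing NvBuffer dmabuf fd for pushing into appsrc with
     * video/x-raw(memory:NVMM) caps. */
    static GstBuffer *wrap_nvbuffer(int dmabuf_fd)
    {
        NvBufferParams par;
        NvBufferGetParams(dmabuf_fd, &par);

        WrappedNvBuffer *w = g_new0(WrappedNvBuffer, 1);
        w->dmabuf_fd = dmabuf_fd;
        w->wrapped = g_malloc(par.nv_buffer_size);

        GstBuffer *buffer = gst_buffer_new_wrapped_full(
                (GstMemoryFlags) 0, w->wrapped, par.nv_buffer_size,
                0, par.nv_buffer_size, w, notify_to_destroy);

        /* With NVMM caps the buffer payload carries the NvBuffer descriptor
         * itself, not the NV12 pixel planes. */
        GstMapInfo map;
        gst_buffer_map(buffer, &map, GST_MAP_WRITE);
        memcpy(map.data, par.nv_buffer, par.nv_buffer_size);
        gst_buffer_unmap(buffer, &map);

        return buffer;
    }

The returned buffer can then be timestamped and pushed with the same "push-buffer" signal shown in the question, so nvv4l2h265enc should receive the frame without a per-frame copy of the pixel data.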
