NVBuffer to nveglstreamsrc

I am trying to modify the 13_multi_camera sample from the Jetson Multimedia API to replace the NVEglRenderer with a GStreamer pipeline. I have successfully run the gstVideoEncode sample and modified it to run without X11, following the "EGL without X11" thread, as running without X11 is also a requirement for my application.

I want to feed m_compositedFrame into a GStreamer pipeline.

        CONSUMER_PRINT("Render frame %d\n", g_frame_count - m_framesRemaining);
        if (m_streams.size() > 1)
            /* Composite multiple input to one frame */
            NvBufferComposite(m_dmabufs, m_compositedFrame, &m_compositeParam);

I have also tried modifying simpleEGLStreams_producer to work for my application but have had no success so far.

Any push in the right direction would be appreciated, as the MMAPI is still quite new to me; I am far more familiar with accelerated GStreamer on Jetson for video pipelines.

Thank you.

A working solution is to use the NvBuffer APIs and run a GStreamer pipeline such as:

        appsrc ! video/x-raw(memory:NVMM) ! ...

Please refer to this sample:
Creating a GStreamer source that publishes to NVMM - #7 by DaneLLL
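As a rough sketch, a complete pipeline of that shape might look like the following. The caps values and the downstream elements (nvvidconv, nvv4l2h264enc, the muxer and sink) are illustrative assumptions, not taken from the linked sample; adjust format, resolution, and framerate to match the NvBuffers you actually push from appsrc:

        appsrc name=nvmm_src is-live=true format=time !
          video/x-raw(memory:NVMM),format=NV12,width=1920,height=1080,framerate=30/1 !
          nvvidconv ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=out.mkv

The key point is the (memory:NVMM) caps on appsrc: the buffers you push are hardware DMA buffers rather than CPU-copied frames, so the composited NvBuffer stays in NVMM memory all the way to the encoder.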

Great! I did not know you could use appsrc with NVMM. Your example worked perfectly, and I was able to make it work with the multi_camera sample with no problem. I'll post the complete working multi_camera example, modified to pass frames to a GStreamer pipeline, soon.