Camera capture with libargus and pushing frames to appsrc

Hello everyone,

I’m trying to build an application that captures frames from 2 cameras synchronously with the libargus API and pushes them to a GStreamer pipeline like:
appsrc name=src ! video/x-raw(memory:NVMM), width=3840, height=2160, framerate=30/1, format=RGBA ! nvv4l2h264enc ! h264parse ! mp4mux ! filesink location=output.mp4

I’m trying to avoid GPU-CPU copies in this process, but the code I wrote doesn’t work. I can get a dmabuf fd from the libargus API, but how can I push it to the GStreamer pipeline properly?

I don’t want to use the other examples in jetson_multimedia_api because I’m going to change and extend this GStreamer pipeline, so I need the flexibility. Thank you all!

Here is my code (it doesn’t work):
main.txt (24.8 KB)
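
For context, the appsrc wiring itself needs only stock GStreamer calls; a minimal sketch of building this pipeline and fetching the appsrc element (error handling omitted, appsrc properties are the standard ones):

    #include <gst/gst.h>

    GstElement *pipeline, *appsrc;
    GError *error = NULL;

    gst_init(NULL, NULL);
    pipeline = gst_parse_launch(
        "appsrc name=src ! video/x-raw(memory:NVMM), width=3840, height=2160, "
        "framerate=30/1, format=RGBA ! nvv4l2h264enc ! h264parse ! mp4mux ! "
        "filesink location=output.mp4", &error);

    /* appsrc was named "src" in the pipeline string. */
    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    g_object_set(G_OBJECT(appsrc),
                 "format", GST_FORMAT_TIME, /* buffer PTS are running time */
                 "is-live", TRUE,
                 NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);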

hello sarperyurttas36,

your real use-case is frame synchronization, right?
please refer to the Argus sample syncSensor; it demonstrates a software-based approach to frame synchronization.
This example uses the multiple-sensors-per-single-capture-session method, which means it duplicates a single capture request across the camera sensors.
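
A minimal sketch of that structure, loosely following the syncSensor sample (interface names are from the public Argus headers; error handling and per-stream settings such as resolution are omitted):

    /* One capture session that owns both sensors. */
    std::vector<CameraDevice*> cameraDevices;
    ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
    iCameraProvider->getCameraDevices(&cameraDevices);

    std::vector<CameraDevice*> lrDevices;
    lrDevices.push_back(cameraDevices[0]);
    lrDevices.push_back(cameraDevices[1]);
    UniqueObj<CaptureSession> session(iCameraProvider->createCaptureSession(lrDevices));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    /* One EGL output stream per sensor, bound via setCameraDevice(). */
    UniqueObj<OutputStreamSettings> settings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IOutputStreamSettings *iSettings = interface_cast<IOutputStreamSettings>(settings);
    iSettings->setCameraDevice(lrDevices[0]);
    UniqueObj<OutputStream> leftStream(iSession->createOutputStream(settings.get()));
    iSettings->setCameraDevice(lrDevices[1]);
    UniqueObj<OutputStream> rightStream(iSession->createOutputStream(settings.get()));

    /* A single repeating request is duplicated to both sensors. */
    UniqueObj<Request> request(iSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);
    iRequest->enableOutputStream(leftStream.get());
    iRequest->enableOutputStream(rightStream.get());
    iSession->repeat(request.get());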

Hi,

The frame-capturing part is already the same as in the syncSensor example; however, I want to push those frames to a GStreamer pipeline with appsrc. I said I don’t want to use the other examples because they use nveglstreamsrc or NvVideoEncoder. I don’t want those methods; I want to use appsrc specifically for this purpose, and I think there is a way to do that, but I haven’t found it yet.

If someone checks my main.cpp file (I uploaded it as a txt), they can see what I’m trying to achieve.

Hi,
For feeding NvBufSurface to appsrc, please refer to the sample:
NvUtils NvBufSurface to gstreamer - #5 by DaneLLL

Hi,

I built and ran your example, and it works. However, when I applied the same approach in my code (reduced to capturing only one camera for the sake of simplicity), it did not work. I think I’m not converting the EGLStream::Frame to an NvBuffer properly.

This is the part where I read and push frames:

    bool ConsumerThread::threadExecute()
    {
        IEGLOutputStream *iEglOutputStream = interface_cast<IEGLOutputStream>(m_stream);
        IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);

        /* Wait until the producer has connected to the stream. */
        CONSUMER_PRINT("Waiting until producer is connected...\n");
        if (iEglOutputStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
        CONSUMER_PRINT("Producer has connected; continuing.\n");

        GstBuffer *buffer;
        GstFlowReturn ret;
        GstMapInfo map = {0};
        gpointer data = NULL, user_data = NULL;
        NvBufferParams par;
        GstMemoryFlags flags = (GstMemoryFlags)0;

        while (true)
        {
            Argus::Status argusStatus;

            /* Acquire a frame. */
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame(ACQUIRE_FRAME_TIMEOUT, &argusStatus));
            if (argusStatus != Argus::STATUS_OK)
            {
                printErrorStatus(argusStatus);
                errorStatus = true;
                break;
            }
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                break;

            /* Get the IImageNativeBuffer extension interface. */
            NV::IImageNativeBuffer *iNativeBuffer =
                interface_cast<NV::IImageNativeBuffer>(iFrame->getImage());
            if (!iNativeBuffer)
                ORIGINATE_ERROR("IImageNativeBuffer not supported by Image.");
            if (m_dmabuf == -1)
            {
                m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                         NVBUF_COLOR_FORMAT_RGBA,
                                                         NVBUF_LAYOUT_PITCH);
                if (m_dmabuf == -1)
                    CONSUMER_PRINT("\tFailed to create NvBuffer\n");
            }
            else if (iNativeBuffer->copyToNvBuffer(m_dmabuf) != STATUS_OK)
            {
                ORIGINATE_ERROR("Failed to copy frame to NvBuffer.");
            }

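            /* Wrap the NvBuffer's metadata blob (not the pixel data) in a
             * GstBuffer so downstream NVMM-aware elements can locate the
             * hardware buffer; this follows the referenced sample. */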
            user_data = g_malloc(sizeof(int));
            GST_INFO("NvBufferCreate %d", m_dmabuf);
            *(int *)user_data = m_dmabuf;
            NvBufferGetParams(m_dmabuf, &par);
            data = g_malloc(par.nv_buffer_size);

            buffer = gst_buffer_new_wrapped_full(flags,
                                                 data,
                                                 par.nv_buffer_size,
                                                 0,
                                                 par.nv_buffer_size,
                                                 user_data,
                                                 notify_to_destroy);
            buffer->pts = timestamp;

            gst_buffer_map(buffer, &map, GST_MAP_WRITE);
            memcpy(map.data, par.nv_buffer, par.nv_buffer_size);
            gst_buffer_unmap(buffer, &map);

            g_signal_emit_by_name(m_source, "push-buffer", buffer, &ret);
            timestamp += 33333333;

            if (ret != GST_FLOW_OK)
            {
                ORIGINATE_ERROR("Error pushing gstBuffer to GStreamer pipeline");
                return false;
            }
        }

        CONSUMER_PRINT("Done.\n");

        requestShutdown();

        return true;
    }
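
The notify_to_destroy callback passed to gst_buffer_new_wrapped_full is not shown above; a minimal sketch of what it would do in this design, where the single dmabuf fd is reused for every frame (hypothetical body, matching the per-buffer user_data allocation):

    static void notify_to_destroy(gpointer user_data)
    {
        /* user_data holds the dmabuf fd recorded when the buffer was wrapped. */
        GST_INFO("buffer released, dmabuf fd %d", *(int *)user_data);

        /* The fd itself is reused across frames and destroyed at shutdown, so
         * only the per-buffer bookkeeping is freed here. The g_malloc'ed data
         * block handed to gst_buffer_new_wrapped_full would also need to be
         * tracked and freed to avoid a leak. */
        g_free(user_data);
    }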

This is my GStreamer pipeline:

appsrc name=src ! video/x-raw(memory:NVMM), width=3840, height=2160, framerate=30/1, format=RGBA ! nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! nvv4l2h264enc ! h264parse ! mp4mux ! filesink location=output.mp4

Output:

Set governor to performance before enabling profiler
Argus Version: 0.99.3.3 (multi-process)
Sensor mode: 0  width: 3840  height: 2160
Framerate: 30
PRODUCER: Creating left stream.
PRODUCER: Launching consumer thread
Gstreamer pipeline: appsrc name=src ! video/x-raw(memory:NVMM), width=3840, height=2160, framerate=30/1, format=RGBA ! nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! nvv4l2h264enc ! h264parse ! mp4mux ! filesink location=output
Opening in BLOCKING MODE 
CONSUMER: Waiting until producer is connected...
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
gst_nvvconv_transform: NvBufSurfTransform Failed 
PRODUCER: Captures complete, disconnecting producer.
CONSUMER: Argus::STATUS_DISCONNECTED 
CONSUMER: Done.
Embedded video playback halted; module src reported: Internal data stream error.
Error generated. /workspace/src/main.cpp, gstFinishPipeline:366 Cannot send EOS to GStreamer pipeline
Error generated. /workspace/src/main.cpp, threadShutdown:303 GStreamer pipeline coudln't finish properly
Error generated. /workspace/src/utils/Thread.cpp, threadFunction:135 (propagating)
PRODUCER: Done -- exiting.
************************************
Total Profiling Time = 0 sec
************************************

There is an error “gst_nvvconv_transform: NvBufSurfTransform Failed”.

main.txt (23.0 KB)

Hi,
NvBuffer APIs are deprecated on JetPack 5, so NvBufferGetParams() may fail. Please use the NvBufSurface APIs instead.
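
For reference, a minimal sketch of the JetPack 5 equivalent of the wrapping code above, assuming nvbufsurface.h and the same reused dmabuf fd; the authoritative version is in the sample linked earlier:

    #include "nvbufsurface.h"

    /* Look up the NvBufSurface behind the Argus-created dmabuf fd. */
    NvBufSurface *surf = NULL;
    if (NvBufSurfaceFromFd(m_dmabuf, (void **)&surf) != 0)
        ORIGINATE_ERROR("NvBufSurfaceFromFd failed.");

    /* Wrap a copy of the NvBufSurface struct instead of the deprecated
     * NvBufferParams::nv_buffer blob. */
    user_data = g_malloc(sizeof(int));
    *(int *)user_data = m_dmabuf;
    data = g_malloc(sizeof(NvBufSurface));
    buffer = gst_buffer_new_wrapped_full(flags, data,
                                         sizeof(NvBufSurface), 0,
                                         sizeof(NvBufSurface),
                                         user_data, notify_to_destroy);
    gst_buffer_map(buffer, &map, GST_MAP_WRITE);
    memcpy(map.data, surf, sizeof(NvBufSurface));
    gst_buffer_unmap(buffer, &map);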

Hi DaneLLL,
Thanks for your reply. You may be right, but I solved it by using the nvds buffer pool.
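
For anyone landing here later, a minimal sketch of that approach, assuming DeepStream's gstnvdsbufferpool.h is installed; the pool config keys ("memtype", "gpu-id", "batch-size") follow the DeepStream sources and should be verified for your release:

    #include <gst/gst.h>
    #include "gstnvdsbufferpool.h" /* DeepStream header */
    #include "nvbufsurface.h"

    /* One-time setup: a pool of NVMM buffers matching the appsrc caps. */
    GstBufferPool *pool = gst_nvds_buffer_pool_new();
    GstStructure *config = gst_buffer_pool_get_config(pool);
    GstCaps *caps = gst_caps_from_string(
        "video/x-raw(memory:NVMM), width=3840, height=2160, "
        "framerate=30/1, format=RGBA");
    gst_buffer_pool_config_set_params(config, caps, sizeof(NvBufSurface), 4, 4);
    gst_structure_set(config,
                      "memtype", G_TYPE_UINT, NVBUF_MEM_DEFAULT,
                      "gpu-id", G_TYPE_UINT, 0,
                      "batch-size", G_TYPE_UINT, 1, NULL);
    gst_buffer_pool_set_config(pool, config);
    gst_caps_unref(caps);
    gst_buffer_pool_set_active(pool, TRUE);

    /* Per frame: acquire a pooled buffer, copy the captured surface into it
     * (srcSurf comes from NvBufSurfaceFromFd on the Argus dmabuf fd), then
     * hand it to appsrc. */
    GstBuffer *buf = NULL;
    GstMapInfo map;
    gst_buffer_pool_acquire_buffer(pool, &buf, NULL);
    gst_buffer_map(buf, &map, GST_MAP_WRITE);
    NvBufSurfaceCopy(srcSurf, (NvBufSurface *)map.data);
    gst_buffer_unmap(buf, &map);
    g_signal_emit_by_name(m_source, "push-buffer", buf, &ret);
    gst_buffer_unref(buf);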
