Argus API Thread Safety on R28.1

Hi Folks,

I am processing YUV420 input from two cameras concurrently. I am also encoding/recording from each camera while feeding frames to an OpenCV pipeline, with the workload divided between the GPU and CPU. I am using the Argus library to interface with the cameras.
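For context, the structure is one capture thread per camera, each running its own acquire/encode/OpenCV loop. A stripped-down stdlib sketch of that threading layout (the names and the frame-count bookkeeping here are illustrative only; the real loop body is the Argus code further down):

```cpp
#include <atomic>
#include <thread>
#include <vector>

// Stand-in for the per-camera work: in the real pipeline each iteration
// acquires an Argus frame, hands it to the encoder, and runs the OpenCV
// stage (split between GPU and CPU).
static std::atomic<int> g_framesProcessed{0};

static void cameraLoop(int /*cameraIndex*/, int frameCount)
{
    for (int i = 0; i < frameCount; ++i)
        g_framesProcessed.fetch_add(1, std::memory_order_relaxed);
}

// Spawn one thread per camera and wait for all of them;
// returns the total number of frames handled.
static int runPipeline(int numCameras, int framesPerCamera)
{
    g_framesProcessed = 0;
    std::vector<std::thread> workers;
    for (int cam = 0; cam < numCameras; ++cam)
        workers.emplace_back(cameraLoop, cam, framesPerCamera);
    for (auto &t : workers)
        t.join();
    return g_framesProcessed.load();
}
```

The two camera threads are fully independent in my code; they share no state apart from the driver/CSI interface underneath.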

The camera-interface part of the pipeline looks like this:

bool aaCamCaptureThread::threadExecute()
{
    IStream *iStream = interface_cast<IStream>(m_stream);
    IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);
    Argus::Status status;

    // Wait until the producer has connected to the stream.
    AACAM_CAPTURE_PRINT("Waiting until producer is connected... %p\n", m_stream);
    if (iStream->waitUntilConnected() != STATUS_OK)
        ORIGINATE_ERROR("Stream failed to connect.");
    AACAM_CAPTURE_PRINT("Producer has connected; continuing. %p\n", iFrameConsumer);

    int frameCount = 0;
    while (m_currentFrame < (m_lastFrameCount - 2))
    {
        // Acquire a frame; this is the call that stalls on R28.1.
        UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());
        IFrame *iFrame = interface_cast<IFrame>(frame);
        if (!iFrame)
            break;

        // Get the frame's image and wrap it in an NvBuffer.
        Image *image = iFrame->getImage();
        EGLStream::NV::IImageNativeBuffer *iImageNativeBuffer
            = interface_cast<EGLStream::NV::IImageNativeBuffer>(image);
        TEST_ERROR_RETURN(!iImageNativeBuffer, "Failed to create an IImageNativeBuffer");

        int fd = iImageNativeBuffer->createNvBuffer(
            ARGUSSIZE {m_pCamInfo->liveParams.inputVideoInfo.width,
                       m_pCamInfo->liveParams.inputVideoInfo.height},
            NvBufferColorFormat_YUV420, NvBufferLayout_Pitch, &status);
        // ... OpenCV / encode processing on fd ...
    }
    // ...
}


This code, along with the rest of my OpenCV processing and encoding pipeline, works well on R24.2.1. However, on R28.1 it gets stuck in:

UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());

I copied this code from the ~/tegra_multimedia_api/argus/samples/multiSensor sample.

I am wondering whether there could be an issue with concurrent access to the CSI interface by the two cameras. I tried protecting the acquireFrame() call with a mutex, but that still results in the same stall.
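For reference, the serialization I tried is roughly the pattern below. If I remember the header right, acquireFrame() also takes an optional timeout argument, so a bounded wait can at least distinguish a permanently stalled producer from a slow one. A stdlib-only sketch of the pattern (fakeAcquireFrame is a stand-in for the Argus call, which obviously cannot run outside the device):

```cpp
#include <chrono>
#include <future>
#include <mutex>
#include <thread>

// Stand-in for iFrameConsumer->acquireFrame(); returns a fake frame id
// after a short delay. In the real code this is the Argus call that
// blocks indefinitely on R28.1.
static int fakeAcquireFrame()
{
    std::this_thread::sleep_for(std::chrono::milliseconds(10));
    return 42;
}

static std::mutex g_acquireMutex;  // shared by both camera threads

// Serialize the acquire call and bound how long we wait for it, so a
// stalled producer shows up as a timeout instead of a hung thread.
static bool acquireWithTimeout(std::chrono::milliseconds timeout, int &frameOut)
{
    std::lock_guard<std::mutex> lock(g_acquireMutex);
    std::future<int> f = std::async(std::launch::async, fakeAcquireFrame);
    if (f.wait_for(timeout) != std::future_status::ready)
        return false;  // producer appears stuck; log and bail instead of hanging
    frameOut = f.get();
    return true;
}
```

In the real pipeline I would pass the timeout straight to acquireFrame() rather than spawning a task; the std::async indirection here is only so the sketch is self-contained.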

I am assuming that the ~/tegra_multimedia_api/argus/samples/multiSensor sample, which accesses two cameras concurrently, does not run into this issue because its two cameras are not reading frames at 30 fps, which is what I do in my case.

Please help.


Can the issue be reproduced with the multiSensor sample as-is? Or do I need to apply a patch to the sample?

Please share a method we can run to compare R24.2.1 and R28.1.