syncSensor - multiple cameras, JPEG consumer

I wanted to tweak the syncSensor example provided in the Tegra Multimedia API for my use case.

Use Case

  1. Synchronization of four cameras.
  2. Four JPEG consumers or four OpenCV consumers - image capture from the four cameras based on a trigger from GPIO.
  3. Detail: all four cameras should stay open the whole time, and a single image from each camera should be captured whenever a GPIO trigger fires. The cameras should never close; they should simply wait for the next trigger.

From my understanding, the images can be pulled from the thread attached below, but I don’t know how to go about it.

bool StereoDisparityConsumerThread::threadExecute()
{
    CONSUMER_PRINT("Waiting for Argus producer to connect to left stream.\n");
    m_leftStream->waitUntilConnected();

    CONSUMER_PRINT("Waiting for Argus producer to connect to right stream.\n");
    m_rightStream->waitUntilConnected();

    CONSUMER_PRINT("Streams connected, processing frames.\n");
    unsigned int histogramLeft[HISTOGRAM_BINS];
    unsigned int histogramRight[HISTOGRAM_BINS];
    while (true)
    {
        EGLint streamState = EGL_STREAM_STATE_CONNECTING_KHR;

        // Check both the streams and proceed only if they are not in DISCONNECTED state.
        if (!eglQueryStreamKHR(m_leftStream->getEGLDisplay(), m_leftStream->getEGLStream(),
                               EGL_STREAM_STATE_KHR, &streamState) ||
            (streamState == EGL_STREAM_STATE_DISCONNECTED_KHR))
        {
            CONSUMER_PRINT("left : EGL_STREAM_STATE_DISCONNECTED_KHR received\n");
            break;
        }

        if (!eglQueryStreamKHR(m_rightStream->getEGLDisplay(), m_rightStream->getEGLStream(),
                               EGL_STREAM_STATE_KHR, &streamState) ||
            (streamState == EGL_STREAM_STATE_DISCONNECTED_KHR))
        {
            CONSUMER_PRINT("right : EGL_STREAM_STATE_DISCONNECTED_KHR received\n");
            break;
        }

        ScopedCudaEGLStreamFrameAcquire left(m_cuStreamLeft);
        ScopedCudaEGLStreamFrameAcquire right(m_cuStreamRight);

        if (!left.hasValidFrame() || !right.hasValidFrame())
            break;

        // Calculate histograms.
        float time = 0.0f;
        if (left.generateHistogram(histogramLeft, &time) && right.generateHistogram(histogramRight, &time))
        {
            // Calculate KL distance.
            float distance = 0.0f;
            Size2D<uint32_t> size = right.getSize();
            float dTime = computeKLDistance(histogramRight, histogramLeft, HISTOGRAM_BINS,
                                            size.width() * size.height(), &distance);
            CONSUMER_PRINT("KL distance of %6.3f with %5.2f ms computing histograms and "
                           "%5.2f ms spent computing distance\n",
                           distance, time, dTime);
        }
    }
    CONSUMER_PRINT("No more frames. Cleaning up.\n");

    PROPAGATE_ERROR(requestShutdown());

    return true;
}

I want the consumer to be a JPEG or OpenCV consumer.

  1. Is it possible to grab the synced images in an OpenCV or JPEG consumer? If so, can anyone give a snippet?
  2. Is it possible to push images from CUDA to OpenCV from the above thread? Please point me in the right direction with a piece of code.

Thanks

Hi Sathish,
I would recommend you check this link. It has all the details that I believe match your use case, including OpenCV support and synchronization.

https://devtalk.nvidia.com/default/topic/1069502/jetson-tx2/jetson-camera-application-guide-/post/5417643/#5417643

I was modifying the syncSensor example for my use case, in this particular function:

ScopedCudaEGLStreamFrameAcquire::ScopedCudaEGLStreamFrameAcquire(CUeglStreamConnection& connection)
    : m_connection(connection)
    , m_stream(NULL)
    , m_resource(0)
{
    CUresult r = cuEGLStreamConsumerAcquireFrame(&m_connection, &m_resource, &m_stream, -1);
    if (r == CUDA_SUCCESS)
    {
        r = cuGraphicsResourceGetMappedEglFrame(&m_frame, m_resource, 0, 0);
        if (r != CUDA_SUCCESS)
            return;

        // The mapped frame's row stride can be wider than width * 4 bytes,
        // so pass the pitch explicitly instead of letting GpuMat assume a
        // packed layout. Note this wrapping also assumes the stream delivers
        // a single-plane RGBA frame; if the stream format is YUV420, reading
        // the first plane as CV_8UC4 will run past the buffer.
        cv::cuda::GpuMat d_Mat_RGBA(m_frame.height, m_frame.width, CV_8UC4,
                                    m_frame.frame.pPitch[0], m_frame.pitch);
        cv::cuda::GpuMat d_Mat_RGB(m_frame.height, m_frame.width, CV_8UC3);
        cv::cuda::cvtColor(d_Mat_RGBA, d_Mat_RGB, cv::COLOR_RGBA2RGB);
        cv::Mat h_Mat_RGB(m_frame.height, m_frame.width, CV_8UC3);
        d_Mat_RGB.download(h_Mat_RGB);
        //cv::imwrite("h.jpg", h_Mat_RGB);
    }
}

I ran into this problem:

fcr@fcr-desktop:~/tegra_multimedia_api/argus/build/samples/syncSensor$ ./argus_syncsensor
Executing Argus Sample: argus_syncsensor
Argus Version: 0.97.3 (multi-process)
PRODUCER: Creating left stream.
PRODUCER: Creating right stream.
PRODUCER: Launching disparity checking consumer
Initializing CUDA
CONSUMER: Connecting CUDA consumer to left stream
CONSUMER: Connecting CUDA consumer to right stream
CONSUMER: Waiting for Argus producer to connect to left stream.
PRODUCER: Starting repeat capture requests.
CONSUMER: Waiting for Argus producer to connect to right stream.
CONSUMER: Streams connected, processing frames.
terminate called after throwing an instance of 'cv::Exception'
  what():  OpenCV(4.1.1) /home/fcr/Downloads/JEP-master/script/opencv_contrib-4.1.1/modules/cudev/include/opencv2/cudev/grid/detail/transform.hpp:267: error: (-217:Gpu API call) unspecified launch failure in function 'call'

Aborted (core dumped)

I have OpenCV 4.1.1. Can anyone tell me where I’m going wrong?