OpenCV Mat Object And Saving Image Data 09_camera_jpeg_capture

I have a Jetson Nano (not the Orin model), and I’m going through the 09_camera_jpeg_capture example to get a better understanding of how the Argus library works.

I understand that the bool ConsumerThread::threadExecute() function is where the frames are being captured.

But I’m completely lost when it comes to saving the frames that the pointer iFrame points to, and I could use some help.

Hello,

Thanks for using the NVIDIA forums. Your topic will be best served in the Jetson category.

I will move this post over for visibility.

Cheers,
Tom


Hi,
For further processing on the frame data, please call createNvBuffer()/copyToNvBuffer() to get the fd of the buffer. You can then access it through the NvBuffer APIs.

I’m still very confused.

In the 09_camera_jpeg_capture program, I know that bool ConsumerThread::threadExecute() captures the incoming video frames, and I was under the impression that I could just use/copy m_dmabuf instead of creating another buffer.

Am I correct with that assumption or no?

My second question: once m_dmabuf is either copied again or a second buffer is created, what’s the best way to get the height/width of the image that’s stored in the mentioned buffer(s)?

For what it’s worth, here’s where I currently stand with the bool ConsumerThread::threadExecute() function in the 09_camera_jpeg_capture example:

bool ConsumerThread::threadExecute()
    {
        IEGLOutputStream *iEglOutputStream = interface_cast<IEGLOutputStream>(m_stream);
        IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);

        /* Wait until the producer has connected to the stream. */
        CONSUMER_PRINT("Waiting until producer is connected...\n");
        if (iEglOutputStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
        CONSUMER_PRINT("Producer has connected; continuing.\n");
        int second_dmabuf = -1;
        while (true)
        {
            /* Acquire a frame. */
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                break;

            /* Get the IImageNativeBuffer extension interface. */
            // Get access to image data through iFrame->getImage();
            // Image data can be retrieved and processed through the IImageNativeBuffer interface.
            NV::IImageNativeBuffer *iNativeBuffer = interface_cast<NV::IImageNativeBuffer>(iFrame->getImage());
            if (!iNativeBuffer)
                ORIGINATE_ERROR("IImageNativeBuffer not supported by Image.");

            /* If we don't already have a buffer, create one from this image.
               Otherwise, just blit to our buffer. */
            if (m_dmabuf == -1)
            {
                /*
                    virtual int EGLStream::NV::IImageNativeBuffer::createNvBuffer(Argus::Size2D<...> size, NvBufferColorFormat format, NvBufferLayout layout, EGLStream::NV::Rotation rotation = EGLStream::NV::ROTATION_0, Argus::Status *status = (Argus::Status *)__null) const

                    Returns -1 on failure
                    Returns valid dmabuf-fd on success

                */
                m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                         NvBufferColorFormat_YUV420,
                                                         NvBufferLayout_BlockLinear);

                second_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                              NvBufferColorFormat_YUV420,
                                                              NvBufferLayout_BlockLinear);

                if (m_dmabuf == -1 || second_dmabuf == -1)
                    CONSUMER_PRINT("\tFailed to create NvBuffer\n");
                else
                    CONSUMER_PRINT("\nBUFFER CREATED\n");
            }
            else if (iNativeBuffer->copyToNvBuffer(m_dmabuf) != STATUS_OK || iNativeBuffer->copyToNvBuffer(second_dmabuf) != STATUS_OK)
            {
                ORIGINATE_ERROR("Failed to copy frame to NvBuffer.");
            }

            /* Process frame to be written. */
            //     bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
            
            processV4L2Fd(m_dmabuf, iFrame->getNumber());
        }

        CONSUMER_PRINT("Done.\n");

        requestShutdown();

        return true;
    }

Hi,
Please refer to the patch for mapping NvBuffer to cv::Mat:
NVBuffer (FD) to opencv Mat - #6 by DaneLLL

That patch reference didn’t help out much.

I’m still working under the assumption that I can grab the frames being captured in bool ConsumerThread::threadExecute(). Or does it not really matter whether I do it there or in the processV4L2Fd() function?

In the bool ConsumerThread::threadExecute() function I currently have the following, and every time I try to use the cv::imwrite() function, I get this error

The error

[1] 9433 bus error (core dumped) ./captureJPEG

The section of code I’m working on

 bool ConsumerThread::threadExecute()
    {
        IEGLOutputStream *iEglOutputStream = interface_cast<IEGLOutputStream>(m_stream);
        IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);
        void *pdata = NULL;
        /* Wait until the producer has connected to the stream. */
        CONSUMER_PRINT("Waiting until producer is connected...\n");
        if (iEglOutputStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
        CONSUMER_PRINT("Producer has connected; continuing.\n");
        while (true)
        {
            /* Acquire a frame. */
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                break;

            /* Get the IImageNativeBuffer extension interface. */
            // Get access to image data through iFrame->getImage();
            // Image data can be retrieved and processed through the IImageNativeBuffer interface.
            NV::IImageNativeBuffer *iNativeBuffer = interface_cast<NV::IImageNativeBuffer>(iFrame->getImage());
            if (!iNativeBuffer)
                ORIGINATE_ERROR("IImageNativeBuffer not supported by Image.");

            /* If we don't already have a buffer, create one from this image.
               Otherwise, just blit to our buffer. */
            if (m_dmabuf == -1)
            {
                /*
                    virtual int EGLStream::NV::IImageNativeBuffer::createNvBuffer(Argus::Size2D<...> size, NvBufferColorFormat format, NvBufferLayout layout, EGLStream::NV::Rotation rotation = EGLStream::NV::ROTATION_0, Argus::Status *status = (Argus::Status *)__null) const

                    Returns -1 on failure
                    Returns valid dmabuf-fd on success

                */
                m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                         NvBufferColorFormat_YUV420,
                                                         NvBufferLayout_BlockLinear);

                if (m_dmabuf == -1)
                    CONSUMER_PRINT("\tFailed to create NvBuffer\n");
            }
            else if (iNativeBuffer->copyToNvBuffer(m_dmabuf) != STATUS_OK)
            {
                ORIGINATE_ERROR("Failed to copy frame to NvBuffer.");
            }
            // Do openCV stuff here

            if (NvBufferMemMap(m_dmabuf, 0, NvBufferMem_Read, &pdata) < 0)
            {
                CONSUMER_PRINT("FAILED to map virtual address...");
            }
            if (NvBufferMemSyncForCpu(m_dmabuf, 0, &pdata) < 0)
            {
                CONSUMER_PRINT("FAILED to sync hardware memory cache for CPU...");
            }
            cv::Mat imgbuf = cv::Mat(1920, 1080, CV_8UC3, pdata);

            if (imgbuf.empty())
            {
                CONSUMER_PRINT("imgbuf is empty...");
            }
            // Assuming NvBufferMemUnMap() isn't needed if I'm using a single camera, per the reference below
            // Reference : https://forums.developer.nvidia.com/t/nvbuffer-fd-to-opencv-mat/83012/6
            // NvBufferMemUnMap(m_dmabuf, 0, &pdata);

            std::string openCVStringWrite = "imgbuf_" + std::to_string(iFrame->getNumber()) + ".jpg";

            // Currently getting this error : [1]    9433 bus error (core dumped)  ./captureJPEG
            cv::imwrite(openCVStringWrite, imgbuf);

            // cv::imshow("imgbuf", imgbuf);

            // cv::waitKey(1);
            /* Process frame to be written. */
            //     bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)

            processV4L2Fd(m_dmabuf, iFrame->getNumber());
        }

        CONSUMER_PRINT("Done.\n");

        requestShutdown();

        return true;
    }

Hi,
Hardware engines do not support 24-bit formats such as BGR/RGB. Please change to NvBufferColorFormat_ABGR32 pitch linear in createNvBuffer(), and map to CV_8UC4 in cv::Mat.

I made the changes with NvBufferColorFormat_ABGR32, and since there was no pitch_linear argument that I could pass to createNvBuffer(), I used NvBufferLayout_Pitch.

But I still got a segmentation fault.

Recent changes I made to bool ConsumerThread::threadExecute():

 bool ConsumerThread::threadExecute()
    {
        IEGLOutputStream *iEglOutputStream = interface_cast<IEGLOutputStream>(m_stream);
        IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);
        void *pdata = NULL;
        /* Wait until the producer has connected to the stream. */
        CONSUMER_PRINT("Waiting until producer is connected...\n");
        if (iEglOutputStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
        CONSUMER_PRINT("Producer has connected; continuing.\n");
        while (true)
        {
            /* Acquire a frame. */
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                break;

            /* Get the IImageNativeBuffer extension interface. */
            // Get access to image data through iFrame->getImage();
            // Image data can be retrieved and processed through the IImageNativeBuffer interface.
            NV::IImageNativeBuffer *iNativeBuffer = interface_cast<NV::IImageNativeBuffer>(iFrame->getImage());
            if (!iNativeBuffer)
                ORIGINATE_ERROR("IImageNativeBuffer not supported by Image.");

            /* If we don't already have a buffer, create one from this image.
               Otherwise, just blit to our buffer. */
            if (m_dmabuf == -1)
            {
                /*
                    virtual int EGLStream::NV::IImageNativeBuffer::createNvBuffer(Argus::Size2D<...> size, NvBufferColorFormat format, NvBufferLayout layout, EGLStream::NV::Rotation rotation = EGLStream::NV::ROTATION_0, Argus::Status *status = (Argus::Status *)__null) const

                    Returns -1 on failure
                    Returns valid dmabuf-fd on success

                */
                // m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                //                                          NvBufferColorFormat_YUV420,
                //                                          NvBufferLayout_BlockLinear);
                m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                         NvBufferColorFormat_ARGB32,
                                                         NvBufferLayout_Pitch);

                if (m_dmabuf == -1)
                    CONSUMER_PRINT("\tFailed to create NvBuffer\n");
            }
            else if (iNativeBuffer->copyToNvBuffer(m_dmabuf) != STATUS_OK)
            {
                ORIGINATE_ERROR("Failed to copy frame to NvBuffer.");
            }
            // Do openCV stuff here

            if (NvBufferMemMap(m_dmabuf, 0, NvBufferMem_Read, &pdata) < 0)
            {
                CONSUMER_PRINT("FAILED to map virtual address...");
            }
            if (NvBufferMemSyncForCpu(m_dmabuf, 0, &pdata) < 0)
            {
                CONSUMER_PRINT("FAILED to sync hardware memory cache for CPU...");
            }
            cv::Mat imgbuf = cv::Mat(1920, 1080, CV_8UC4, pdata);

            if (imgbuf.empty())
            {
                CONSUMER_PRINT("imgbuf is empty...");
            }
            // Assuming NvBufferMemUnMap() isn't needed if I'm using a single camera
            // NvBufferMemUnMap(m_dmabuf, 0, &pdata);

            std::string openCVStringWrite = "imgbuf_" + std::to_string(iFrame->getNumber()) + ".jpg";

            cv::imwrite(openCVStringWrite, imgbuf);

            // cv::imshow("imgbuf", imgbuf);

            // cv::waitKey(1);
            /* Process frame to be written. */
            //     bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)

            processV4L2Fd(m_dmabuf, iFrame->getNumber());
        }

        CONSUMER_PRINT("Done.\n");

        requestShutdown();

        return true;
    }

Terminal output showing the segmentation fault:

> export DISPLAY=:0 && make && ./captureJPEG
make: 'captureJPEG' is up to date.
[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 640 height 480
Number of cameras : 1
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
CONSUMER: Waiting until producer is connected...
CONSUMER: Waiting until producer is connected...
PRODUCER: Available Sensor modes :
PRODUCER: [0] W=3264 H=2464
PRODUCER: [1] W=3264 H=1848
PRODUCER: [2] W=1920 H=1080
PRODUCER: [3] W=1640 H=1232
PRODUCER: [4] W=1280 H=720
PRODUCER: [5] W=1280 H=720
PRODUCER: Requested FPS out of range. Fall back to 30
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
CONSUMER: Producer has connected; continuing.
[1]    8979 segmentation fault (core dumped)  ./captureJPEG

Hi,
Please make sure the buffer is in 1920x1080. And try

            cv::Mat imgbuf = cv::Mat(1080, 1920, CV_8UC4, pdata);

If it still does not work, you may apply the patch to 13 sample to make sure it works with the sample.

I still got the segmentation fault despite changing the line to cv::Mat imgbuf = cv::Mat(1080, 1920, CV_8UC4, pdata);

I’ll try out applying the patch that you mentioned here : Argus and OpenCV - #3 by DaneLLL

I tried applying the patch you mentioned, but with all the changes that have been made to the sample since then (which don’t include the patch), I’m horribly confused about what exactly needs to be added and removed.

Especially when it came to the static bool execute() function

Hi,
The patch should be clear. It puts the NvBuffer in RGBA pitch linear and maps it to cv::Mat. There is one patch for the 09 sample and another for the 13 sample. Please apply it and try.

I’ve applied the patch as you stated here on Jul 30 '18

It compiles fine, but I’m getting a segmentation fault.

The following is from my terminal

> ./captureJPEG
nvbuf_utils: Could not get EGL display connection
Program started
Number of cameras : 1
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
CONSUMER: Waiting until producer is connected...
CONSUMER: Waiting until producer is connected...
PRODUCER: Available Sensor modes :
PRODUCER: [0] W=3264 H=2464
PRODUCER: [1] W=3264 H=1848
PRODUCER: [2] W=1920 H=1080
PRODUCER: [3] W=1640 H=1232
PRODUCER: [4] W=1280 H=720
PRODUCER: [5] W=1280 H=720
PRODUCER: Requested FPS out of range. Fall back to 30
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
CONSUMER: Producer has connected; continuing.
[1]    9120 segmentation fault (core dumped)  ./captureJPEG

For what it’s worth, at this point I’m leaving both the Makefile and my main.cpp here for transparency’s sake, in case others want to take a stab at the problem.

Makefile

# Include the Rules.mk file from the parent directory
include /usr/src/jetson_multimedia_api/samples/Rules.mk

# Define the name of the application
APP := captureJPEG

# Define the directory where Argus utilities are located
# TOP_DIR 	:= $(shell pwd | awk '{split($$0, f, "/samples"); print f[1]}')

# ARGUS_UTILS_DIR := /home/ctnano/Desktop/jetson_multimedia_api/argus/samples/utils
# CLASS_DIR := /home/ctnano/Desktop/jetson_multimedia_api/samples/common/classes

ARGUS_UTILS_DIR := /usr/src/jetson_multimedia_api/argus/samples/utils
CLASS_DIR := /usr/src/jetson_multimedia_api/samples/common/classes
OPENCV_DIR := /usr/include/opencv4/opencv2

# Define the source files for the application
# CLASS_DIR 	:= $(TOP_DIR)/samples/common/classes

SRCS := \
	main.cpp \
	$(wildcard $(CLASS_DIR)/*.cpp) \
	$(ARGUS_UTILS_DIR)/Thread.cpp

# Define the object files to be generated from the source files
OBJS := $(SRCS:.cpp=.o)

# Add additional compiler flags to include the Argus utilities directory
# Need to add include folder
CPPFLAGS += \
    -I"/usr/src/jetson_multimedia_api/argus/include" \
	-I"/usr/src/jetson_multimedia_api/include" \
	-I"$(ARGUS_UTILS_DIR)" \
	-I"$(OPENCV_DIR)"

# Add debugging flags
CPPFLAGS += -g -O0 -std=c++11 -Wall    

# Add additional linker flags (opencv_imgcodecs provides cv::imwrite)
LDFLAGS += \
	-lnveglstream_camconsumer -lnvargus_socketclient -lopencv_core -lopencv_imgproc -lopencv_imgcodecs -lopencv_highgui

# Define the default target (the application)
all: $(APP)

# # Compile each class individually (this is a dependency target)
# This is needed for some reason 5/25/24
$(CLASS_DIR)/%.o: $(CLASS_DIR)/%.cpp
	$(AT)$(MAKE) -C $(CLASS_DIR)

# Compile each source file into an object file
%.o: %.cpp
	@echo "Compiling: $<"
	$(CPP) $(CPPFLAGS) -c $< -o $@

# Link the object files to create the application
$(APP): $(OBJS)
	@echo "Linking: $@"
	$(CPP) -o $@ $(OBJS) $(CPPFLAGS) $(LDFLAGS)

# Clean up generated files
clean:
	$(AT)rm -rf $(APP) $(OBJS)

Main.cpp

/*
 * Copyright (c) 2016-2018, NVIDIA CORPORATION. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *  * Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 *  * Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 *  * Neither the name of NVIDIA CORPORATION nor the names of its
 *    contributors may be used to endorse or promote products derived
 *    from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE COPYRIGHT OWNER OR
 * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
 * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
 * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
 * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
 * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

#include "Error.h"
#include "Thread.h"

#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <EGLStream/NV/ImageNativeBuffer.h>

#include <NvEglRenderer.h>
#include <NvJpegEncoder.h>

#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
#include <getopt.h>
#include <iostream>
#include <fstream>

#include <opencv2/opencv.hpp>
#include <cstdlib>

using namespace Argus;
using namespace EGLStream;
/* Configurations below can be overridden by cmdline */
static uint32_t CAPTURE_TIME = 1; /* In seconds. */
static int CAPTURE_FPS = 30;
static uint32_t SENSOR_MODE = 0;
static Size2D<uint32_t> PREVIEW_SIZE(640, 480);
static Size2D<uint32_t> CAPTURE_SIZE(1920, 1080);
static bool DO_STAT = false;
static bool VERBOSE_ENABLE = false;
static bool DO_JPEG_ENCODE = true;

#define JPEG_BUFFER_SIZE (CAPTURE_SIZE.area() * 3 / 2)

/* Debug print macros. */
#define PRODUCER_PRINT(...) printf("PRODUCER: " __VA_ARGS__)
#define CONSUMER_PRINT(...) printf("CONSUMER: " __VA_ARGS__)

namespace ArgusSamples
{

    /*******************************************************************************
     * Base Consumer thread:
     *   Creates an EGLStream::FrameConsumer object to read frames from the
     *   OutputStream, then creates/populates an NvBuffer (dmabuf) from the frames
     *   to be processed by processV4L2Fd.
     ******************************************************************************/

    // ConsumerThread --> derived class
    // public Threads --> parent class
    class ConsumerThread : public Thread
    {
    public:
        explicit ConsumerThread(OutputStream *stream) : m_stream(stream),
                                                        m_dmabuf(-1)
        {
        }
        virtual ~ConsumerThread();

    protected:
        /** @name Thread methods */
        /**@{*/
        virtual bool threadInitialize();
        virtual bool threadExecute();
        virtual bool threadShutdown();
        /**@}*/

        virtual bool processV4L2Fd(int32_t fd, uint64_t frameNumber) = 0;

        OutputStream *m_stream;
        UniqueObj<FrameConsumer> m_consumer;
        int m_dmabuf;
        cv::Mat opencvFrame;
    };

    ConsumerThread::~ConsumerThread()
    {
        if (m_dmabuf != -1)
            NvBufferDestroy(m_dmabuf);
    }

    bool ConsumerThread::threadInitialize()
    {
        /* Create the FrameConsumer. */
        m_consumer = UniqueObj<FrameConsumer>(FrameConsumer::create(m_stream));
        if (!m_consumer)
            ORIGINATE_ERROR("Failed to create FrameConsumer");

        return true;
    }

    bool ConsumerThread::threadExecute()
    {
        IEGLOutputStream *iEglOutputStream = interface_cast<IEGLOutputStream>(m_stream);
        IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);

        /* Wait until the producer has connected to the stream. */
        CONSUMER_PRINT("Waiting until producer is connected...\n");
        if (iEglOutputStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
        CONSUMER_PRINT("Producer has connected; continuing.\n");

        while (true)
        {
            /* Acquire a frame. */
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                break;

            /* Get the image data from iFrame */
            EGLStream::Image *image = iFrame->getImage();
            if (!image)
                ORIGINATE_ERROR("Failed to get image from frame");

            /* Get image properties */

            // Size2D<uint32_t> size = image->getResolution();
            // uint32_t width = size.width();
            // uint32_t height = size.height();
            // const uint8_t *imageData = static_cast<const uint8_t *>(image->map());
            // if (!imageData)
            //     ORIGINATE_ERROR("Failed to map image data");

            // /* Create a cv::Mat object using the image data */
            // cv::Mat opencvFrame(height * 3 / 2, width, CV_8UC1, (void *)imageData);

            /* Get the IImageNativeBuffer extension interface. */
            NV::IImageNativeBuffer *iNativeBuffer =
                interface_cast<NV::IImageNativeBuffer>(iFrame->getImage());
            if (!iNativeBuffer)
                ORIGINATE_ERROR("IImageNativeBuffer not supported by Image.");

            /* If we don't already have a buffer, create one from this image.
               Otherwise, just blit to our buffer. */
            if (m_dmabuf == -1)
            {
                // m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                //                                          NvBufferColorFormat_YUV420,
                //                                          NvBufferLayout_BlockLinear);
                m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                         NvBufferColorFormat_ABGR32, NvBufferLayout_Pitch);
                if (m_dmabuf == -1)
                    CONSUMER_PRINT("\tFailed to create NvBuffer\n");
            }
            else if (iNativeBuffer->copyToNvBuffer(m_dmabuf) != STATUS_OK)
            {
                ORIGINATE_ERROR("Failed to copy frame to NvBuffer.");
            }

            /* Process frame. */
            processV4L2Fd(m_dmabuf, iFrame->getNumber());
        }

        CONSUMER_PRINT("Done.\n");

        requestShutdown();

        return true;
    }

    bool ConsumerThread::threadShutdown()
    {
        return true;
    }

    /*******************************************************************************
     * Preview Consumer thread:
     *   Read frames from the OutputStream and render it on display.
     ******************************************************************************/
    class PreviewConsumerThread : public ConsumerThread
    {
    public:
        // PreviewConsumerThread(OutputStream *stream, NvEglRenderer *renderer);
        PreviewConsumerThread(OutputStream *stream);
        ~PreviewConsumerThread();

    private:
        bool threadInitialize();
        bool threadShutdown();
        bool processV4L2Fd(int32_t fd, uint64_t frameNumber);

        // NvEglRenderer *m_renderer;
    };

    // PreviewConsumerThread::PreviewConsumerThread(OutputStream *stream,
    //                                              NvEglRenderer *renderer) : ConsumerThread(stream),
    //                                                                         m_renderer(renderer)
    PreviewConsumerThread::PreviewConsumerThread(OutputStream *stream) : ConsumerThread(stream)
    {
    }

    PreviewConsumerThread::~PreviewConsumerThread()
    {
    }

    bool PreviewConsumerThread::threadInitialize()
    {
        if (!ConsumerThread::threadInitialize())
            return false;

        // if (DO_STAT)
        //     m_renderer->enableProfiling();

        return true;
    }

    bool PreviewConsumerThread::threadShutdown()
    {
        // if (DO_STAT)
        //     m_renderer->printProfilingStats();

        return ConsumerThread::threadShutdown();
    }

    bool PreviewConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
    {
        // m_renderer->render(fd);
        void *pdata = NULL;
        NvBufferMemMap(fd, 0, NvBufferMem_Read, &pdata);
        NvBufferMemSyncForCpu(fd, 0, &pdata);
        cv::Mat imgbuf = cv::Mat(PREVIEW_SIZE.height(),
                                 PREVIEW_SIZE.width(),
                                 CV_8UC4, pdata);
        cv::Mat display_img;
        cvtColor(imgbuf, display_img, cv::COLOR_RGB2BGR);
        // cvtColor(imgbuf, display_img, CV_RGB);
        NvBufferMemUnMap(fd, 0, &pdata);
        cv::imshow("img", display_img);
        cv::waitKey(1);
        return true;
    }

    /*******************************************************************************
     * Capture Consumer thread:
     *   Read frames from the OutputStream and save it to JPEG file.
     ******************************************************************************/
    class CaptureConsumerThread : public ConsumerThread
    {
    public:
        CaptureConsumerThread(OutputStream *stream);
        ~CaptureConsumerThread();

    private:
        bool threadInitialize();
        bool threadShutdown();
        bool processV4L2Fd(int32_t fd, uint64_t frameNumber);

        NvJPEGEncoder *m_JpegEncoder;
        unsigned char *m_OutputBuffer;
    };

    CaptureConsumerThread::CaptureConsumerThread(OutputStream *stream) : ConsumerThread(stream),
                                                                         m_JpegEncoder(NULL),
                                                                         m_OutputBuffer(NULL)
    {
    }

    CaptureConsumerThread::~CaptureConsumerThread()
    {
        if (m_JpegEncoder)
            delete m_JpegEncoder;

        if (m_OutputBuffer)
            delete[] m_OutputBuffer;
    }

    bool CaptureConsumerThread::threadInitialize()
    {
        if (!ConsumerThread::threadInitialize())
            return false;

        m_OutputBuffer = new unsigned char[JPEG_BUFFER_SIZE];
        if (!m_OutputBuffer)
            return false;

        m_JpegEncoder = NvJPEGEncoder::createJPEGEncoder("jpenenc");
        if (!m_JpegEncoder)
            ORIGINATE_ERROR("Failed to create JPEGEncoder.");

        if (DO_STAT)
            m_JpegEncoder->enableProfiling();

        return true;
    }

    bool CaptureConsumerThread::threadShutdown()
    {
        if (DO_STAT)
            m_JpegEncoder->printProfilingStats();

        return ConsumerThread::threadShutdown();
    }

    bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
    {
        char filename[FILENAME_MAX];
        sprintf(filename, "output%03u.jpg", (unsigned)frameNumber);

        std::ofstream *outputFile = new std::ofstream(filename);
        if (outputFile)
        {
            unsigned long size = JPEG_BUFFER_SIZE;
            unsigned char *buffer = m_OutputBuffer;
            m_JpegEncoder->encodeFromFd(fd, JCS_YCbCr, &buffer, size);
            outputFile->write((char *)buffer, size);
            delete outputFile;
        }

        return true;
    }

    /**
     * Argus Producer thread:
     *   Opens the Argus camera driver, creates two OutputStreams to output to
     *   Preview Consumer and Capture Consumer respectively, then performs repeating
     *   capture requests for CAPTURE_TIME seconds before closing the producer and
     *   Argus driver.
     *
     * @param renderer     : render handler for camera preview
     */
    // static bool execute(NvEglRenderer *renderer)
    static bool execute()
    {
        /*
            class Argus::OutputStream
                - Object representing an output stream capable of receiving image frames from a capture.
                - OutputStream objects are used as the destination for image frames output from capture requests.
                - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.

            Argus::UniqueObj< T > Class Template Reference
                - Moveable smart pointer
                    - Mimics std::unique_ptr
        */
        UniqueObj<OutputStream> captureStream;

        // Set to NULL to ensure well defined value
        CaptureConsumerThread *captureConsumerThread = NULL;

        /*
            Create the CameraProvider object and get the core interface

            class Argus::CameraProvider
                - Object providing the entry point to the libargus runtime.
                - It provides methods for querying the cameras in the system and for creating camera devices.

            static Argus::CameraProvider *Argus::CameraProvider::create(Argus::Status *status = (Argus::Status *)__null)
                - Creates and returns a new CameraProvider.
                - If a CameraProvider object has already been created, this method will return a pointer to that object.
                - Parameters:
                    status – Optional pointer to return success/status of the call.

            UniqueObj<CameraProvider>
                - Smart pointer
                - Takes ownership of 'CameraProvider' object that's returned by create()
        */

        // Smart pointer takes ownership of CameraProvider object
        UniqueObj<CameraProvider> cameraProvider = UniqueObj<CameraProvider>(CameraProvider::create());

        // If successful, returns pointer to 'ICameraProvider' interface
        // If not, returns nullptr
        /*
            interface_cast<>
                - Similar to dynamic_cast
                - Performs downcasting of pointers/references to objects in inheritance hierarchies
                - Designed for polymorphic classes that have at least one virtual function
                - If the conversion is not supported, it returns NULL
        */
        ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
        if (!iCameraProvider)
            ORIGINATE_ERROR("Failed to create CameraProvider");

        /* Get the camera devices */
        std::vector<CameraDevice *> cameraDevices;
        /*
            getCameraDevices
                - virtual Argus::Status Argus::ICameraProvider::getCameraDevices(std::vector<Argus::CameraDevice *> *devices) const
                - Return list of camera devices exposed by provider
                - Includes devices already in use by active CaptureSession
                - Application responsibility
                    - Check device availability
                    - Handle errors returned when CaptureSession creation fails due to device already being used
        */
        // Pointer to ICameraProvider
        iCameraProvider->getCameraDevices(&cameraDevices);
        if (cameraDevices.size() == 0)
            ORIGINATE_ERROR("No cameras available");
        else
        {
            std::cout << "Number of cameras : " << cameraDevices.size() << std::endl;
        }

        ICameraProperties *iCameraProperties = interface_cast<ICameraProperties>(cameraDevices[0]);
        if (!iCameraProperties)
            ORIGINATE_ERROR("Failed to get ICameraProperties interface");

        /* Create the capture session using the first device and get the core interface */

        /*
            class Argus::CaptureSession
                - Object that controls all operations on a single sensor.
                - A capture session is bound to a single sensor (or, in future, a group of synchronized sensors)
                - Provides methods to perform captures on that sensor (via the ICaptureSession interface).

            virtual Argus::CaptureSession *Argus::ICameraProvider::createCaptureSession(Argus::CameraDevice *device, Argus::Status *status = (Argus::Status *)__null)
                - Creates and returns a new CaptureSession using the given device.
                - STATUS_UNAVAILABLE will be placed into status if the device is already in use.
                - Parameters:
                    - device – The device to use for the CaptureSession.
                    - status – Optional pointer to return success/status of the call.
                - Returns:
                    - The new CaptureSession
                    - Or NULL if an error occurred.
        */
        UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[0]));
        ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
        if (!iCaptureSession)
            ORIGINATE_ERROR("Failed to get ICaptureSession interface");

        /* Initialize the settings of the output stream */
        PRODUCER_PRINT("Creating output stream\n");

        /*
            Initialize the settings of the output stream

            UniqueObj<OutputStreamSettings> --> class Argus::OutputStreamSettings
                - Container for settings used to configure/create an OutputStream.
                - The interfaces and configuration supported by these settings objects depend on the StreamType that was provided during settings creation (see ICaptureSession::createOutputStreamSettings).
                - These objects are passed to ICaptureSession::createOutputStream to create OutputStream objects, after which they may be destroyed.

            virtual Argus::OutputStreamSettings *Argus::ICaptureSession::createOutputStreamSettings(const Argus::StreamType &type, Argus::Status *status = (Argus::Status *)__null)
                - Creates an OutputStreamSettings object that is used to configure the creation of an OutputStream (see createOutputStream).
                - The type of OutputStream that will be configured and created by these settings are determined by the StreamType.
                - Parameters:
                    - type – The type of the OutputStream to configure/create with these settings.
                    - status – An optional pointer to return success/status.
                - Returns:
                    - The newly created OutputStreamSettings, or NULL on failure.

            IEGLOutputStreamSettings
                - Interface that exposes settings used for EGLStream-linked OutputStream creation
        */
        UniqueObj<OutputStreamSettings> streamSettings(iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
        IEGLOutputStreamSettings *iEglStreamSettings = interface_cast<IEGLOutputStreamSettings>(streamSettings);
        if (!iEglStreamSettings)
            ORIGINATE_ERROR("Failed to get IEGLOutputStreamSettings interface");

        iEglStreamSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
        // iEglStreamSettings->setEGLDisplay(renderer->getEGLDisplay());
        iEglStreamSettings->setResolution(PREVIEW_SIZE);

        /*
            Based on above streamSettings, create the preview stream, and capture stream if JPEG Encode is required

            class Argus::OutputStream
                - Object representing an output stream capable of receiving image frames from a capture.
                - OutputStream objects are used as the destination for image frames output from capture requests.
                    - In this case the object is previewStream
                - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.


            virtual Argus::OutputStream *Argus::ICaptureSession::createOutputStream(const Argus::OutputStreamSettings *settings, Argus::Status *status = (Argus::Status *)__null)
                - Creates an OutputStream object using the settings configured by an OutputStreamSettings object (see createOutputStreamSettings).
                - Parameters:
                    - settings – The settings to use for the new output stream.
                    - status – An optional pointer to return success/status.
                - Returns:
                    - The newly created OutputStream, or NULL on failure.

        */
        UniqueObj<OutputStream> previewStream(iCaptureSession->createOutputStream(streamSettings.get()));
        if (DO_JPEG_ENCODE)
        {
            // static Size2D<uint32_t> CAPTURE_SIZE(1920, 1080);
            iEglStreamSettings->setResolution(CAPTURE_SIZE);

            /*
                class Argus::OutputStream
                    - Object representing an output stream capable of receiving image frames from a capture.
                    - OutputStream objects are used as the destination for image frames output from capture requests.
                    - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.
            */
            captureStream = (UniqueObj<OutputStream>)iCaptureSession->createOutputStream(streamSettings.get());
        }

        /*
            Launch the FrameConsumer thread to consume frames from the OutputStream

            NvEglRenderer *renderer
                - Argus Producer thread: Opens the Argus camera driver,
                    - creates two OutputStreams to output to Preview Consumer and Capture Consumer respectively,
                    - Then performs repeating capture requests for CAPTURE_TIME seconds before closing the producer and Argus driver.
                - Parameters:
                    - renderer – : render handler for camera preview

            PreviewConsumerThread
                - Read frames from the OutputStream and render it on display.

        */
        PRODUCER_PRINT("Launching consumer thread\n");
        // PreviewConsumerThread previewConsumerThread(previewStream.get(), renderer);
        PreviewConsumerThread previewConsumerThread(previewStream.get());

        PROPAGATE_ERROR(previewConsumerThread.initialize());
        if (DO_JPEG_ENCODE)
        {
            captureConsumerThread = new ArgusSamples::CaptureConsumerThread(captureStream.get());
            PROPAGATE_ERROR(captureConsumerThread->initialize());
        }

        /* Wait until the consumer thread is connected to the stream */
        PROPAGATE_ERROR(previewConsumerThread.waitRunning());
        if (DO_JPEG_ENCODE)
            PROPAGATE_ERROR(captureConsumerThread->waitRunning());

        /* Create capture request and enable its output stream */
        UniqueObj<Request> request(iCaptureSession->createRequest());
        IRequest *iRequest = interface_cast<IRequest>(request);
        if (!iRequest)
            ORIGINATE_ERROR("Failed to create Request");
        iRequest->enableOutputStream(previewStream.get());
        if (DO_JPEG_ENCODE)
            iRequest->enableOutputStream(captureStream.get());

        ISensorMode *iSensorMode;
        std::vector<SensorMode *> sensorModes;
        iCameraProperties->getBasicSensorModes(&sensorModes);
        if (sensorModes.size() == 0)
            ORIGINATE_ERROR("Failed to get sensor modes");

        PRODUCER_PRINT("Available Sensor modes :\n");
        for (uint32_t i = 0; i < sensorModes.size(); i++)
        {
            iSensorMode = interface_cast<ISensorMode>(sensorModes[i]);
            Size2D<uint32_t> resolution = iSensorMode->getResolution();
            PRODUCER_PRINT("[%u] W=%u H=%u\n", i, resolution.width(), resolution.height());
        }

        ISourceSettings *iSourceSettings = interface_cast<ISourceSettings>(iRequest->getSourceSettings());
        if (!iSourceSettings)
            ORIGINATE_ERROR("Failed to get ISourceSettings interface");

        /* Check and set sensor mode */
        if (SENSOR_MODE >= sensorModes.size())
            ORIGINATE_ERROR("Sensor mode index is out of range");
        SensorMode *sensorMode = sensorModes[SENSOR_MODE];
        iSensorMode = interface_cast<ISensorMode>(sensorMode);
        iSourceSettings->setSensorMode(sensorMode);

        /* Check fps */
        Range<uint64_t> sensorDuration(iSensorMode->getFrameDurationRange());
        Range<uint64_t> desireDuration(1e9 / CAPTURE_FPS + 0.9);
        if (desireDuration.min() < sensorDuration.min() ||
            desireDuration.max() > sensorDuration.max())
        {
            PRODUCER_PRINT("Requested FPS out of range. Fall back to 30\n");
            CAPTURE_FPS = 30;
        }
        /* Set the fps */
        iSourceSettings->setFrameDurationRange(Range<uint64_t>(1e9 / CAPTURE_FPS));
        // renderer->setFPS((float)CAPTURE_FPS);

        /* Submit capture requests. */
        PRODUCER_PRINT("Starting repeat capture requests.\n");
        if (iCaptureSession->repeat(request.get()) != STATUS_OK)
            ORIGINATE_ERROR("Failed to start repeat capture request");

        /* Wait for CAPTURE_TIME seconds. */
        sleep(CAPTURE_TIME);

        /* Stop the repeating request and wait for idle. */
        iCaptureSession->stopRepeat();
        iCaptureSession->waitForIdle();

        /* Destroy the output stream to end the consumer thread. */
        previewStream.reset();
        if (DO_JPEG_ENCODE)
            captureStream.reset();

        /* Wait for the consumer thread to complete. */
        PROPAGATE_ERROR(previewConsumerThread.shutdown());
        if (DO_JPEG_ENCODE)
        {
            PROPAGATE_ERROR(captureConsumerThread->shutdown());
            delete captureConsumerThread;
        }

        PRODUCER_PRINT("Done -- exiting.\n");

        return true;
    }

}; /* namespace ArgusSamples */

int main()
{
    if (setenv("DISPLAY", ":0", 1) != 0)
    {
        std::cerr << "COULDN'T SET DISPLAY VARIABLE TO :0, EXITING PROGRAM NOW..." << std::endl;
        return 1;
    }
    else
    {
        printf("Program started\r\n");
    }

    // Make sure to set 'export DISPLAY=:0' or you'll throw a fault and not be able to run the program
    // NvEglRenderer *renderer = NvEglRenderer::createEglRenderer("renderer0", PREVIEW_SIZE.width(),
    //                                                            PREVIEW_SIZE.height(), 0, 0);
    // if (!renderer)
    //     ORIGINATE_ERROR("Failed to create EGLRenderer.");
    // else
    //     printf("EGLRenderer created successful\r\n");

    // Create object representing output stream to receive image frames
    UniqueObj<OutputStream> captureStream;
    ArgusSamples::CaptureConsumerThread *captureConsumerThread = NULL;

    // Create CameraProvider Object to get core interface to control camera
    // UniqueObj == Smart pointer
    // Don't place breakpoint before this or the debugger will freeze up
    UniqueObj<CameraProvider> cameraProvider = UniqueObj<CameraProvider>(CameraProvider::create());

    // Use interface_cast<> on cameraProvider to get/use pointer with 'ICameraProvider' interface
    ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
    if (!iCameraProvider)
        ORIGINATE_ERROR("Failed to create CameraProvider");

    // Create vector to store the camera devices the Jetson detects
    std::vector<CameraDevice *> cameraDevices;

    // Query the camera devices attached to the system
    iCameraProvider->getCameraDevices(&cameraDevices);
    if (cameraDevices.size() == 0)
        ORIGINATE_ERROR("No cameras available");
    else
    {
        std::cout << "Number of cameras : " << cameraDevices.size() << std::endl;
    }

    // Use interface_cast<> again on cameraDevice[0] to get/use pointer with ICameraProperties
    ICameraProperties *iCameraProperties = interface_cast<ICameraProperties>(cameraDevices[0]);
    if (!iCameraProperties)
        ORIGINATE_ERROR("Failed to get ICameraProperties interface");

    // Create CaptureSession with first camera device and get core interface

    UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[0]));
    ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
    if (!iCaptureSession)
        ORIGINATE_ERROR("Failed to get ICaptureSession interface");

    /* Initialize the settings of the output stream */
    PRODUCER_PRINT("Creating output stream\n");

    // Initialize the settings of the output stream
    UniqueObj<OutputStreamSettings> streamSettings(iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IEGLOutputStreamSettings *iEglStreamSettings = interface_cast<IEGLOutputStreamSettings>(streamSettings);
    if (!iEglStreamSettings)
        ORIGINATE_ERROR("Failed to get IEGLOutputStreamSettings interface");

    iEglStreamSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
    // iEglStreamSettings->setEGLDisplay(renderer->getEGLDisplay());
    iEglStreamSettings->setResolution(PREVIEW_SIZE);

    // Based on above streamSettings, create the preview stream, and capture stream if JPEG Encode is required
    UniqueObj<OutputStream> previewStream(iCaptureSession->createOutputStream(streamSettings.get()));
    if (DO_JPEG_ENCODE)
    {
        // static Size2D<uint32_t> CAPTURE_SIZE(1920, 1080);
        iEglStreamSettings->setResolution(CAPTURE_SIZE);

        /*
            class Argus::OutputStream
                - Object representing an output stream capable of receiving image frames from a capture.
                - OutputStream objects are used as the destination for image frames output from capture requests.
                - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.
        */
        captureStream = (UniqueObj<OutputStream>)iCaptureSession->createOutputStream(streamSettings.get());
    }

    PRODUCER_PRINT("Launching consumer thread\n");
    // ArgusSamples::previewConsumerThread(previewStream.get(), renderer);
    // Must declare namespace | The actual class | Then the object name
    // ArgusSamples::PreviewConsumerThread previewConsumerThread(previewStream.get(), renderer);
    ArgusSamples::PreviewConsumerThread previewConsumerThread(previewStream.get());

    PROPAGATE_ERROR(previewConsumerThread.initialize());
    if (DO_JPEG_ENCODE)
    {
        captureConsumerThread = new ArgusSamples::CaptureConsumerThread(captureStream.get());
        PROPAGATE_ERROR(captureConsumerThread->initialize());
    }

    /* Wait until the consumer thread is connected to the stream */
    PROPAGATE_ERROR(previewConsumerThread.waitRunning());
    if (DO_JPEG_ENCODE)
        PROPAGATE_ERROR(captureConsumerThread->waitRunning());

    /* Create capture request and enable its output stream */
    UniqueObj<Request> request(iCaptureSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);
    if (!iRequest)
        ORIGINATE_ERROR("Failed to create Request");
    iRequest->enableOutputStream(previewStream.get());
    if (DO_JPEG_ENCODE)
        iRequest->enableOutputStream(captureStream.get());

    ISensorMode *iSensorMode;
    std::vector<SensorMode *> sensorModes;
    iCameraProperties->getBasicSensorModes(&sensorModes);
    if (sensorModes.size() == 0)
        ORIGINATE_ERROR("Failed to get sensor modes");

    PRODUCER_PRINT("Available Sensor modes :\n");
    for (uint32_t i = 0; i < sensorModes.size(); i++)
    {
        iSensorMode = interface_cast<ISensorMode>(sensorModes[i]);
        Size2D<uint32_t> resolution = iSensorMode->getResolution();
        PRODUCER_PRINT("[%u] W=%u H=%u\n", i, resolution.width(), resolution.height());
    }

    ISourceSettings *iSourceSettings = interface_cast<ISourceSettings>(iRequest->getSourceSettings());
    if (!iSourceSettings)
        ORIGINATE_ERROR("Failed to get ISourceSettings interface");

    /* Check and set sensor mode */
    if (SENSOR_MODE >= sensorModes.size())
        ORIGINATE_ERROR("Sensor mode index is out of range");
    SensorMode *sensorMode = sensorModes[SENSOR_MODE];
    iSensorMode = interface_cast<ISensorMode>(sensorMode);
    iSourceSettings->setSensorMode(sensorMode);

    /* Check fps */
    Range<uint64_t> sensorDuration(iSensorMode->getFrameDurationRange());
    Range<uint64_t> desireDuration(1e9 / CAPTURE_FPS + 0.9);
    if (desireDuration.min() < sensorDuration.min() ||
        desireDuration.max() > sensorDuration.max())
    {
        PRODUCER_PRINT("Requested FPS out of range. Fall back to 30\n");
        CAPTURE_FPS = 30;
    }
    /* Set the fps */
    iSourceSettings->setFrameDurationRange(Range<uint64_t>(1e9 / CAPTURE_FPS));
    // renderer->setFPS((float)CAPTURE_FPS);

    /* Submit capture requests. */
    PRODUCER_PRINT("Starting repeat capture requests.\n");
    if (iCaptureSession->repeat(request.get()) != STATUS_OK)
        ORIGINATE_ERROR("Failed to start repeat capture request");

    /* Wait for CAPTURE_TIME seconds. */
    sleep(CAPTURE_TIME);

    /* Stop the repeating request and wait for idle. */
    iCaptureSession->stopRepeat();
    iCaptureSession->waitForIdle();

    /* Destroy the output stream to end the consumer thread. */
    previewStream.reset();
    if (DO_JPEG_ENCODE)
        captureStream.reset();

    /* Wait for the consumer thread to complete. */
    PROPAGATE_ERROR(previewConsumerThread.shutdown());
    if (DO_JPEG_ENCODE)
    {
        PROPAGATE_ERROR(captureConsumerThread->shutdown());
        delete captureConsumerThread;
    }

    return 0;
}

Disregard my last comment about the segmentation fault

I had two versions of the 09_camera_jpeg_capture example on my Jetson Nano, and my last comment was based on edits to the wrong one

OpenCV did work with the patch

09_camera_jpeg with OpenCV

/*
 * Copyright (c) 2016-2018, NVIDIA CORPORATION. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *  * Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 *  * Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 *  * Neither the name of NVIDIA CORPORATION nor the names of its
 *    contributors may be used to endorse or promote products derived
 *    from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE COPYRIGHT OWNER OR
 * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
 * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
 * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
 * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
 * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

#include "Error.h"
#include "Thread.h"

#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <EGLStream/NV/ImageNativeBuffer.h>

#include <NvEglRenderer.h>
#include <NvJpegEncoder.h>

#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
#include <getopt.h>
#include <iostream>
#include <fstream>

#include <opencv2/opencv.hpp>
using namespace Argus;
using namespace EGLStream;

/* Configurations below can be overridden by cmdline */
static uint32_t CAPTURE_TIME = 10; /* In seconds. */
static int CAPTURE_FPS = 30;
static uint32_t SENSOR_MODE = 0;
static Size2D<uint32_t> PREVIEW_SIZE(640, 480);
static Size2D<uint32_t> CAPTURE_SIZE(1920, 1080);
static bool DO_STAT = false;
static bool VERBOSE_ENABLE = false;
static bool DO_JPEG_ENCODE = true;

#define JPEG_BUFFER_SIZE (CAPTURE_SIZE.area() * 3 / 2)

/* Debug print macros. */
#define PRODUCER_PRINT(...) printf("PRODUCER: " __VA_ARGS__)
#define CONSUMER_PRINT(...) printf("CONSUMER: " __VA_ARGS__)

namespace ArgusSamples
{

    /*******************************************************************************
     * Base Consumer thread:
     *   Creates an EGLStream::FrameConsumer object to read frames from the
     *   OutputStream, then creates/populates an NvBuffer (dmabuf) from the frames
     *   to be processed by processV4L2Fd.
     ******************************************************************************/

    // ConsumerThread --> derived class
    // public Threads --> parent class
    class ConsumerThread : public Thread
    {
    public:
        explicit ConsumerThread(OutputStream *stream) : m_stream(stream),
                                                        m_dmabuf(-1)
        {
        }
        virtual ~ConsumerThread();

    protected:
        /** @name Thread methods */
        /**@{*/
        virtual bool threadInitialize();
        virtual bool threadExecute();
        virtual bool threadShutdown();
        /**@}*/

        virtual bool processV4L2Fd(int32_t fd, uint64_t frameNumber) = 0;

        OutputStream *m_stream;
        UniqueObj<FrameConsumer> m_consumer;
        int m_dmabuf;
    };

    ConsumerThread::~ConsumerThread()
    {
        if (m_dmabuf != -1)
            NvBufferDestroy(m_dmabuf);
    }

    bool ConsumerThread::threadInitialize()
    {
        /* Create the FrameConsumer. */
        m_consumer = UniqueObj<FrameConsumer>(FrameConsumer::create(m_stream));
        if (!m_consumer)
            ORIGINATE_ERROR("Failed to create FrameConsumer");

        return true;
    }

    bool ConsumerThread::threadExecute()
    {
        IEGLOutputStream *iEglOutputStream = interface_cast<IEGLOutputStream>(m_stream);
        IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(m_consumer);

        /* Wait until the producer has connected to the stream. */
        CONSUMER_PRINT("Waiting until producer is connected...\n");
        if (iEglOutputStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
        CONSUMER_PRINT("Producer has connected; continuing.\n");

        while (true)
        {
            /* Acquire a frame. */
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                break;

            /* Get the IImageNativeBuffer extension interface. */
            NV::IImageNativeBuffer *iNativeBuffer =
                interface_cast<NV::IImageNativeBuffer>(iFrame->getImage());
            if (!iNativeBuffer)
                ORIGINATE_ERROR("IImageNativeBuffer not supported by Image.");

            /* If we don't already have a buffer, create one from this image.
               Otherwise, just blit to our buffer. */
            if (m_dmabuf == -1)
            {
                // m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                //                                          NvBufferColorFormat_YUV420,
                //                                          NvBufferLayout_BlockLinear);

                m_dmabuf = iNativeBuffer->createNvBuffer(iEglOutputStream->getResolution(),
                                                         NvBufferColorFormat_ABGR32,
                                                         NvBufferLayout_Pitch);
                if (m_dmabuf == -1)
                    CONSUMER_PRINT("\tFailed to create NvBuffer\n");
            }
            else if (iNativeBuffer->copyToNvBuffer(m_dmabuf) != STATUS_OK)
            {
                ORIGINATE_ERROR("Failed to copy frame to NvBuffer.");
            }

            /* Process frame. */
            processV4L2Fd(m_dmabuf, iFrame->getNumber());
        }

        CONSUMER_PRINT("Done.\n");

        requestShutdown();

        return true;
    }

    bool ConsumerThread::threadShutdown()
    {
        return true;
    }

    /*******************************************************************************
     * Preview Consumer thread:
     *   Read frames from the OutputStream and render it on display.
     ******************************************************************************/
    class PreviewConsumerThread : public ConsumerThread
    {
    public:
        // PreviewConsumerThread(OutputStream *stream, NvEglRenderer *renderer);
        PreviewConsumerThread(OutputStream *stream);
        ~PreviewConsumerThread();

    private:
        bool threadInitialize();
        bool threadShutdown();
        bool processV4L2Fd(int32_t fd, uint64_t frameNumber);

        // NvEglRenderer *m_renderer;
    };

    // PreviewConsumerThread::PreviewConsumerThread(OutputStream *stream,
    //                                              NvEglRenderer *renderer) : ConsumerThread(stream),
    //                                                                         m_renderer(renderer)
    PreviewConsumerThread::PreviewConsumerThread(OutputStream *stream) : ConsumerThread(stream)
    {
    }

    PreviewConsumerThread::~PreviewConsumerThread()
    {
    }

    bool PreviewConsumerThread::threadInitialize()
    {
        if (!ConsumerThread::threadInitialize())
            return false;

        // if (DO_STAT)
        //     m_renderer->enableProfiling();

        return true;
    }

    bool PreviewConsumerThread::threadShutdown()
    {
        // if (DO_STAT)
        //     m_renderer->printProfilingStats();

        return ConsumerThread::threadShutdown();
    }

    bool PreviewConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
    {
        // m_renderer->render(fd);
        void *pdata = NULL;
        NvBufferMemMap(fd, 0, NvBufferMem_Read, &pdata);
        NvBufferMemSyncForCpu(fd, 0, &pdata);
        cv::Mat imgbuf = cv::Mat(PREVIEW_SIZE.height(),
                                 PREVIEW_SIZE.width(),
                                 CV_8UC4, pdata);
        cv::Mat display_img;
        cvtColor(imgbuf, display_img, cv::COLOR_RGB2BGR);
        NvBufferMemUnMap(fd, 0, &pdata);
        cv::imshow("img", display_img);
        cv::waitKey(1);
        return true;
    }

    /*******************************************************************************
     * Capture Consumer thread:
     *   Read frames from the OutputStream and save it to JPEG file.
     ******************************************************************************/
    class CaptureConsumerThread : public ConsumerThread
    {
    public:
        CaptureConsumerThread(OutputStream *stream);
        ~CaptureConsumerThread();

    private:
        bool threadInitialize();
        bool threadShutdown();
        bool processV4L2Fd(int32_t fd, uint64_t frameNumber);

        NvJPEGEncoder *m_JpegEncoder;
        unsigned char *m_OutputBuffer;
    };

    CaptureConsumerThread::CaptureConsumerThread(OutputStream *stream) : ConsumerThread(stream),
                                                                         m_JpegEncoder(NULL),
                                                                         m_OutputBuffer(NULL)
    {
    }

    CaptureConsumerThread::~CaptureConsumerThread()
    {
        if (m_JpegEncoder)
            delete m_JpegEncoder;

        if (m_OutputBuffer)
            delete[] m_OutputBuffer;
    }

    bool CaptureConsumerThread::threadInitialize()
    {
        if (!ConsumerThread::threadInitialize())
            return false;

        m_OutputBuffer = new unsigned char[JPEG_BUFFER_SIZE];
        if (!m_OutputBuffer)
            return false;

        m_JpegEncoder = NvJPEGEncoder::createJPEGEncoder("jpenenc");
        if (!m_JpegEncoder)
            ORIGINATE_ERROR("Failed to create JPEGEncoder.");

        if (DO_STAT)
            m_JpegEncoder->enableProfiling();

        return true;
    }

    bool CaptureConsumerThread::threadShutdown()
    {
        if (DO_STAT)
            m_JpegEncoder->printProfilingStats();

        return ConsumerThread::threadShutdown();
    }

    bool CaptureConsumerThread::processV4L2Fd(int32_t fd, uint64_t frameNumber)
    {
        // char filename[FILENAME_MAX];
        // sprintf(filename, "output%03u.jpg", (unsigned)frameNumber);

        // std::ofstream *outputFile = new std::ofstream(filename);
        // if (outputFile)
        // {
        //     unsigned long size = JPEG_BUFFER_SIZE;
        //     unsigned char *buffer = m_OutputBuffer;
        //     m_JpegEncoder->encodeFromFd(fd, JCS_YCbCr, &buffer, size);
        //     outputFile->write((char *)buffer, size);
        //     delete outputFile;
        // }

        return true;
    }

    /**
     * Argus Producer thread:
     *   Opens the Argus camera driver, creates two OutputStreams to output to
     *   Preview Consumer and Capture Consumer respectively, then performs repeating
     *   capture requests for CAPTURE_TIME seconds before closing the producer and
     *   Argus driver.
     *
     * @param renderer     : render handler for camera preview
     */
    // static bool execute(NvEglRenderer *renderer)
    static bool execute()
    {
        /*
            class Argus::OutputStream
                - Object representing an output stream capable of receiving image frames from a capture.
                - OutputStream objects are used as the destination for image frames output from capture requests.
                - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.

            Argus::UniqueObj< T > Class Template Reference
                - Movable smart pointer
                    - Mimics std::unique_ptr
        */
        UniqueObj<OutputStream> captureStream;

        // Set to NULL to ensure well defined value
        CaptureConsumerThread *captureConsumerThread = NULL;

        /*
            Create the CameraProvider object and get the core interface

            class Argus::CameraProvider
                - Object providing the entry point to the libargus runtime.
                - It provides methods for querying the cameras in the system and for creating camera devices.

            static Argus::CameraProvider *Argus::CameraProvider::create(Argus::Status *status = (Argus::Status *)__null)
                - Creates and returns a new CameraProvider.
                - If a CameraProvider object has already been created, this method will return a pointer to that object.
                - Parameters:
                    status – Optional pointer to return success/status of the call.

            UniqueObj<CameraProvider>
                - Smart pointer
                - Takes ownership of 'CameraProvider' object that's returned by create()
        */

        // Smart pointer takes ownership of CameraProvider object
        UniqueObj<CameraProvider> cameraProvider = UniqueObj<CameraProvider>(CameraProvider::create());

        // If successful, returns pointer to 'ICameraProvider' interface
        // If not, returns nullptr
        /*
            interface_cast<>
                - Similar to dynamic_cast
                - Performs downcasting of pointers/references to objects in inheritance hierarchies
                - Designed for polymorphic classes that have at least one virtual function
                - If the conversion is not safe (the object does not expose the requested interface)
                    - It returns NULL (unlike dynamic_cast on references, it does not throw)
                - interface_cast<> being used on 'cameraProvider' to get a pointer to the 'ICameraProvider' interface
                - Why tho?
                    - Ownership management
                        - UniqueObj smart pointer OWNS the 'CameraProvider' object
                        - To use functionality specific to the 'ICameraProvider' interface like with the 'getCameraDevices()' function, you need to get a pointer to the 'ICameraProvider' interface
                    - Interface Usage
                        - In many object-oriented frameworks
                            - Objects expose functionality through interfaces vs concrete classes
                            - By getting the interface pointer 'ICameraProvider' you can access the methods/member functions defined in that interface regardless of the concrete type of the object
        */
        ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
        if (!iCameraProvider)
            ORIGINATE_ERROR("Failed to create CameraProvider");
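
        /*
            interface_cast<> itself is Argus-specific, but the cast-then-null-check
            pattern it enforces can be sketched with plain dynamic_cast. The toy
            types below are made up for illustration and are not Argus types:
        */

```cpp
#include <cassert>

// Toy hierarchy showing the pattern: a cast that yields nullptr when the
// object does not support the requested interface, so every call site must
// null-check before use (as the code above does with iCameraProvider).
struct IInterface { virtual ~IInterface() {} };
struct ICameraLike : IInterface { };
struct CameraObj : ICameraLike { };
struct OtherObj : IInterface { };

ICameraLike *asCameraLike(IInterface *obj)
{
    // dynamic_cast plays the role interface_cast<> plays in Argus.
    return dynamic_cast<ICameraLike *>(obj);
}
```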

        /* Get the camera devices */
        std::vector<CameraDevice *> cameraDevices;
        /*
            virtual Argus::Status Argus::ICameraProvider::getCameraDevices(std::vector<Argus::CameraDevice *> *devices) const
                - Returns the list of camera devices that are exposed by the provider.
                - This includes devices that may already be in use by active CaptureSessions,
                    - And it's the application's responsibility to check
                        - Device availability
                        - Or handle any errors returned when CaptureSession creation fails due to a device already being in use.
                - Parameters:
                    - devices – A vector that will be populated by the available devices.
                - Returns:
                    - success/status of the call.
        */
        // Pointer to ICameraProvider
        iCameraProvider->getCameraDevices(&cameraDevices);
        if (cameraDevices.size() == 0)
            ORIGINATE_ERROR("No cameras available");
        else
        {
            std::cout << "Number of cameras : " << cameraDevices.size() << std::endl;
        }

        ICameraProperties *iCameraProperties = interface_cast<ICameraProperties>(cameraDevices[0]);
        if (!iCameraProperties)
            ORIGINATE_ERROR("Failed to get ICameraProperties interface");

        /* Create the capture session using the first device and get the core interface */

        /*
            class Argus::CaptureSession
                - Object that controls all operations on a single sensor.
                - A capture session is bound to a single sensor (or, in future, a group of synchronized sensors)
                - Provides methods to perform captures on that sensor (via the ICaptureSession interface).

            virtual Argus::CaptureSession *Argus::ICameraProvider::createCaptureSession(Argus::CameraDevice *device, Argus::Status *status = (Argus::Status *)__null)
                - Creates and returns a new CaptureSession using the given device.
                - STATUS_UNAVAILABLE will be placed into status if the device is already in use.
                - Parameters:
                    - device – The device to use for the CaptureSession.
                    - status – Optional pointer to return success/status of the call.
                - Returns:
                    - The new CaptureSession
                    - Or NULL if an error occurred.
        */
        UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[0]));
        ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
        if (!iCaptureSession)
            ORIGINATE_ERROR("Failed to get ICaptureSession interface");

        /* Initialize the settings of the output stream */
        PRODUCER_PRINT("Creating output stream\n");

        /*
            Initialize the settings of the output stream

            UniqueObj<OutputStreamSettings> --> class Argus::OutputStreamSettings
                - Container for settings used to configure/create an OutputStream.
                - The interfaces and configuration supported by these settings objects depend on the StreamType that was provided during settings creation (see ICaptureSession::createOutputStreamSettings).
                - These objects are passed to ICaptureSession::createOutputStream to create OutputStream objects, after which they may be destroyed.

            virtual Argus::OutputStreamSettings *Argus::ICaptureSession::createOutputStreamSettings(const Argus::StreamType &type, Argus::Status *status = (Argus::Status *)__null)
                - Creates an OutputStreamSettings object that is used to configure the creation of an OutputStream (see createOutputStream).
                - The type of OutputStream that will be configured and created by these settings are determined by the StreamType.
                - Parameters:
                    - type – The type of the OutputStream to configure/create with these settings.
                    - status – An optional pointer to return success/status.
                - Returns:
                    - The newly created OutputStreamSettings, or NULL on failure.

            IEGLOutputStreamSettings
                - Interface that exposes settings used for EGLStream-linked OutputStream creation
        */
        UniqueObj<OutputStreamSettings> streamSettings(iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
        IEGLOutputStreamSettings *iEglStreamSettings = interface_cast<IEGLOutputStreamSettings>(streamSettings);
        if (!iEglStreamSettings)
            ORIGINATE_ERROR("Failed to get IEGLOutputStreamSettings interface");

        iEglStreamSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
        // iEglStreamSettings->setEGLDisplay(renderer->getEGLDisplay());
        iEglStreamSettings->setResolution(PREVIEW_SIZE);
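
        /*
            The PIXEL_FMT_YCbCr_420_888 format selected above stores 1.5 bytes per
            pixel: a full-resolution Y plane plus quarter-resolution Cb and Cr
            planes. A rough sketch of the payload size (ycbcr420SizeBytes is a
            hypothetical helper; real NvBuffer allocations may be larger because
            of per-plane pitch alignment):
        */

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Approximate payload of a YCbCr 4:2:0 image: w*h luma bytes plus two
// (w/2)*(h/2) chroma planes = 1.5 bytes per pixel. Ignores the pitch
// alignment a real NvBuffer allocation may add.
size_t ycbcr420SizeBytes(uint32_t width, uint32_t height)
{
    size_t luma = static_cast<size_t>(width) * height;
    size_t chroma = 2 * (static_cast<size_t>(width) / 2) * (height / 2);
    return luma + chroma;
}
```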

        /*
            Based on above streamSettings, create the preview stream, and capture stream if JPEG Encode is required

            class Argus::OutputStream
                - Object representing an output stream capable of receiving image frames from a capture.
                - OutputStream objects are used as the destination for image frames output from capture requests.
                    - In this case the object is previewStream
                - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.


            virtual Argus::OutputStream *Argus::ICaptureSession::createOutputStream(const Argus::OutputStreamSettings *settings, Argus::Status *status = (Argus::Status *)__null)
                - Creates an OutputStream object using the settings configured by an OutputStreamSettings object (see createOutputStreamSettings).
                - Parameters:
                    - settings – The settings to use for the new output stream.
                    - status – An optional pointer to return success/status.
                - Returns:
                    - The newly created OutputStream, or NULL on failure.

        */
        UniqueObj<OutputStream> previewStream(iCaptureSession->createOutputStream(streamSettings.get()));
        if (DO_JPEG_ENCODE)
        {
            // static Size2D<uint32_t> CAPTURE_SIZE(1920, 1080);
            iEglStreamSettings->setResolution(CAPTURE_SIZE);

            /*
                class Argus::OutputStream
                    - Object representing an output stream capable of receiving image frames from a capture.
                    - OutputStream objects are used as the destination for image frames output from capture requests.
                    - The operation of a stream, the source for its buffers, and the interfaces it supports depend on the StreamType of the stream.
            */
            captureStream = (UniqueObj<OutputStream>)iCaptureSession->createOutputStream(streamSettings.get());
        }

        /*
            Launch the FrameConsumer thread to consume frames from the OutputStream

            NvEglRenderer *renderer
                - Argus Producer thread: Opens the Argus camera driver,
                    - creates two OutputStreams to output to Preview Consumer and Capture Consumer respectively,
                    - Then performs repeating capture requests for CAPTURE_TIME seconds before closing the producer and Argus driver.
                - Parameters:
                    - renderer – : render handler for camera preview

            PreviewConsumerThread
                - Read frames from the OutputStream and render it on display.

        */
        PRODUCER_PRINT("Launching consumer thread\n");
        /*
            PreviewConsumerThread is a subclass of ConsumerThread
                - previewStream gets passed (the renderer argument is commented out in this modified version)
                - Since PreviewConsumerThread is a subclass of ConsumerThread, the ConsumerThread constructor runs first and initializes some member variables
                - previewConsumerThread.initialize() then gets called to set up frame consumers and other needed resources for processing frames
                - previewConsumerThread.waitRunning() waits for consumer thread to connect to the stream
                - Once the consumer thread is running and connected, it enters the ConsumerThread::threadExecute() loop to acquire frames from the stream and process them
                    - Loop keeps running until consumer thread is requested to shutdown
                    - Typically through requestShutdown()
        */
        // PreviewConsumerThread previewConsumerThread(previewStream.get(), renderer);
        PreviewConsumerThread previewConsumerThread(previewStream.get());

        PROPAGATE_ERROR(previewConsumerThread.initialize());
        if (DO_JPEG_ENCODE)
        {
            captureConsumerThread = new CaptureConsumerThread(captureStream.get());
            PROPAGATE_ERROR(captureConsumerThread->initialize());
        }

        /* Wait until the consumer thread is connected to the stream */
        PROPAGATE_ERROR(previewConsumerThread.waitRunning());
        if (DO_JPEG_ENCODE)
            PROPAGATE_ERROR(captureConsumerThread->waitRunning());

        /* Create capture request and enable its output stream */
        UniqueObj<Request> request(iCaptureSession->createRequest());
        IRequest *iRequest = interface_cast<IRequest>(request);
        if (!iRequest)
            ORIGINATE_ERROR("Failed to create Request");
        iRequest->enableOutputStream(previewStream.get());
        if (DO_JPEG_ENCODE)
            iRequest->enableOutputStream(captureStream.get());

        ISensorMode *iSensorMode;
        std::vector<SensorMode *> sensorModes;
        iCameraProperties->getBasicSensorModes(&sensorModes);
        if (sensorModes.size() == 0)
            ORIGINATE_ERROR("Failed to get sensor modes");

        PRODUCER_PRINT("Available Sensor modes :\n");
        for (uint32_t i = 0; i < sensorModes.size(); i++)
        {
            iSensorMode = interface_cast<ISensorMode>(sensorModes[i]);
            Size2D<uint32_t> resolution = iSensorMode->getResolution();
            PRODUCER_PRINT("[%u] W=%u H=%u\n", i, resolution.width(), resolution.height());
        }

        ISourceSettings *iSourceSettings = interface_cast<ISourceSettings>(iRequest->getSourceSettings());
        if (!iSourceSettings)
            ORIGINATE_ERROR("Failed to get ISourceSettings interface");

        /* Check and set sensor mode */
        if (SENSOR_MODE >= sensorModes.size())
            ORIGINATE_ERROR("Sensor mode index is out of range");
        SensorMode *sensorMode = sensorModes[SENSOR_MODE];
        iSensorMode = interface_cast<ISensorMode>(sensorMode);
        iSourceSettings->setSensorMode(sensorMode);

        /* Check fps */
        Range<uint64_t> sensorDuration(iSensorMode->getFrameDurationRange());
        Range<uint64_t> desireDuration(1e9 / CAPTURE_FPS + 0.9);
        if (desireDuration.min() < sensorDuration.min() ||
            desireDuration.max() > sensorDuration.max())
        {
            PRODUCER_PRINT("Requested FPS out of range. Fall back to 30\n");
            CAPTURE_FPS = 30;
        }
        /* Set the fps */
        iSourceSettings->setFrameDurationRange(Range<uint64_t>(1e9 / CAPTURE_FPS));
        // renderer->setFPS((float)CAPTURE_FPS);
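
        /*
            The fps check above works in nanosecond frame durations. A minimal
            sketch of that conversion (fpsToDurationNs / fpsWithinSensorRange are
            made-up names; the sample itself just inlines 1e9 / CAPTURE_FPS):
        */

```cpp
#include <cassert>
#include <cstdint>

// Argus expresses frame rate as a frame-duration Range in nanoseconds, so a
// requested fps is converted with 1e9 / fps and then checked against the
// sensor mode's supported duration range (min..max, in ns).
uint64_t fpsToDurationNs(uint32_t fps)
{
    return static_cast<uint64_t>(1e9 / fps);
}

bool fpsWithinSensorRange(uint32_t fps, uint64_t minNs, uint64_t maxNs)
{
    uint64_t d = fpsToDurationNs(fps);
    return d >= minNs && d <= maxNs;
}
```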

        /* Submit capture requests. */
        PRODUCER_PRINT("Starting repeat capture requests.\n");
        if (iCaptureSession->repeat(request.get()) != STATUS_OK)
            ORIGINATE_ERROR("Failed to start repeat capture request");

        /* Wait for CAPTURE_TIME seconds. */
        sleep(CAPTURE_TIME);

        /* Stop the repeating request and wait for idle. */
        iCaptureSession->stopRepeat();
        iCaptureSession->waitForIdle();

        /* Destroy the output stream to end the consumer thread. */
        previewStream.reset();
        if (DO_JPEG_ENCODE)
            captureStream.reset();

        /* Wait for the consumer thread to complete. */
        PROPAGATE_ERROR(previewConsumerThread.shutdown());
        if (DO_JPEG_ENCODE)
        {
            PROPAGATE_ERROR(captureConsumerThread->shutdown());
            delete captureConsumerThread;
        }

        PRODUCER_PRINT("Done -- exiting.\n");

        return true;
    }

}; /* namespace ArgusSamples */

static void printHelp()
{
    printf("Usage: camera_jpeg_capture [OPTIONS]\n"
           "Options:\n"
           "  --pre-res     Preview resolution WxH [Default 640x480]\n"
           "  --img-res     Capture resolution WxH [Default 1920x1080]\n"
           "  --cap-time    Capture time in sec    [Default 1]\n"
           "  --fps         Frame per second       [Default 30]\n"
           "  --sensor-mode Sensor mode            [Default 0]\n"
           "  --disable-jpg Disable JPEG encode    [Default Enable]\n"
           "  -s            Enable profiling\n"
           "  -v            Enable verbose message\n"
           "  -h            Print this help\n");
}

static bool parseCmdline(int argc, char *argv[])
{
    enum
    {
        OPTION_PREVIEW_RESOLUTION = 0x100,
        OPTION_CAPTURE_RESOLUTION,
        OPTION_CAPTURE_TIME,
        OPTION_FPS,
        OPTION_SENSOR_MODE,
        OPTION_DISABLE_JPEG_ENCODE,
    };

    static struct option longOptions[] =
        {
            {"pre-res", 1, NULL, OPTION_PREVIEW_RESOLUTION},
            {"img-res", 1, NULL, OPTION_CAPTURE_RESOLUTION},
            {"cap-time", 1, NULL, OPTION_CAPTURE_TIME},
            {"fps", 1, NULL, OPTION_FPS},
            {"sensor-mode", 1, NULL, OPTION_SENSOR_MODE},
            {"disable-jpg", 0, NULL, OPTION_DISABLE_JPEG_ENCODE},
            {0},
        };

    int c, w, h;
    uint32_t t;
    while ((c = getopt_long(argc, argv, "s::v::h", longOptions, NULL)) != -1)
    {
        switch (c)
        {
        case OPTION_PREVIEW_RESOLUTION:
            if (sscanf(optarg, "%dx%d", &w, &h) != 2)
                return false;
            PREVIEW_SIZE.width() = w;
            PREVIEW_SIZE.height() = h;
            break;
        case OPTION_CAPTURE_RESOLUTION:
            if (sscanf(optarg, "%dx%d", &w, &h) != 2)
                return false;
            CAPTURE_SIZE.width() = w;
            CAPTURE_SIZE.height() = h;
            break;
        case OPTION_CAPTURE_TIME:
            if (sscanf(optarg, "%u", &t) != 1)
                return false;
            CAPTURE_TIME = t;
            break;
        case OPTION_FPS:
            if (sscanf(optarg, "%d", &w) != 1)
                return false;
            CAPTURE_FPS = w;
            break;
        case OPTION_SENSOR_MODE:
            if (sscanf(optarg, "%u", &t) != 1)
                return false;
            SENSOR_MODE = t;
            break;
        case OPTION_DISABLE_JPEG_ENCODE:
            DO_JPEG_ENCODE = false;
            break;
        case 's':
            DO_STAT = true;
            break;
        case 'v':
            VERBOSE_ENABLE = true;
            break;
        default:
            return false;
        }
    }
    return true;
}
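
/*
    The --pre-res/--img-res handling above hinges on sscanf returning the
    number of fields it converted. A small sketch of that parse
    (parseResolution is a hypothetical helper, not part of the sample):
*/

```cpp
#include <cassert>
#include <cstdio>

// Returns true and fills w/h only when arg matches "<int>x<int>".
// sscanf reports how many fields converted, so "640" (returns 1) and
// "abc" (returns 0) are both rejected.
bool parseResolution(const char *arg, int *w, int *h)
{
    return std::sscanf(arg, "%dx%d", w, h) == 2;
}
```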

int main(int argc, char *argv[])
{
    if (!parseCmdline(argc, argv))
    {
        printHelp();
        return EXIT_FAILURE;
    }

    // NvEglRenderer *renderer = NvEglRenderer::createEglRenderer("renderer0", PREVIEW_SIZE.width(),
    //                                                            PREVIEW_SIZE.height(), 0, 0);
    // if (!renderer)
    //     ORIGINATE_ERROR("Failed to create EGLRenderer.");

    /*
        Run the Argus producer: set up the camera, streams, and consumer
        threads, then capture for CAPTURE_TIME seconds.
    */
    // if (!ArgusSamples::execute(renderer))
    if (!ArgusSamples::execute())
        return EXIT_FAILURE;

    // delete renderer;

    return EXIT_SUCCESS;
}
