L4T 24.2: using libargus on a headless system

I have a board which doesn’t have a display.

We don’t need one for our application, but we do need to capture video from cameras.

Currently we have CSI cameras that only output Bayer data, and we would like to convert it to RGB or YUV. It seems like our only options are:

  • Convert the images using an ISP within the TX1
  • Use the GPU to convert the images
  • Switch to an image sensor with a built in ISP

I would like to see if the ISP within the TX1 is a viable option, so I’m looking into libargus, in particular the ‘oneShot’ sample application. I’ve built it, and when I run it I see this error:

ubuntu@tegra-ubuntu:~/tegra_multimedia_api/argus/build/samples/oneShot$ ./argus_oneshot 
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 75)
EGL :Referencing unknown handle: (nil) type=5
(Argus) Error NotInitialized:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 109)
(Argus) Error NotInitialized:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 48)
Failed to initialize Consumer

I’ve modified the code by adding a number of ‘printf’ statements to pinpoint where the error occurs:

$ ./one_shot
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
Sensor_LoadModeModeType: mode 0: Failed to load pixeltype
Sensor_LoadModePixelPhase: mode 0: Failed to load pixeltype
Sensor_LoadModeModeType: mode 1: Failed to load pixeltype
Sensor_LoadModePixelPhase: mode 1: Failed to load pixeltype
Getting a vector of camera devices
Create capture session
cast the capture session
Created capture session
Configure stream setting
Set stream pixel format
Set stream size
Create output stream
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 75)
(Argus) Error NotSupported: Failed to get default display (in src/api/OutputStreamImpl.cpp, function initialize(), line 74)
(Argus) Error NotSupported: (propagating from src/api/CaptureSessionImpl.cpp, function createOutputStreamInternal(), line 526)
Cast output stream
Cannot get OutputStream Interface

I understand that there are some other issues with the camera itself, and I plan to address those in another post, but for now I would like to focus on the display problem.

The above error points me to these lines:

...
    printf ("Create output stream\n");
    Argus::UniqueObj<Argus::OutputStream> stream(
        iSession->createOutputStream(streamSettings.get()));

    printf ("Cast output stream\n");

    Argus::IStream *iStream = Argus::interface_cast<Argus::IStream>(stream);
    EXIT_IF_NULL(iStream, "Cannot get OutputStream Interface");
    printf ("Created output stream\n");
...

I looked in dmesg and there were no messages related to this.

It looks like Argus attempts to find the default display output (like HDMI). Is there any way to bypass this behavior and let me write directly to a file?

Basically I would like to isolate my problem down to:

‘camera → ISP → file’

and not:

‘camera → ISP → display → file’

I am still learning about this, but it seems as though I need an EGLStream to capture the output of the ISP and send it to something like a file. Is this true?
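For reference, the rest of the oneShot sample (as I read it) suggests the consumer side just pulls a frame from the stream and writes a JPEG straight to disk, with no display in the path. Simplified, continuing from the snippet above, with error checks dropped:

#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>

// Create a FrameConsumer attached to the OutputStream created above.
Argus::UniqueObj<EGLStream::FrameConsumer> consumer(
    EGLStream::FrameConsumer::create(stream.get()));
EGLStream::IFrameConsumer *iFrameConsumer =
    Argus::interface_cast<EGLStream::IFrameConsumer>(consumer);

// Submit the capture request and wait for the single frame to arrive.
// ('request' comes from iSession->createRequest() with the stream enabled,
// as in the sample.)
iSession->capture(request.get());
Argus::UniqueObj<EGLStream::Frame> frame(
    iFrameConsumer->acquireFrame(Argus::TIMEOUT_INFINITE, NULL));
EGLStream::IFrame *iFrame = Argus::interface_cast<EGLStream::IFrame>(frame);

// Grab the image from the frame and write it out as a JPEG: camera -> ISP -> file.
EGLStream::Image *image = iFrame->getImage();
EGLStream::IImageJPEG *iImageJPEG =
    Argus::interface_cast<EGLStream::IImageJPEG>(image);
iImageJPEG->writeJPEG("argus_oneShot.jpg");

So as far as I can tell the file-writing part never needs a display; the failure happens earlier, when the OutputStream is created and Argus tries to open the default EGLDisplay.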

I also looked inside the Argus header files, in particular CaptureSession.h and Settings.h, and there doesn’t seem to be a way to tell the output streams not to look for a display. Perhaps I’m thinking about this incorrectly.
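One thing I plan to experiment with is creating an EGLDisplay without any X server through the EGL_EXT_platform_device extension, and seeing whether Argus can be pointed at it instead of the default display. I don’t know yet whether the 24.2 headers expose a way to hand this display to Argus, so this is just a sketch of the EGL side:

#include <EGL/egl.h>
#include <EGL/eglext.h>

// Sketch: obtain an EGLDisplay on a headless system via EGL_EXT_platform_device.
// Whether Argus in L4T 24.2 can actually use this display is an open question.
EGLDisplay getHeadlessDisplay()
{
    PFNEGLQUERYDEVICESEXTPROC eglQueryDevicesEXT =
        (PFNEGLQUERYDEVICESEXTPROC)eglGetProcAddress("eglQueryDevicesEXT");
    PFNEGLGETPLATFORMDISPLAYEXTPROC eglGetPlatformDisplayEXT =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC)eglGetProcAddress("eglGetPlatformDisplayEXT");
    if (!eglQueryDevicesEXT || !eglGetPlatformDisplayEXT)
        return EGL_NO_DISPLAY;

    // Enumerate EGL devices; the Tegra GPU shows up here even with nothing attached.
    EGLDeviceEXT devices[8];
    EGLint numDevices = 0;
    if (!eglQueryDevicesEXT(8, devices, &numDevices) || numDevices == 0)
        return EGL_NO_DISPLAY;

    // Create a display from the first device instead of the default (X11) display.
    EGLDisplay dpy = eglGetPlatformDisplayEXT(EGL_PLATFORM_DEVICE_EXT, devices[0], NULL);
    if (dpy == EGL_NO_DISPLAY)
        return EGL_NO_DISPLAY;

    EGLint major = 0, minor = 0;
    if (!eglInitialize(dpy, &major, &minor))
        return EGL_NO_DISPLAY;
    return dpy;
}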

Thanks in advance for any help.

Dave

hello cospan,

please refer to the link below for the JetPackCameraAPI webinar.

According to slide #71,
you should be able to copy the image to native NvBuffers.
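Something like the pattern below from the multimedia API samples; please check the headers and samples in your release, as the exact interface and enum names may differ:

#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <EGLStream/NV/ImageNativeBuffer.h>
#include <nvbuf_utils.h>

// Continuing from an acquired frame (iFrame), as in the oneShot sample.
EGLStream::Image *image = iFrame->getImage();

// Copy the ISP output into a native NvBuffer (dmabuf fd); no display is involved.
EGLStream::NV::IImageNativeBuffer *iNativeBuffer =
    Argus::interface_cast<EGLStream::NV::IImageNativeBuffer>(image);

int dmabuf_fd = -1;
NvBufferCreate(&dmabuf_fd, 1920, 1080,              // placeholder resolution
               NvBufferLayout_Pitch, NvBufferColorFormat_YUV420);
iNativeBuffer->copyToNvBuffer(dmabuf_fd);
// From here the buffer can be mapped and written to a file, or handed to the
// hardware encoder, without ever touching a display.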
thanks

Thanks for getting back to me. I remember something about NvBuffers but I didn’t understand their significance. I’ll look into this.

Thanks again,

Dave