CSI camera acquisition: multiple problems when capturing with more than 3 cameras (libArgus ; L4T-32.2 ; JetPack 4.2)

Hello,

We have an application that performs IMX274 CSI camera acquisition on a Jetson TX2(i) with ConnectTech’s Spacely carrier board, using libArgus. This application works well with 3 cameras, both in the CSI 4-lane configuration (which allows up to 3 cameras) and in the CSI 2-lane configuration (which allows up to 6 cameras).

However, using more than 3 cameras (up to 6 in the CSI 2-lane configuration) leads to multiple, sometimes critical, problems depending on the number of capture sessions, output streams, and cameras. We sometimes observe different behaviors without changing anything in the configuration, just by restarting the application. We can get any of the following behaviors:
- everything works well
- Argus produces errors/warnings but valid data is produced
- Argus produces errors/warnings and does not produce any data
- Argus refuses to create capture session
- Argus produces Segmentation Fault
- Argus produces errors/warnings, does not produce data and gets stuck, preventing our application from exiting
- Argus produces errors/warnings and data is produced, but the data is actually always from camera #0

I am providing detailed information below about which configuration produces which behavior. I will provide the LOGs in the next post so as not to flood this one. I am also providing our hardware + OS configuration. I would like to highlight that we took all necessary precautions to avoid mistakes such as setting the DIP switches incorrectly or flashing the wrong configuration (ConnectTech BSP).


Hardware + OS configuration:

  • Platform = Jetson TX2i
  • Carrier board = ConnectTech Spacely
    • DIP Switches = TX2 ; CSIx2
  • Cameras : 6x Leopard Imaging IMX274 csi
  • OS : L4T-32.2 (JetPack-4.2.1)
  • BSP ConnectTech : V126 (support for L4T-32.2)
    • support for Spacely
    • support for TX2i
    • support for 6x IMX274 (CSI 2-lanes)

Details:
Our application allows us to choose at startup whether to use 1 capture session for all the cameras or 1 capture
session per camera (1 camera means 1 output stream in libArgus’ terminology). Both modes were tested successfully with
3 cameras. Our application runs in several threads (producer; consumer; data processing…) and allows us to select
which parts we want to use.
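
For reference, the split between the two modes is pure grouping logic, sketched below in illustrative Python (the function name is hypothetical; in the actual C++ application, the first mode passes the full device vector to a single Argus::ICameraProvider::createCaptureSession() call, and the second mode makes one single-device call per camera):

```python
def plan_capture_sessions(camera_ids, one_session_per_camera):
    """Group camera ids into capture sessions: either one single-camera
    session per camera, or one session holding every camera."""
    if one_session_per_camera:
        return [[cam] for cam in camera_ids]
    return [list(camera_ids)] if camera_ids else []
```

For example, 4 cameras give either one session of 4 devices (the configuration that returns data only from camera #0) or 4 sessions of 1 device each (the configuration that works).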

Following behaviors were diagnosed by saving frames in JPEG.

  • 1 capture session ; 3 cameras : everything works well
  • 1 capture session ; 4 cameras : Argus produces errors/warnings and data is actually always from camera #0 (even though the DMA buffers are not the same!)
  • 1 capture session ; 5 cameras : Argus refuses to create the capture session and returns properly (Argus::ICameraProvider::createCaptureSession returns a NULL capture session)
  • 1 capture session ; 6 cameras : Argus produces a Segmentation Fault when trying to create the capture session (gdb backtrace provided along with the LOGs)
  • 3 capture sessions ; 3 cameras : everything works well
  • 4 capture sessions ; 4 cameras : everything works well
  • 5 capture sessions ; 5 cameras : multiple possible behaviors:
    1. works well (sometimes with Argus errors/warnings)
    2. Argus produces errors/warnings and gets stuck
  • 6 capture sessions ; 6 cameras : (same as above)

Notes:
1. We observe the same behaviors with the other data-processing options (retrieving DMA buffers ; compositing ; converting data format…).
2. We also observe the same behaviors when activating only the EGL producer (data processing and EGL consumer not even initialized), except that when no error occurs we cannot check the validity of the data.
3. Our application always checks the number of available cameras and the available camera modes. Camera modules are found in “/proc/device-tree/tegra-camera-platform/modules/module#” (# = [0;5]).
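
As a side note, the module enumeration mentioned in note 3 can be sketched like this (illustrative Python, not our actual C++ code; the base path is the one quoted above, and on a machine without that device-tree node the function simply returns an empty list):

```python
import os

def list_camera_modules(base="/proc/device-tree/tegra-camera-platform/modules"):
    """Return the module# entries (module0, module1, ...) sorted by index."""
    if not os.path.isdir(base):
        return []  # not a Jetson, or the camera platform node is absent
    modules = [d for d in os.listdir(base) if d.startswith("module")]
    return sorted(modules, key=lambda d: int(d[len("module"):]))
```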


I have sent the same message to ConnectTech’s support and I am waiting for their answer.

Where do those problems come from? How can we fix them?
Is there any sample/app that we could use to test camera acquisition with 6 cameras? The samples provided in the Tegra Multimedia API do not use more than 2 cameras at a time.

Thank you in advance for any information you might share.

LOGs:

>>> 1 capture session ; 4 cameras

[START REPEAT CAPTURE]
ClipHelper
allowIspClipping: true
maxIspDownscale: 4.0:1 4000
maxIspOutWidth: 6144,4096
ispIn:  (3840 x 2160)
PRU enabled: false, interleaved input: (0 x 0)
postProcessingSize: (3840 x 2160)
postIspClip: (0.00,0.00, 1.00,1.00)
ispOut[0]: (3840 x 2160)
ispClip[0]: (0.00,0.00, 1.00,1.00)
ispOut[1]: (0 x 0)
ispClip[1]: (0.00,0.00, 1.00,1.00)
out[0] 3840x2160 req (0.00,0.00, 1.00,1.00) final (0.00,0.00, 1.00,1.00) isp from isp[0]
[CONSUMER CONNECTS TO EGL STREAM #0]

StageGroup 0x7f40003b60 parent=(nil) 3840x2160 (1 exposure) obufMask=f finalMask=0
StageGroup 0x7f40003e80 parent=0x7f40003b60 3840x2160 (1 exposure) obufMask=f finalMask=f
  stages[0] = 36 SensorISPCaptureStage(in = 12, outA= 0, outB = 12, outThumb = 4, outMeta = 5, outStats = 6) routed
m_bufStates[0] = 0 attached output done readOrder=0 writeOrder=1 group=0x7f40003e80 fbs=isp0
 3840x2160 BL U8_V8_ER 420SP
m_bufStates[1] = 1 attached output done readOrder=0 writeOrder=1 group=0x7f40003b60 fbs=none
 3840x2160 BL U8_V8_ER 420SP
[CONSUMER CONNECTS TO EGL STREAM #1]
m_bufStates[2] = 2 attached output done readOrder=0 writeOrder=1 group=0x7f40003b60 fbs=none
 3840x2160 BL U8_V8_ER 420SP
m_bufStates[3] = 3 attached output done readOrder=0 writeOrder=1 group=0x7f40003b60 fbs=none
 3840x2160 BL U8_V8_ER 420SP
m_bufStates[4] = 4 attached readOrder=0 writeOrder=1 group=0x7f40003e80 AF fbs=none
 640x360 Pitch Y8_ER 420
m_bufStates[5] = 5 readOrder=0 writeOrder=1 group=0x7f40003e80 fbs=none
 3840x1 Pitch NonColor8
m_bufStates[6] = 6 readOrder=0 writeOrder=1 group=0x7f40003e80 fbs=none
[CONSUMER CONNECTS TO EGL STREAM #2]
 524288x1 Pitch NonColor8
GraphHelper blit pixel count=49766400 != ClipHelper blit pixel count=0

[CONSUMER CONNECTS TO EGL STREAM #3]
[START JPEG ENCODING/WRITING]

→ everything fine except data is from always camera #0


>>> 1 capture session ; 5 cameras

[TRYING TO CREATE 1 CAPTURE SESSION WITH A LIST OF 5 CAMERAS DEVICES]
SCF: Error BadValue: device not found (in src/common/DeviceRegistry.h, function assign(), line 142)
SCF: Error BadValue:  (propagating from src/api/CameraDriver.cpp, function createSession(), line 525)
(Argus) Error BadValue:  (propagating from src/api/CaptureSessionImpl.cpp, function initialize(), line 120)
(Argus) Error BadValue:  (propagating from src/api/CameraProviderImpl.cpp, function createCaptureSession(), line 250)
(NvCameraUtils) Error InvalidState: Mutex not initialized (/dvs/git/dirty/git-master_linux/camera/argus/src/api/CaptureSessionImpl.cpp:197) (in Mutex.cpp, function lock(), line 79)
(Argus) Error InvalidState: Element not found (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamerautils/inc/Vector.h, function remove(), line 172)
(NvCameraUtils) Error InvalidState: Mutex has not been initialized (in Mutex.cpp, function unlock(), line 88)
[APPLICATION DIAGNOSES THAT THE CAPTURE SESSION POINTER IS NULL ; EXITING...]

>>> 1 capture session ; 6 cameras

[TRYING TO CREATE 1 CAPTURE SESSION WITH A LIST OF 6 CAMERAS DEVICES]
SCF: Error BadValue: device not found (in src/common/DeviceRegistry.h, function assign(), line 142)
SCF: Error BadValue:  (propagating from src/api/CameraDriver.cpp, function createSession(), line 525)
(Argus) Error BadValue:  (propagating from src/api/CaptureSessionImpl.cpp, function initialize(), line 120)
(Argus) Error BadValue:  (propagating from src/api/CameraProviderImpl.cpp, function createCaptureSession(), line 250)
Segmentation fault (core dumped)

gdb backtrace:

Thread 1 "our_application" received signal SIGSEGV, Segmentation fault.
0x0000007fb6fd76d8 in typeinfo for nvcamerautils::ManagedObject () from /usr/lib/aarch64-linux-gnu/tegra/libnvcamerautils.so
(gdb) backtrace
#0  0x0000007fb6fd76d8 in typeinfo for nvcamerautils::ManagedObject () from /usr/lib/aarch64-linux-gnu/tegra/libnvcamerautils.so
#1  0x0000007fb7f1c880 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvargus.so
#2  0x0000007fb7f1aa38 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvargus.so
#3  0x0000005555644064 in our_function_to_create_capture_sessions (this=0x55557db970) at ...
#4  0x0000005555641f40 in our_init_function (this=0x55557db970) at ....
#5  0x000000555560a304 in a_higher_level_function (argc=5, argv=0x7fffffd8c8) at ...
#6  0x000000555560a7b8 in main (argc=., argv=0x..........) at ...

>>> 5 capture sessions ; 5 cameras

When it hangs:

[CONSUMER CONNECTS TO EGL STREAM #4 AND START PROCESSING DATA]
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function streamControl(), line 1661)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function apply(), line 205)
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function setControlValMultiple(), line 792)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function setDeviceControls(), line 1856)
updateOutputSettings: Set Control failed. Use cached values
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function streamControl(), line 1661)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function apply(), line 205)
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function setControlValMultiple(), line 792)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function setDeviceControls(), line 1856)
updateOutputSettings: Set Control failed. Use cached values
Error: waitCsiFrameStart timeout guid 2
VI Stream Id = 3 Virtual Channel = 0
************VI Debug Registers**********
VI_CSIMUX_STAT_FRAME_12	 = 0x00000000
VI_CSIMUX_FRAME_STATUS_0	 = 0x00000000
VI_CFG_INTERRUPT_STATUS_0	 = 0x3f000000
VI_ISPBUFA_ERROR_0	 = 0x00000000
VI_FMLITE_ERROR_0	 = 0x00000000
VI_NOTIFY_ERROR_0	 = 0x00000000
*****************************************
CSI Stream Id = 3 Brick Id = 1
************CSI Debug Registers**********
CILA_INTR_STATUS_CILA[0x20400]	 = 0x00000000
CILB_INTR_STATUS_CILB[0x20c00]	 = 0x00000000
INTR_STATUS[0x208a4]	 = 0x00000000
INTR_STATUS[0x208a4]	 = 0x00000000
ERR_INTR_STATUS[0x208ac]	 = 0x00000000
ERROR_STATUS2VI_VC0[0x20894]	 = 0x00000000
ERROR_STATUS2VI_VC1[0x20898]	 = 0x00000000
ERROR_STATUS2VI_VC2[0x2089c]	 = 0x00000000
ERROR_STATUS2VI_VC3[0x208a0]	 = 0x00000000
*****************************************
SCF: Error BadValue: timestamp cannot be 0 (in src/services/capture/NvViCsiHw.cpp, function waitCsiFrameStart(), line 624)
SCF: Error BadValue:  (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadValue: Worker thread ViCsiHw frameStart failed (in src/common/Utils.cpp, function workerThread(), line 133)
Error: waitCsiFrameEnd timeout guid 2
VI Stream Id = 3 Virtual Channel = 0
************VI Debug Registers**********
VI_CSIMUX_STAT_FRAME_12	 = 0x00000000
VI_CSIMUX_FRAME_STATUS_0	 = 0x00000000
VI_CFG_INTERRUPT_STATUS_0	 = 0x3f000000
VI_ISPBUFA_ERROR_0	 = 0x00000000
VI_FMLITE_ERROR_0	 = 0x00000000
VI_NOTIFY_ERROR_0	 = 0x00000000
*****************************************
CSI Stream Id = 3 Brick Id = 1
************CSI Debug Registers**********
CILA_INTR_STATUS_CILA[0x20400]	 = 0x00000000
CILB_INTR_STATUS_CILB[0x20c00]	 = 0x00000000
INTR_STATUS[0x208a4]	 = 0x00000000
INTR_STATUS[0x208a4]	 = 0x00000000
ERR_INTR_STATUS[0x208ac]	 = 0x00000000
ERROR_STATUS2VI_VC0[0x20894]	 = 0x00000000
ERROR_STATUS2VI_VC1[0x20898]	 = 0x00000000
ERROR_STATUS2VI_VC2[0x2089c]	 = 0x00000000
ERROR_STATUS2VI_VC3[0x208a0]	 = 0x00000000
*****************************************
SCF: Error BadValue: timestamp cannot be 0 (in src/services/capture/NvViCsiHw.cpp, function waitCsiFrameEnd(), line 711)
SCF: Error BadValue:  (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadValue: Worker thread ViCsiHw frameComplete failed (in src/common/Utils.cpp, function workerThread(), line 133)
Error: waitCsiFrameStart timeout guid 3
VI Stream Id = 4 Virtual Channel = 0
************VI Debug Registers**********
VI_CSIMUX_STAT_FRAME_16	 = 0x00000000
VI_CSIMUX_FRAME_STATUS_0	 = 0x00000010
VI_CFG_INTERRUPT_STATUS_0	 = 0x3f000000
VI_ISPBUFA_ERROR_0	 = 0x00000000
VI_FMLITE_ERROR_0	 = 0x00000000
VI_NOTIFY_ERROR_0	 = 0x00000000
*****************************************
CSI Stream Id = 4 Brick Id = 2
************CSI Debug Registers**********
CILA_INTR_STATUS_CILA[0x30400]	 = 0x00000000
CILB_INTR_STATUS_CILB[0x30c00]	 = 0x00000000
INTR_STATUS[0x300a4]	 = 0x00000000
INTR_STATUS[0x300a4]	 = 0x00000000
ERR_INTR_STATUS[0x300ac]	 = 0x00000000
ERROR_STATUS2VI_VC0[0x30094]	 = 0x00000000
ERROR_STATUS2VI_VC1[0x30098]	 = 0x00000000
ERROR_STATUS2VI_VC2[0x3009c]	 = 0x00000000
ERROR_STATUS2VI_VC3[0x300a0]	 = 0x00000000
*****************************************
SCF: Error BadValue: timestamp cannot be 0 (in src/services/capture/NvViCsiHw.cpp, function waitCsiFrameStart(), line 624)
SCF: Error BadValue:  (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadValue: Worker thread ViCsiHw frameStart failed (in src/common/Utils.cpp, function workerThread(), line 133)
Error: waitCsiFrameEnd timeout guid 3
VI Stream Id = 4 Virtual Channel = 0
************VI Debug Registers**********
VI_CSIMUX_STAT_FRAME_16	 = 0x00000000
VI_CSIMUX_FRAME_STATUS_0	 = 0x00000010
VI_CFG_INTERRUPT_STATUS_0	 = 0x3f000000
VI_ISPBUFA_ERROR_0	 = 0x00000000
VI_FMLITE_ERROR_0	 = 0x00000000
VI_NOTIFY_ERROR_0	 = 0x00000000
*****************************************
CSI Stream Id = 4 Brick Id = 2
************CSI Debug Registers**********
CILA_INTR_STATUS_CILA[0x30400]	 = 0x00000000
CILB_INTR_STATUS_CILB[0x30c00]	 = 0x00000000
INTR_STATUS[0x300a4]	 = 0x00000000
INTR_STATUS[0x300a4]	 = 0x00000000
ERR_INTR_STATUS[0x300ac]	 = 0x00000000
ERROR_STATUS2VI_VC0[0x30094]	 = 0x00000000
ERROR_STATUS2VI_VC1[0x30098]	 = 0x00000000
ERROR_STATUS2VI_VC2[0x3009c]	 = 0x00000000
ERROR_STATUS2VI_VC3[0x300a0]	 = 0x00000000
*****************************************
SCF: Error BadValue: timestamp cannot be 0 (in src/services/capture/NvViCsiHw.cpp, function waitCsiFrameEnd(), line 711)
SCF: Error BadValue:  (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadValue: Worker thread ViCsiHw frameComplete failed (in src/common/Utils.cpp, function workerThread(), line 133)
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function setControlValMultiple(), line 792)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function setDeviceControls(), line 1856)
updateOutputSettings: Set Control failed. Use cached values

[TRY TO STOP CONSUMER]
SCF: Error Timeout:  (propagating from src/services/capture/CaptureServiceEvent.cpp, function wait(), line 59)
Error: Camera HwEvents wait, this may indicate a hardware timeout occured,abort current/incoming cc
[HANGING...]

>>> 6 capture sessions ; 6 cameras

When it hangs:

[CONSUMER CONNECTS TO EGL STREAM #5 AND START PROCESSING DATA]
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function streamControl(), line 1661)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function apply(), line 205)
Error: waitCsiFrameEnd timeout guid 0
VI Stream Id = 5 Virtual Channel = 0
************VI Debug Registers**********
VI_CSIMUX_STAT_FRAME_20	 = 0x00000000
VI_CSIMUX_FRAME_STATUS_0	 = 0x00000010
VI_CFG_INTERRUPT_STATUS_0	 = 0x3f000000
VI_ISPBUFA_ERROR_0	 = 0x00000000
VI_FMLITE_ERROR_0	 = 0x00000000
VI_NOTIFY_ERROR_0	 = 0x00000000
*****************************************
CSI Stream Id = 5 Brick Id = 2
************CSI Debug Registers**********
CILA_INTR_STATUS_CILA[0x30400]	 = 0x00000000
CILB_INTR_STATUS_CILB[0x30c00]	 = 0x00000000
INTR_STATUS[0x308a4]	 = 0x00000000
INTR_STATUS[0x308a4]	 = 0x00000000
ERR_INTR_STATUS[0x308ac]	 = 0x00000000
ERROR_STATUS2VI_VC0[0x30894]	 = 0x00000000
ERROR_STATUS2VI_VC1[0x30898]	 = 0x00000000
ERROR_STATUS2VI_VC2[0x3089c]	 = 0x00000000
ERROR_STATUS2VI_VC3[0x308a0]	 = 0x00000000
*****************************************
SCF: Error BadValue: timestamp cannot be 0 (in src/services/capture/NvViCsiHw.cpp, function waitCsiFrameEnd(), line 711)
SCF: Error BadValue:  (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadValue: Worker thread ViCsiHw frameComplete failed (in src/common/Utils.cpp, function workerThread(), line 133)
Error: waitCsiFrameStart timeout guid 0
VI Stream Id = 5 Virtual Channel = 0
************VI Debug Registers**********
VI_CSIMUX_STAT_FRAME_20	 = 0x00000000
VI_CSIMUX_FRAME_STATUS_0	 = 0x00000000
VI_CFG_INTERRUPT_STATUS_0	 = 0x3f000000
VI_ISPBUFA_ERROR_0	 = 0x00000000
VI_FMLITE_ERROR_0	 = 0x00000000
VI_NOTIFY_ERROR_0	 = 0x00000000
*****************************************
CSI Stream Id = 5 Brick Id = 2
************CSI Debug Registers**********
CILA_INTR_STATUS_CILA[0x30400]	 = 0x00000000
CILB_INTR_STATUS_CILB[0x30c00]	 = 0x00000000
INTR_STATUS[0x308a4]	 = 0x00000000
INTR_STATUS[0x308a4]	 = 0x00000000
ERR_INTR_STATUS[0x308ac]	 = 0x00000000
ERROR_STATUS2VI_VC0[0x30894]	 = 0x00000000
ERROR_STATUS2VI_VC1[0x30898]	 = 0x00000000
ERROR_STATUS2VI_VC2[0x3089c]	 = 0x00000000
ERROR_STATUS2VI_VC3[0x308a0]	 = 0x00000000
*****************************************
SCF: Error BadValue: timestamp cannot be 0 (in src/services/capture/NvViCsiHw.cpp, function waitCsiFrameStart(), line 624)
SCF: Error BadValue:  (propagating from src/common/Utils.cpp, function workerThread(), line 116)
SCF: Error BadValue: Worker thread ViCsiHw frameStart failed (in src/common/Utils.cpp, function workerThread(), line 133)
(NvCamV4l2) Error IoctlFailed:  (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function setControlValMultiple(), line 792)
(NvOdmDevice) Error IoctlFailed:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function setDeviceControls(), line 1856)
updateOutputSettings: Set Control failed. Use cached values

[TRY TO STOP CONSUMER]
SCF: Error Timeout:  (propagating from src/services/capture/CaptureServiceEvent.cpp, function wait(), line 59)
Error: Camera HwEvents wait, this may indicate a hardware timeout occured,abort current/incoming cc
[HANGING...]
  1. Did you run jetson_clocks to boost the clocks?
  2. Did you try a lower resolution, like 1080p, to verify?

Hi ShaneCCC, thank you for answering.

We cannot run jetson_clocks on our TX2i, even when connecting a fan and using sudo; it returns “Can’t access fan!”. I found a thread on this forum (https://devtalk.nvidia.com/default/topic/1061767/jetson-tx2/error-quot-can-t-acces-fan-quot-when-trying-to-run-usr-bin-jetson_clocks-on-tx2-with-jetpack-4-2/post/5376503/#5376503). It would seem that ConnectTech’s BSPs do not support accessing the fan; I have written to their support to find out. They also have issues supporting the NVPModel utility, so we basically cannot control the power/performance settings on our TX2i.

We did: we get exactly the same results (we tried 1080p and 540p).

Notes:

  1. The critical crashes and the Segmentation Fault occur while trying to create the capture sessions, before any resolution is set.
  2. We tried setting the resolution of the output streams (with Argus::IEGLOutputStreamSettings::setResolution()), not the actual resolution of the cameras! Our application checks for available cameras and camera modes during init, and the only mode available is 4K, so we cannot change that.

Hi again,

We tried replacing our TX2i with a TX2 flashed with exactly the same configuration (in order to be able to run jetson_clocks) and ran our application again. We get the same results.

It really seems like it is an Argus issue.

Update:

We managed to set the NVPModel power mode on the TX2i following ConnectTech’s instructions (for those who also encounter this issue, you have to point nvpmodel at the correct conf file, which is not the default one):

sudo nvpmodel -f /etc/nvpmodel/nvpmodel_186.conf -q 
sudo nvpmodel -f /etc/nvpmodel/nvpmodel_186.conf -m #

We got the same results again. Power management is not the cause, neither on the TX2 nor on the TX2i.

We also tried forcing a very slow init with generous sleep() calls, to make sure our problem is not caused by an init running too fast and creating conflicts; again, the same results.

Could you try the sample app argus_camera to launch 6 cameras?


We have confirmation from ConnectTech’s support that the problem is located in their BSP (they could reproduce it on their own platform). I will keep this thread updated with what is found and how to fix it.


Nonetheless:

We tried the argus_camera app, and we noticed something weird worth pointing out:

  1. The application works with 6 cameras from time to time (in more or less the same proportions as our 6-sessions/6-cameras configuration). Does argus_camera use 6 capture sessions as well? (I am currently digging into the argus_camera sample, but it is taking time…)

  2. When running argus_camera, it prints the Argus version, which is 0.97.3 (multi-process), while the same call in our code returns 0.97.3 (single-process)!

How do you start Argus in multi-process mode?
Does it make any difference?

  3. Running argus_camera results in a few /usr/sbin/nvargus-daemon processes being created. Does the app use the daemon instead of the libArgus API directly? Is this related to multi-process mode?

  4. The argus_camera app does not produce any errors/warnings in the terminal, neither when everything is OK nor when problems occur.

Thank you in advance for your answers.

  1. Yes.
  2. During the makefile generation step:
    ‘cmake -DMULTIPROCESS=ON …’
  3. No, the Argus API calls the low-level camera functions via the daemon.
  4. The messages are directed to /var/log/syslog when an error happens.

The MULTIPROCESS option does not seem to exist (“Manually-specified variables were not used by the project”). Do you pass this option when building something against libArgus, or when building libArgus itself? Please provide instructions.

Also, could you please explain what difference it makes to be single-process or multi-process?

Please check tegra_multimedia_api/argus/RELEASE.TXT:

===============================================================================
Argus Camera API Release Notes
===============================================================================

-------------------------------------------------------------------------------
Release 0.97 (08/10/2018)
-------------------------------------------------------------------------------
NEW FEATURES:
* Application-managed Buffer Streams
    - In addition to outputting capture results to an EGLStream, clients now
      have the ability to output capture results directly to client-allocated
      image buffers. Initial support is limited to the use of EGLImages and
      EGLSync objects, and is provided through a number of new types and
      interfaces:

      StreamType
          UUID type that defines the core type of the stream and defines which
          interfaces the stream will support. Applications that continue to use
          EGLStreams will use STREAM_TYPE_EGL (and must provide that as an
          argument to ICaptureSession::createOutputStreamSettings), while the
          new application-managed buffer streams support is provided with the
          use of STREAM_TYPE_BUFFER.

      BufferType
          UUID type that defines the type of an application-managed Buffer
          resource. In this release, the only BufferType that is supported is
          BUFFER_TYPE_EGL_IMAGE, which specifies that Buffer objects will wrap
          EGLImage resources.

      SyncType
          UUID type that defines the sync type of an application-managed Buffer
          resource. The use of sync objects is optional, but they may be used to
          optimize application pipelining by synchronizing Buffer data access
          using hardware sync primitives. This allows Buffers to be output from
          Argus before the capture is fully complete so that the application
          may program its downstream pipeline earlier, reducing CPU threads and
          latency. The only hardware SyncType supported in this release is
          SYNC_TYPE_EGL_SYNC, which specifies that Buffer objects will use
          EGLSync objects for synchronizing data access to the Buffer's image
          store. Note that SYNC_TYPE_NONE is the default SyncType, which means
          that hardware sync will not be used, and so Buffers will only be
          passed between Argus and the client once all buffer access is
          complete (ie. it will CPU wait for any data access before returning).

      IBufferOutputStreamSettings
          This is used to configure Buffer OutputStream creation, and is exposed
          by OutputStreamSettings objects created using STREAM_TYPE_BUFFER.
          This interface provides methods to set the BufferType and SyncType
          that will be used with the new stream.

      IBufferOutputStream
          This is the primary interface for application-managed buffer
          OutputStreams (STREAM_TYPE_BUFFER). It provides methods for
          configuring and creating Buffer objects, as well as the acquire and
          release and synchronization mechanisms required to pass those Buffers
          between Argus and the client application.

      BufferSettings
          Provided by IBufferOutputStream, this object is used to configure the
          creation of Buffer objects. The interfaces exposed by this object
          depends on the BufferType of the OutputStream (which is currently
          limited to BUFFER_TYPE_EGL_IMAGE and IEGLImageBufferSettings).

      IEGLImageBufferSettings
          This is used to set the EGLDisplay and EGLImage handles of the EGLImage
          that will be wrapped by the new Buffer object.

      Buffer
          Created and owned by an OutputStream, these objects wrap application-
          managed buffer resources and are used to synchronize data access to
          the buffer resources between Argus and the client application. Buffers
          are released to their stream to be used by Argus for a capture
          request, and are acquired back when the capture is complete and the
          client is ready to consume the output.

      IBuffer
          This provides the core features available for all BufferTypes,
          including the ability to associate client pointers with a Buffer using
          set/getClientData.

      IEGLImageBuffer
          Provides getters for the EGLImage resources that are being wrapped by
          a Buffer having the BUFFER_TYPE_EGL_IMAGE type.

      IEGLSync
          When the SyncType is SYNC_TYPE_EGL_SYNC, this interface provides the
          means to get or set EGLSync objects on the Buffer after acquiring the
          Buffer or before releasing it for use with another capture request,
          respectively.

      Note that the use of application-managed buffers, including all of the
      objects and interfaces above, are demonstrated by the new eglImage sample.

* Ext::DolWdrSensorMode extension
    - Adds extra functionalities for Digital Overlap (DOL) Wide Dynamic Range
      (WDR) sensor modes.

* EGLStream::IImageHeaderlessFile interface
    - Provides a means to write acquired EGLStream::Images to headerless files
      (ie. raw data dump).

CHANGES:
* StreamType parameter added to ICaptureSession:createOutputStreamSettings, and
  IStream renamed to IEGLOutputStream
    - Existing applications that use EGLStreams must use the new interface
      name and pass STREAM_TYPE_EGL to createOutputStreamSettings to be
      compatible with this release.

* All VideoStabilization capabilities were removed.
    - This feature was never fully supported, and the methods given to control
      video stabilization were likely insufficient to support a proper
      implementation in the future anyways. Thus, all video stabilization-
      related components are being removed from the core API in favor of a
      future extension (if video stabilization were to ever be added again).

* Ext::FaceDetect extension support and sample were removed.
    - NVIDIA does not offer face detection capabilities in this Argus release,
      and so the face detection sample is being removed so as to not be
      misleading to users (ie. providing a sample that does not actually work).
      Since it's possible that face detection may be added again in another
      release, the Ext::FaceDetect extension header has been left in place.

* Added optional Rotation parameter to EGLStream::NV::ImageNativeBuffer.

* Sample changes:
    - CommonOptions added to sample utils to provide a more consistent set of
      command-line parameters for all of the samples (including camera device,
      sensor mode index, preview window rectangle, etc).
    - NativeBuffer class added to utils to allocate native buffer objects that
      may be used with Buffer OutputStreams. This NativeBuffer class wraps and
      uses the NvBuffer API provided by the JetPack Multimedia APIs, and uses
      the methods that API provides to create EGLImages from the NvBuffers for
      use with Argus.

* New samples:
    eglImage - Demonstrates the use of the new application-managed buffer
               support by creating a Buffer OutputStream from a set of EGLImages
               given by the application and then making capture requests with
               those Buffers. The results are then displayed by binding the
               EGLImage to an OpenGL texture and then rendering on screen.
               EGLSyncs are also used to increase the hardware pipelining
               between Argus and OpenGL.

    cudaBayerDemosaic - Demonstrates a pipelined application that chains two
                        EGLStreams -- one between Argus and CUDA, and another
                        between CUDA and OpenGL -- in order to consume a Bayer
                        EGLStream with CUDA and (crudely) demosaic into another
                        RGBA stream which is then rendered on screen by OpenGL.

BUG FIXES:
* N/A

KNOWN ISSUES:
* N/A

ADDITIONAL NOTES:
* N/A

-------------------------------------------------------------------------------
Release 0.96 (10/05/2016)
-------------------------------------------------------------------------------
NEW FEATURES:
* Ext::BayerAverageMap - Provides pixel averages and clipping statistics.
* Edge enhancement controls and metadata.
* Saturation controls and metadata.
* Improved denoise controls and metadata.
* Improved white balance controls.
* Per-stream post-processing enable.

* New samples:
    userAutoExposure     - Demonstrates manual exposure time and analog gain
                           controls using a basic auto-exposure algorithm.
    userAutoWhiteBalance - Demonstrates manual white balance controls.
    denoise              - Uses two streams to render a side-by-side comparison
                           of the effect of enabling denoise algorithms.
    bayerAverageMap      - Uses the BayerAverageMap extension to visualize the
                           average color and clipping statistics.

CHANGES:
* The multiprocess implementation is now used by default.
* Various IStreamSettings controls moved into IRequest.
* BayerTuple class replaces generic float[4] usage.
* BayerSharpnessMap::getBinData returns the entire sharpness map instead of
  requiring per-bin/per-channel calls.
* ICaptureSession::stopRepeat() returns the range of capture IDs that were
  submitted by the repeat[Burst]() call being stopped.

BUG FIXES:
* Gain ranges/values corrected.
* Many stability and multiprocess fixes.

KNOWN ISSUES:
* N/A

ADDITIONAL NOTES:
* Various debug information and all Argus API errors will be output by the
  nvargus-daemon service, which writes to syslog by default. The following may
  be used to monitor this logging:
    $ tail -f /var/log/syslog | grep argus
* If an application crash/hang occurs, the nvargus-daemon service may be left in
  a bad state, and the hardware may be unavailable for a short time afterwards.
  When this occurs it is best to restart the nvargus-daemon service and wait for
  about 15 seconds before attempting to run another application:
    sudo service nvargus-daemon restart

-------------------------------------------------------------------------------
Release 0.95 (07/25/2016)
-------------------------------------------------------------------------------
NEW FEATURES:
* Multi-process support. This adds an nvargus-daemon system service and
  corresponding client-side library that applications link against in order to
  enable multi-process support. Besides changing the Argus library being linked
  against, no other application changes are required for multi-process support.

  To link applications for multi-process use, replace the libnvargus.so library
  name with libnvargus_socketclient.so. Alternatively, if using the CMake build
  system with the sample applications, provide the optional MULTIPROCESS flag
  during the makefile generation step:
      'cmake -DMULTIPROCESS=ON ..'

  Multiprocess applications require the nvargus-daemon service to be running.
  This service runs automatically at boot time, though it remains uninitialized
  until the first multi-process Argus application is launched.  Once the
  service is initialized the daemon will lock the camera resources on the
  device and single-process Argus applications will fail to operate.
  The nvargus-daemon service can be controlled using the following:
      'sudo service nvargus-daemon [stop|start|restart]'
  Error logging from the daemon will be output to the syslog (/var/log/syslog)

* Feature-rich camera application sample with full GUI support.
  (Previously 'samples/camera', moved to 'apps/camera'.)

* New samples:
    gstVideoEncode - Encodes video through a GStreamer consumer.
    multiStream    - Uses two OutputStreams for simultaneous still captures and
                     preview from a single device.
    multiSensor    - Opens two devices for simultaneous still captures from one
                     device and preview from the other.
    oneShot        - "Bare minimum" Argus app; takes a single snapshot.
    syncSensor     - Computes the stereo disparity between two synchronized
                     sensors.

CHANGES:
* EGLStream creation is now handled directly by Argus during OutputStream
  creation instead of requiring applications to first create an EGLStream
  before connecting Argus.

* Argus::EGLStream components (FrameConsumer, Frame, Image, etc.) moved out of
  Argus to become its own API. It continues to use Argus types, and includes
  functionality that interacts directly with Argus (e.g. Argus metadata support
  and FrameConsumer creation from an OutputStream), but is otherwise independent
  from Argus and can be used without an open CameraProvider. This API is still
  contained in libnvargus.so, but is expected to be moved to its own library in
  a future release.

BUG FIXES:
* Face detection and sample rendering orientation fixed.
* All sensor modes are exposed (previously limited to one mode).

KNOWN ISSUES:
* Histogram stats only returning first 64 of 256 bins.
* Instability with multi-sensor boards (e.g. E3323).

-------------------------------------------------------------------------------
Release 0.91 (03/10/2016)
-------------------------------------------------------------------------------
NEW FEATURES:
* Initial Argus release.

NOTES:
* As this is a beta release, the interfaces are not guaranteed to be
immutable; they may not be compatible with interfaces of the same name
from either previous or future releases.  Interface changes since the
last release are listed in the "CHANGES" section below.

* Current EGLStream Buffer Format Support:

    Format                     State
    =======================    ==========
    PIXEL_FMT_Y8               Not Supported
    PIXEL_FMT_Y16              Not Supported
    PIXEL_FMT_YCbCr_420_888    Works with all consumers.
    PIXEL_FMT_YCbCr_422_888    Not Supported
    PIXEL_FMT_YCbCr_444_888    Not Supported
    PIXEL_FMT_JPEG_BLOB        Not Supported (TODO: Remove format)
    PIXEL_FMT_RAW16            Not Supported

CHANGES:
* N/A

BUG FIXES:
* N/A

KNOWN ISSUES:
* The face detection algorithm is not detecting the device orientation
  correctly, and thus the input to the detection algorithm may be inverted
  with respect to the sensor. If face detection does not appear to be working,
  rotating the sensor 180 degrees may help.

Thank you for pointing that out. It is at lines 215-216: indeed, I have to link against libnvargus_socketclient.so instead of libnvargus.so.

I am testing it right away and will publish results.

Using the multi-process version of Argus leads to following results:

  • capturing from more than 3 cameras with only 1 capture session still does not work (same behavior as before)
  • capturing from more than 3 cameras with 1 capture session per camera does not produce the same errors as described before. However, we only managed to get data when encoding JPEGs, and it leads to other problems. I opened a new thread : https://devtalk.nvidia.com/default/topic/1066614/jetson-tx2/argus-tegra-multimedia-api-in-multi-process-mode-does-not-terminate-quot-nvargus-daemon-quot-process-after-exiting-application/

I also just found that encoding from more than 3 cameras (trying 6 cameras at 4K, 5 fps) does not work either. I will open a new thread when I have more details…

EDIT: In fact, using the multi-session mode seems to result in the same behaviors; the errors are just not logged the same way.

Note: To develop our application, notably the producer/consumer part, we mostly modeled it on the sample tegra_multimedia_api/samples/09_camera_jpeg_capture. The objects used and the order of the calls are the same as in that sample.
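For reference, the one-capture-session-per-camera setup we are describing boils down to roughly the following. This is a minimal sketch against the public Argus headers, not our actual code: the resolution, pixel format, and omitted error handling/teardown are assumptions, and it only builds and runs on a Jetson with libnvargus installed.

```cpp
#include <Argus/Argus.h>
#include <cstdio>
#include <vector>

using namespace Argus;

int main()
{
    // One CameraProvider for the whole process.
    CameraProvider *provider = CameraProvider::create();
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);
    if (!iProvider)
        return 1;

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);
    printf("Found %zu camera(s)\n", devices.size());

    for (CameraDevice *device : devices)
    {
        // One capture session per camera (the configuration discussed above).
        CaptureSession *session = iProvider->createCaptureSession(device);
        ICaptureSession *iSession = interface_cast<ICaptureSession>(session);
        if (!iSession)
            continue;

        // EGL output stream; format and resolution are placeholder values.
        OutputStreamSettings *settings =
            iSession->createOutputStreamSettings(STREAM_TYPE_EGL);
        IEGLOutputStreamSettings *iSettings =
            interface_cast<IEGLOutputStreamSettings>(settings);
        iSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
        iSettings->setResolution(Size2D<uint32_t>(3840, 2160));

        OutputStream *stream = iSession->createOutputStream(settings);

        // Repeating capture request feeding the stream; a consumer (e.g. an
        // EGLStream::FrameConsumer on another thread) would read the frames.
        Request *request = iSession->createRequest();
        IRequest *iRequest = interface_cast<IRequest>(request);
        iRequest->enableOutputStream(stream);
        iSession->repeat(request);
    }

    // ... consume frames; real code should stopRepeat()/waitForIdle() on each
    // session and destroy() the Argus objects (or wrap them in Argus::UniqueObj)
    // before tearing down the provider.
    return 0;
}
```

With three or fewer cameras this loop behaves deterministically for us; it is with four to six sessions that the varying failures described above appear.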

Hi @videlo, did you ever get the Argus library to work properly? I am trying to acquire frames from 6 cameras and I get different results each time I run the same source code. I also based my code on the samples 09_camera_jpeg_capture and 13_multi_camera. Thanks!!