Using multiple cameras with nvarguscamerasrc hangs on the Jetson Nano.
Do the logs go somewhere? The closest thing to a recommendation on other threads was to monitor CPU usage of the nvargus process and reduce the resolution of the cameras, but that is not really satisfactory.
The command we are using is:
this.cap = new cv.VideoCapture(
  "nvarguscamerasrc wbmode=5 ! " +
  "video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12 ! " +
  "nvvidconv flip_method=2 ! " +
  "video/x-raw, format=BGRx ! " +
  "appsink max-buffers=1 drop=True"
);
This command works, but when I add sensor-id=0 or sensor-id=1 to the pipeline, the command hangs.
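For reference, here is a minimal sketch of how we parameterize the pipeline string per sensor (the helper name pipelineForSensor is my own, not an opencv4nodejs API; the caps are copied from the pipeline above):

```javascript
// Build an nvarguscamerasrc pipeline string for a given sensor.
// pipelineForSensor is an illustrative helper; the resulting string is
// what would be passed to new cv.VideoCapture(...).
function pipelineForSensor(sensorId) {
  return (
    `nvarguscamerasrc sensor-id=${sensorId} wbmode=5 ! ` +
    "video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12 ! " +
    "nvvidconv flip_method=2 ! " +
    "video/x-raw, format=BGRx ! " +
    "appsink max-buffers=1 drop=True"
  );
}

console.log(pipelineForSensor(0));
```

With sensor-id included this way, the only difference between the two cameras is the one property, which makes it easy to compare the working and hanging cases.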
We are using openCvForNodeJs and running:
jetson-210_linux_r32.5.2_aarch64.tbz2
gstreamer1.0-tools gstreamer1.0-alsa
gstreamer1.0-plugins-base gstreamer1.0-plugins-good
gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly
gstreamer1.0-libav
OpenCV 4.5.4
Is there any other information that would be helpful for debugging?
Thanks for your help, Jerry.
Running the commands output a couple of errors:
NvPclHwScanExternalCameras: Failed to stat '/dev//log'; No such file or directory
NvPclHwScanExternalCameras: Failed to stat '/dev//initctl'; No such file or directory
NvPclHwScanExternalCameras: Failed to stat '/dev//log'; No such file or directory
NvPclHwScanExternalCameras: Failed to stat '/dev//initctl'; No such file or directory
NvPclHwScanExternalCameras: Failed to stat '/dev//log'; No such file or directory
NvPclHwScanExternalCameras: Failed to stat '/dev//initctl'; No such file or directory
I guess these are pseudo devices that map to /run/systemd:
mercury@mercury:~$ ls -lh /dev//initctl
lrwxrwxrwx 1 root root 25 Mar 2 2023 /dev//initctl -> /run/systemd/initctl/fifo
mercury@mercury:~$ ls -lh /dev//log
lrwxrwxrwx 1 root root 28 Mar 2 2023 /dev//log -> /run/systemd/journal/dev-log
So I added them as bind mounts in the container.
We also got this error:
ARGUS_ERROR: Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute: 787 Frame Rate specified is greater than supported
Reducing the frame rate to 20 fixed it: all three commands (with sensor-id=0, with sensor-id=1, and without specifying sensor-id) began to work.
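For anyone hitting the same ARGUS "Frame Rate specified is greater than supported" error: the fix for us was stating the framerate explicitly in the NVMM caps. A sketch of the caps string (nvmmCaps is my own helper name; 20/1 is what worked for our sensor mode and may differ for yours):

```javascript
// Caps string with an explicit framerate. Without it, nvarguscamerasrc
// may default to a rate the selected sensor mode cannot deliver, which
// is what caused the hang for us.
function nvmmCaps(fps) {
  return (
    "video/x-raw(memory:NVMM), width=1920, height=1080, " +
    `format=NV12, framerate=${fps}/1`
  );
}

console.log(nvmmCaps(20));
```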
There is still one error (I don't know if it is significant):
NvIspAfConfigParamsSanityCheck: Error: positionWorkingHigh is not larger than positionWorkingLow positionWorkingHigh = 0, positionWorkingLow = 0
But as of now our two cameras are working and we can switch between them. I suspect it will be a bit of a lift to get them streaming in parallel.
I will look into upgrading Tegra as time permits (it never permits).
We are now successfully streaming two cameras sequentially. We would need more application code to support two cameras simultaneously, but I think on the NVIDIA side of things, as long as we have two sinks, it should just work as expected.
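In case it helps anyone later, the simultaneous case should roughly amount to two independent captures, one per sensor-id, each ending in its own appsink. A sketch only (untested on our side; buildPipeline is an illustrative helper, and the framerate is the one that worked for our sensor mode):

```javascript
// Sketch: one pipeline string per sensor, each with its own appsink.
// buildPipeline is an illustrative helper, not an opencv4nodejs API.
function buildPipeline(sensorId) {
  return (
    `nvarguscamerasrc sensor-id=${sensorId} wbmode=5 ! ` +
    "video/x-raw(memory:NVMM), width=1920, height=1080, " +
    "format=NV12, framerate=20/1 ! " +
    "nvvidconv flip_method=2 ! " +
    "video/x-raw, format=BGRx ! " +
    "appsink max-buffers=1 drop=True"
  );
}

const pipelines = [0, 1].map(buildPipeline);
// In the application this would become something like:
//   const captures = pipelines.map(p => new cv.VideoCapture(p));
console.log(pipelines);
```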
And thanks for the tip on the positionWorkingHigh log line. It was irking me, so knowing that it is related to the auto-focuser is helpful.
Issue: we were not specifying sensor-id, sensor-mode, or framerate. Solution: investigating with gst-launch-1.0 revealed that the default framerate was too high for the default sensor mode; setting an explicit, lower framerate resolved the hang.