I am trying to read a camera using the Argus library and C++ code similar to the 13_argus_multi_camera sample. I build the code with a CMakeLists.txt.
When I run the code, I get this error during initialization:
CAM: serial no file already exists, skips storing again
---- imager: Found override file [/var/nvidia/nvcam/settings/camera_overrides.isp]. ----
(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function findDevice(), line 256)
(NvCamV4l2) Error ModuleNotPresent: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 60)
(NvOdmDevice) Error ModuleNotPresent: (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 107)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 2. (error 0xA000E)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 677)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 453)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 333)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function getSource(), line 505)
Argus Version: 0.98.3 (single-process)
(Argus) Error BadParameter: Unknown device specified (in src/api/CameraProviderImpl.cpp, function createCaptureSessionInternal(), line 254)
Error generated. Failed to create CaptureSession
Error generated. Failed to initialize Camera session 0
I have no idea what these mean. When I link the nvargus_socketclient lib in my CMakeLists instead of nvargus, all of the above warnings and errors vanish, but I get a segmentation fault exactly at this call:
/* Create the capture session using the first device and get the core interface */
CaptureSession* cap_session = iCameraProvider->createCaptureSession(device);
I have no clue what is wrong with Argus.
My working system is Jetson Xavier NX with Jetpack 5.1.3.
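(Aside: when createCaptureSession fails like this, a stale camera daemon is a cheap thing to rule out first. These are generic L4T diagnostics, not specific to this board; they assume the standard nvargus-daemon service name on JetPack 5.x.)

```shell
# A hung or stale nvargus-daemon can cause session-creation failures;
# restarting it is a cheap first check.
sudo systemctl restart nvargus-daemon
sudo systemctl status nvargus-daemon --no-pager

# Verify the kernel actually registered the sensor as a video node:
ls /dev/video*
```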
Hello Jerry,
I ran the command and it saved an image, but I cannot open it (unknown format).
However, I ran this gst command and saved a meaningful image, which confirms the camera is working:
v4l2 fetches the raw camera data directly; you may use third-party tools, such as 7yuv, to view the content.
Could you please further check its preview stream with the following? $ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),framerate=30/1,format=NV12' ! nvvidconv ! xvimagesink
Or you may add a video converter for a smaller display resolution. $ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),framerate=30/1,format=NV12' ! nvvidconv ! 'video/x-raw, format=(string)I420, width=640, height=480' ! queue ! xvimagesink -e
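(For reference, the raw v4l2 capture mentioned above would typically look something like this; the device node and pixel format are assumptions, so check the sensor's actual formats first:)

```shell
# List the modes the sensor driver exposes:
v4l2-ctl -d /dev/video0 --list-formats-ext

# Grab one raw frame straight from the sensor, bypassing the ISP.
# RG10 is a guess for a 10-bit Bayer sensor; use a format from the list above.
v4l2-ctl -d /dev/video0 \
  --set-fmt-video=width=4480,height=4504,pixelformat=RG10 \
  --stream-mmap --stream-count=1 --stream-to=frame.raw
```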
$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),framerate=50/1,format=NV12' ! nvvidconv
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 4480 x 4504 FR = 50.000000 fps Duration = 20000000 ; Analog Gain range min 1.000000, max 27.000000; Exposure Range min 100000, max 500000000;
GST_ARGUS: 4480 x 4504 FR = 43.000000 fps Duration = 23255814 ; Analog Gain range min 1.000000, max 27.000000; Exposure Range min 100000, max 500000000;
GST_ARGUS: 2240 x 2252 FR = 180.000018 fps Duration = 5555555 ; Analog Gain range min 1.000000, max 27.000000; Exposure Range min 100000, max 500000000;
GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 2
Output Stream W = 2240 H = 2252
seconds to Run = 0
Frame Rate = 180.000018
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0:
streaming stopped, reason not-linked (-1)
Execution ended after 0:00:01.188459709
Setting pipeline to NULL ...
GST_ARGUS: Cleaning up
CONSUMER: Done Success
GST_ARGUS: Done Success
Freeing pipeline ...
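(Note: the `not-linked` failure in that log is expected — the pipeline as pasted ends at nvvidconv with no downstream element, so the converter's source pad has nowhere to push buffers. Terminating the pipeline with a sink avoids it, and fakesink works with no display attached:)

```shell
# Same pipeline, but properly terminated; fakesink discards the frames,
# so this runs headless. num-buffers makes it exit after 300 frames.
gst-launch-1.0 nvarguscamerasrc num-buffers=300 ! \
  'video/x-raw(memory:NVMM),framerate=50/1,format=NV12' ! \
  nvvidconv ! fakesink
```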
If I add xvimagesink I get an Xv output error, which I think is because I don’t have a display (I am working with a remote board).
One new point: I set up another board and built up the environment with an older Jetpack, 5.1-b147, and L4T core package 35.2.1. That one works fine.
I even copied the binary from that board to this faulty board, and it worked as well.
Whatever Argus code I build on this board (Jetpack 5.1.3) has the seg-fault issue.
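(A quick way to compare the two boards' BSP versions side by side — these are generic L4T commands, nothing specific to this setup:)

```shell
# L4T release string baked into the rootfs:
cat /etc/nv_tegra_release

# Installed L4T Debian packages, including the multimedia API:
dpkg -l | grep nvidia-l4t
```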
All right,
you may try the following to disable the preview and show only the frame rate.
For instance: $ gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! 'video/x-raw(memory:NVMM),width=4480, height=4504, framerate=50/1, format=NV12' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v
Is it some header file mismatch?
How did you obtain the code, and what are your complete steps to build it?
Thanks for narrowing the issue down; it seems you’re able to fetch the stream via nvarguscamerasrc correctly.
So, back to your original question:
your failure might be due to the Argus app relying on an EGL-based implementation to render the display images.
Is it possible to set up a display monitor for verification?
At the moment I have no access to a display, but I will check later.
However, as I mentioned, the binary built on the other, similar board works on this board too (without a display, of course).
Is there some setting in the EGL code to make it completely independent of the display?
Since the camera stream works with the gst pipeline, have you also tested the Argus sample applications to verify the functionality?
If all of the above works, it would be an implementation bug in your code.
I tried to build the multimedia sample (13_argus_multi_camera) and I encountered these errors:
undefined reference to NvBufSurfaceGetMapParams
undefined reference to NvBufSurfaceImport
I already linked against nvbufsurface, but later I realized that nvbufsurface.h is different in Jetpack 5.1.3, so libnvbufsurface.so is different too.
The problem is that in the Jetpack 5.1.3 API, nvbufsurface.h declares NvBufSurfaceGetMapParams, but ${TEGRA_DIR}/libnvbufsurface.so doesn’t export this symbol — even though I installed the whole API at once, from a single Jetpack (5.1.3).
Might my other seg-fault error be related to this?
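(Whether the library actually exports the symbol the header declares can be checked directly; the path below is the one assumed in this thread:)

```shell
# List the dynamic symbols of the Tegra nvbufsurface library and look
# for the entry point the linker complains about:
nm -D /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so | grep NvBufSurface

# Alternatively:
objdump -T /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so | grep NvBufSurfaceGetMapParams
```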
FYI,
the same application (13_argus_multi_camera) works normally on the developer kit with the JP-5.1.3 release version.
I suggest you re-download the MMAPI sources and rebuild the code to confirm.
There was a mismatch between the MMAPI version and the L4T core package (I renewed the API accordingly from this link: Jetson Linux 35.2.1 | NVIDIA Developer).
I had /usr/include/libdrm/drm.h linked into my project, which I have now changed to /usr/include/drm/drm.h.
I had also made the whole /usr/lib/aarch64-linux-gnu/tegra directory accessible to my project (using the link_directories command in CMake); I removed that and added the full paths to the *.so files I use in the project.
I don’t know which of these was the exact root of the issue, but it is now resolved.
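(After switching from link_directories to full .so paths, it's worth confirming which libraries the binary actually resolves at runtime; the binary name below is a placeholder:)

```shell
# Confirm the runtime loader picks the same libnvbufsurface.so / Argus
# libraries the binary was linked against ("./my_argus_app" is hypothetical):
ldd ./my_argus_app | grep -E 'nvbufsurface|nvargus'
```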