Jetson AGX Orin nvargus_nvraw doesn't show sensors

Hello,
I am developing a camera driver and ran into one more problem. I am finally able to capture a raw image with the v4l2-ctl utility, but when I try to check the available sensors with nvargus_nvraw --lps it doesn’t show anything.
Here are the nvargus-daemon logs I got:

=== NVIDIA Libargus Camera Service (0.99.33)===
=== Listening for connections...
=== nvargus_nvraw[3569]: Connection established (FFFF9611B840)
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
(NvOdmDevice) Error BadParameter:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initializeV4L2Items(), line 352)
(NvOdmDevice) Error BadParameter:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 121)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 0. (error 0x4)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 725)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 453)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 347)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function getSource(), line 519)
=== nvargus_nvraw[3569]: CameraProvider initialized (0xffff90a0b630)
=== nvargus_nvraw[3569]: CameraProvider destroyed (0xffff90a0b630)
=== nvargus_nvraw[3569]: Connection closed (FFFF9611B840)
=== nvargus_nvraw[3569]: Connection cleaned up (FFFF9611B840)

Taking other forum threads into account, I assume there is an issue in my DTS overlay file, but I couldn’t find anything wrong there. So here are my DTS overlay file in the attachments and a device tree dump which I got with dtc -I fs -O dts /proc/device-tree/
tegra234-p3737-camera-dual-hawk-ar0234-e3653-overlay.dts.log (6.8 KB)
dts_dump.dts.log (324.2 KB)

hello nikita.pichugin,

your device tree settings look okay.

may I know which JetPack/L4T release version you’re working with?
are you able to fetch the stream via a gst pipeline with the nvarguscamerasrc plugin?
for instance,
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v

Hi,
I am working with L4T 36.3, which I built from sources, and a base rootfs I generated with ./nv_build_samplefs.sh --abi aarch64 --distro ubuntu --flavor desktop --version jammy. I use the flash.sh script to flash my Jetson.

I also have some problems with nvarguscamerasrc: I get WARNING: erroneous pipeline: no element "nvarguscamerasrc" when I try to run the command you gave.

I probably need to do something to make nvarguscamerasrc appear in my Linux image, but I am not sure what exactly.
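For reference, here is the kind of quick check I can run to tell whether the element is merely unregistered or the plugin library is missing entirely (the library path below is the usual aarch64 L4T location and is an assumption; it may differ on a custom image):

```shell
# Does GStreamer know the element at all?
gst-inspect-1.0 nvarguscamerasrc >/dev/null 2>&1 \
  && echo "nvarguscamerasrc: registered" \
  || echo "nvarguscamerasrc: missing"

# Is the plugin library on disk? (assumed stock L4T path)
ls -l /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvarguscamerasrc.so 2>/dev/null \
  || echo "plugin library not found"
```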

hello nikita.pichugin,

there’s an additional MMAPI package; you should run the commands below to install the necessary packages.
for instance,
$ sudo apt-get install nvidia-l4t-gstreamer
$ sudo apt install nvidia-l4t-jetson-multimedia-api

Hi,
Here are my logs after running the command:

$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:792 No cameras available
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
Redistribute latency...
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
Got EOS from element "pipeline0".
Execution ended after 0:00:00.007528404
Setting pipeline to NULL ...
Freeing pipeline ...

And nvargus-daemon logs:

=== NVIDIA Libargus Camera Service (0.99.33)===
=== Listening for connections...
=== gst-launch-1.0[2768]: Connection established (FFFFB2D9B840)
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
(NvOdmDevice) Error BadParameter:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initializeV4L2Items(), line 352)
(NvOdmDevice) Error BadParameter:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 121)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 0. (error 0x4)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 725)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 453)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 347)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function getSource(), line 519)
=== gst-launch-1.0[2768]: CameraProvider initialized (0xffffac9f0280)
=== gst-launch-1.0[2768]: CameraProvider destroyed (0xffffac9f0280)
=== gst-launch-1.0[2768]: Connection closed (FFFFB2D9B840)
=== gst-launch-1.0[2768]: Connection cleaned up (FFFFB2D9B840)

And here is also the output of media-ctl:

sudo media-ctl -p -d /dev/media0
Media controller API version 5.15.136

Media device information
------------------------
driver          tegra-camrtc-ca
model           NVIDIA Tegra Video Input Device
serial
bus info
hw revision     0x3
driver version  5.15.136

Device topology
- entity 1: 13e00000.host1x:nvcsi@15a00000- (2 pads, 2 links)
            type V4L2 subdev subtype Unknown flags 0
            device node name /dev/v4l-subdev0
        pad0: Sink
                <- "ar0234 8-0012":0 [ENABLED]
        pad1: Source
                -> "vi-output, ar0234 8-0012":0 [ENABLED]

- entity 4: ar0234 8-0012 (1 pad, 1 link)
            type V4L2 subdev subtype Sensor flags 0
            device node name /dev/v4l-subdev1
        pad0: Source
                [fmt:SRGGB12_1X12/1920x1080 field:none colorspace:srgb]
                -> "13e00000.host1x:nvcsi@15a00000-":0 [ENABLED]

- entity 6: vi-output, ar0234 8-0012 (1 pad, 1 link)
            type Node subtype V4L flags 0
            device node name /dev/video0
        pad0: Sink
                <- "13e00000.host1x:nvcsi@15a00000-":1 [ENABLED]

hello nikita.pichugin,

here’s the failure according to the logs.

this is usually a device tree issue; please examine your sensor device tree, especially the settings within the Property-Value Pairs section.
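for a quick sanity check, you may dump a few of the properties the camera service parses from the flashed tree. a minimal sketch (the property names badge/position/orientation follow the sensor driver programming guide; adjust the module path to match your overlay):

```shell
# Spot-check tegra-camera-platform properties in the live device tree.
# Device-tree string values are NUL-terminated, so strip the NULs for printing.
BASE=/proc/device-tree/tegra-camera-platform/modules/module0
for p in badge position orientation; do
  printf '%s = ' "$p"
  if [ -f "$BASE/$p" ]; then tr -d '\0' < "$BASE/$p"; else printf '(missing)'; fi
  echo
done
```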

Hi,
Yes, that is why I initially gave you my DTS files; unfortunately, I have already checked everything I know and do not see any problems there.

please give it a quick try with the commands below to boost all the VI/CSI/ISP clocks.

sudo su
echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/emc/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate |tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate
cat /sys/kernel/debug/bpmp/debug/clk/emc/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/emc/rate

Here are my logs:

root@nvidia:/home/nvidia# echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
root@nvidia:/home/nvidia# echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
root@nvidia:/home/nvidia# echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
root@nvidia:/home/nvidia# echo 1 > /sys/kernel/debug/bpmp/debug/clk/emc/mrq_rate_locked
root@nvidia:/home/nvidia# cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate |tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
832000000
root@nvidia:/home/nvidia# cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
1011200000
root@nvidia:/home/nvidia# cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate
642900000
root@nvidia:/home/nvidia# cat /sys/kernel/debug/bpmp/debug/clk/emc/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/emc/rate
3199000000
nvidia@nvidia:~$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:792 No cameras available
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
Redistribute latency...
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
Got EOS from element "pipeline0".
Execution ended after 0:00:00.008344733
Setting pipeline to NULL ...
Freeing pipeline ...

hello nikita.pichugin,

may I know what your v4l2-ctl logs show?
for instance, are you able to run the pipeline below correctly?
$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG12 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=100

Hello,
Yes, as I previously said, my v4l2-ctl pipeline works great and captures frames. There are no significant logs there, only dmesg prints from my driver.

nvidia@nvidia:~$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG12 --stream-mmap --stream-count=100 --stream-to=out1.raw
<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.77 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.77 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.77 fps
<<<<<<<<<<<<<<<
[   31.882979] ar0234 8-0010: ar0234_power_on: power on
[   31.918870] ar0234 8-0010: set mode
[   31.918883] ar0234 8-0010: start stream
[   35.549375] ar0234 8-0010: stop stream
[   35.557046] ar0234 8-0010: ar0234_power_off:

UPD: I also checked it with --set-ctrl bypass_mode=0 and got the same results:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG12 --stream-mmap --stream-count=100 --set-ctrl bypass_mode=0 --stream-to=out1.raw
<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.77 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.77 fps
<<<<<<<<<<<<<<<<<<<<<<<<<<<< 27.77 fps
<<<<<<<<<<<<<<<

hello nikita.pichugin,

please also see the reference driver; it reads the sync_sensor property to fill the EEPROM with its sync sensor index accordingly, i.e. r35.5.0/Linux_for_Tegra/source/public/kernel/nvidia/drivers/media/i2c/nv_hawk_owl.c

there appears to be an initialization failure: the number of supported sensor entries is 0.
FYI, this sync_sensor property triggers the stereo EEPROM CID controls; please see the reference driver, nv_hawk_owl.c, and
check whether your driver implementation fills in the calibration data correctly.

Hello,
Hm, sounds interesting. I work with a single non-stereo camera and do not have any EEPROM needs, so I got rid of the EEPROM CID controls, and I also do not have the sync_sensor property.
What exactly should I do to make nvargus work?

Did I understand it right that nvargus calls the EEPROM CID control and expects meaningful data in:

priv->EepromCalib.cam_intr
priv->EepromCalib.cam_extr 
priv->EepromCalib.imu_present 
priv->EepromCalib.imu.imu_data_v1
priv->EepromCalib.imu.nm
priv->EepromCalib.serial_number
priv->EepromCalib.rls

hello nikita.pichugin,

as you can see, there’s a special device tree property, i.e. sync_sensor = "HAWK1";
it makes the low-level driver run the CID control below to fill in the stereo data,

static int ar0234_fill_eeprom(...)
{
...
        switch (ctrl->id) {
                case TEGRA_CAMERA_CID_STEREO_EEPROM:

could you please try removing that property, since you’re working with a single non-stereo camera?
thanks
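after reflashing, you may also verify the property is actually gone from the live tree, for instance:

```shell
# Search the booted device tree for any leftover stereo-sync properties.
# Prints nothing if the removal took effect.
find /proc/device-tree -name 'sync_sensor*' 2>/dev/null
```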

Hi,
I tried removing the sync_sensor prop but did not get any positive results. I also tried removing the sync_sensor_index prop, again with no results.
Here are my logs:

nvidia@nvidia:~$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:792 No cameras available
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
Redistribute latency...
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
Got EOS from element "pipeline0".
Execution ended after 0:00:00.008319228
Setting pipeline to NULL ...
Freeing pipeline ...
nvidia@nvidia:~$ sudo su
[sudo] password for nvidia:
root@nvidia:/home/nvidia# pkill nvargus-daemon
root@nvidia:/home/nvidia# nvargus-daemon 2>&1 | tee argus-output.log
=== NVIDIA Libargus Camera Service (0.99.33)===
=== Listening for connections...
=== gst-launch-1.0[2108]: Connection established (FFFFACE5B840)
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
NvPclHwGetModuleList: WARNING: Could not map module to ISP config string
NvPclHwGetModuleList: No module data found
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function findDevice(), line 256)
(NvCamV4l2) Error ModuleNotPresent:  (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 60)
(NvOdmDevice) Error ModuleNotPresent:  (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 111)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 0. (error 0xA000E)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 725)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 453)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 347)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function getSource(), line 519)
=== gst-launch-1.0[2108]: CameraProvider initialized (0xffffa89f0210)
=== gst-launch-1.0[2108]: CameraProvider destroyed (0xffffa89f0210)
=== gst-launch-1.0[2108]: Connection closed (FFFFACE5B840)
=== gst-launch-1.0[2108]: Connection cleaned up (FFFFACE5B840)

hello nikita.pichugin,

it is still a sensor initialization failure that prevents enabling the camera stream. please refer to Infinite Timeout Support and test with enableCamInfiniteTimeout=1.
besides, since JetPack 6.1 is now available, is it possible to move to the latest release version for verification?
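in case the way to enable it is unclear, the usual procedure is to stop the service and relaunch the daemon in the foreground with the variable set (service name assumed to be nvargus-daemon, as on stock L4T):

```shell
# Stop the running service, then start the daemon in the foreground with the
# infinite-timeout flag so its logs stay visible on the console.
sudo service nvargus-daemon stop
sudo enableCamInfiniteTimeout=1 nvargus-daemon
```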

Hi,
I tried enableCamInfiniteTimeout=1 and it did not help. Right now my camera is configured for infinite streaming; it does not even have stream_stop and stream_start functions. It just streams video constantly.

I can try to update to JetPack 6.1 but I am not really sure that it will help.

By the way, here are all handlers I have in my driver:

static struct tegracam_ctrl_ops ar0233_ctrl_ops = {
	.numctrls = ARRAY_SIZE(ctrl_cid_list),
	.ctrl_cid_list = ctrl_cid_list,
	.set_gain = ar0233_set_gain,
	.set_exposure = ar0233_set_exposure,
	.set_exposure_short = ar0233_set_exposure,
	.set_group_hold = ar0233_set_group_hold,
};

Hi @nikita.pichugin,

It’s strange that you don’t have nvarguscamerasrc on your Jetson. Normally, when we install JetPack with the SDK Manager, all the packages necessary for using this GStreamer element are included. If your debugging efforts haven’t been successful, you might want to try reflashing by following the guide here: Flashing the board with GUI installer.

Regarding the issues you’re facing, these types of problems are often related to the device tree, particularly tegra-camera-platform. I suggest taking a close look at that section. In my opinion, your file tegra234-p3737-camera-dual-hawk-ar0234-e3653-overlay.dts looks fine, but there may be an issue with the main DTB.

You could also try the non-overlay method by creating a DTB and loading it manually. For this, you can use this guide, paying special attention to the section To add camera modules to a device tree.
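As a quick way to confirm which device tree the board actually booted with, something like the following sketch can help (nvidia,dtsfilename is a property NVIDIA trees usually carry; dtc must be installed on the target):

```shell
# Print the source filename recorded in the booted device tree, if present.
f=/proc/device-tree/nvidia,dtsfilename
[ -f "$f" ] && { tr -d '\0' < "$f"; echo; }

# Decompile the live tree and confirm the camera platform node is present.
dtc -I fs -O dts /proc/device-tree 2>/dev/null | grep 'tegra-camera-platform'
```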

I hope this helps!

Jose Morera
Embedded Software Engineer at RidgeRun
Contact us: support@ridgerun.com
Developers wiki: https://developer.ridgerun.com
Website: www.ridgerun.com

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.