V4L2 raw capture works but Argus source fails

Hi,

We are working with an OV5647 sensor on a Jetson Nano.
Saving a raw file via V4L2 works:

v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=BG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --stream-to=test.raw

V4L2 stream is fine.

h@h-desktop:/var/log$ v4l2-ctl -d /dev/video0 --set-ctrl bypass_mode=0 --stream-mmap
<<<<<<<<<<<<<<<<< 15.38 fps
<<<<<<<<<<<<<<< 15.34 fps
<<<<<<<<<<<<<<< 15.33 fps
<<<<<<<<<<<<<<<< 15.34 fps

But Argus streaming fails:

h@h-desktop:/lib/modules$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw,width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e
Setting pipeline to PAUSED …

Using winsys: x11
Pipeline is live and does not need PREROLL …
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Setting pipeline to PLAYING …
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:532 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.313184895
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Below is the nvargus-daemon log:

=== NVIDIA Libargus Camera Service (0.97.3)===
Listening for connections…
=== gst-launch-1.0[8595]: Connection established (7F9A6FF1D0)
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
(NvCamV4l2) Error ModuleNotPresent: V4L2Device not available (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function findDevice(), line 256)
(NvCamV4l2) Error ModuleNotPresent: (propagating from /dvs/git/dirty/git-master_linux/camera/utils/nvcamv4l2/v4l2_device.cpp, function initialize(), line 60)
(NvOdmDevice) Error ModuleNotPresent: (propagating from dvs/git/dirty/git-master_linux/camera-partner/imager/src/devices/V4L2SensorViCsi.cpp, function initialize(), line 106)
NvPclDriverInitializeData: Unable to initialize driver v4l2_sensor
NvPclInitializeDrivers: error: Failed to init camera sub module v4l2_sensor
NvPclStartPlatformDrivers: Failed to start module drivers
NvPclStateControllerOpen: Failed ImagerGUID 1. (error 0xA000E)
NvPclOpen: PCL Open Failed. Error: 0xf
SCF: Error BadParameter: Sensor could not be opened. (in src/services/capture/CaptureServiceDeviceSensor.cpp, function getSourceFromGuid(), line 582)
SCF: Error BadParameter: (propagating from src/services/capture/CaptureService.cpp, function addSourceByGuid(), line 437)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function addSourceByIndex(), line 295)
SCF: Error BadParameter: (propagating from src/api/CameraDriver.cpp, function getSource(), line 458)
Acquiring SCF Camera device source via index 0 has failed.
=== gst-launch-1.0[8595]: CameraProvider initialized (0x7f94835610)
=== gst-launch-1.0[8595]: CameraProvider destroyed (0x7f94835610)
=== gst-launch-1.0[8595]: Connection closed (7F9A6FF1D0)
=== gst-launch-1.0[8595]: Connection cleaned up (7F9A6FF1D0)

Below is the dtsi file:

#include <dt-bindings/media/camera.h>
#include <dt-bindings/platform/t210/t210.h>

/ {
host1x {
vi_base: vi {
num-channels = <1>;
ports {
#address-cells = <1>;
#size-cells = <0>;
vi_port0: port@0 {
reg = <0>;
rbpcv2_ov5647_vi_in0: endpoint {
port-index = <0>;
bus-width = <2>;
remote-endpoint = <&rbpcv2_ov5647_csi_out0>;
};
};
};
};

  csi_base: nvcsi {
  	num-channels = <1>;
  	#address-cells = <1>;
  	#size-cells = <0>;
  	csi_chan0: channel@0 {
  		reg = <0>;
  		ports {
  			#address-cells = <1>;
  			#size-cells = <0>;
  			csi_chan0_port0: port@0 {
  				reg = <0>;
  				rbpcv2_ov5647_csi_in0: endpoint@0 {
  					port-index = <0>;
  					bus-width = <2>;
  					remote-endpoint = <&rbpcv2_ov5647_out0>;
  				};
  			};
  			csi_chan0_port1: port@1 {
  				reg = <1>;
  				rbpcv2_ov5647_csi_out0: endpoint@1 {
  					remote-endpoint = <&rbpcv2_ov5647_vi_in0>;
  				};
  			};
  		};
  	};
  };

  i2c@546c0000 {
  	ov5647_single_cam0: rbpcv2_ov5647_a@10 {
  		compatible = "nvidia,ov5647";
  		/* I2C device address */
  		reg = <0x36>;

  		/* V4L2 device node location */
  		devnode = "video0";

  		/* Physical dimensions of sensor */
  		physical_w = "3.674";
  		physical_h = "2.738";

  		sensor_model = "ov5647";

  		use_sensor_mode_id = "true";
  		mode0 {
  			mclk_khz = "25000";
  			num_lanes = "2";
  			tegra_sinterface = "serial_a";
  			phy_mode = "DPHY";
  			discontinuous_clk = "yes";
  			dpcm_enable = "false";
  			cil_settletime = "0";

  			active_w = "1920";
  			active_h = "1080";
  			pixel_t = "bayer_bggr";
  			readout_orientation = "90";
  			line_length = "2688";
  			inherent_gain = "1";
  			pix_clk_hz = "68000000";

  			gain_factor = "10";
  			min_gain_val = "10";/* 1DB*/
  			max_gain_val = "160";/* 16DB*/
  			step_gain_val = "1";
  			default_gain = "80";
  			min_hdr_ratio = "1";
  			max_hdr_ratio = "1";
  			framerate_factor = "1000000";
  			min_framerate = "1816577";/*1.816577 */
  			max_framerate = "30000000";/* 30 */
  			step_framerate = "1";
  			default_framerate = "30000000";
  			exposure_factor = "1000000";
  			min_exp_time = "34";/* us */
  			max_exp_time = "550385";/* us */
  			step_exp_time = "1";
  			default_exp_time = "33334";/* us */
  			embedded_metadata_height = "0";
  		};				
  		ports {
  			#address-cells = <1>;
  			#size-cells = <0>;

  			port@0 {
  				reg = <0>;
  				rbpcv2_ov5647_out0: endpoint {
  					port-index = <0>;
  					bus-width = <2>;
  					remote-endpoint = <&rbpcv2_ov5647_csi_in0>;
  				};
  			};
  		};
  	};
  };
};

lens_ov5647@RBPCV2 {
min_focus_distance = "0.0";
hyper_focal = "0.0";
focal_length = "2.67";
f_number = "2.0";
aperture = "2.0";
};
};

/ {
tcp: tegra-camera-platform {
compatible = "nvidia, tegra-camera-platform";
modules {
cam_module0: module0 {
badge = "porg_front_RBPCV2";
position = "front";
orientation = "1";
cam_module0_drivernode0: drivernode0 {
pcl_id = "v4l2_sensor";
devname = "ov5647 6-0010";
proc-device-tree = "/proc/device-tree/host1x/i2c@546c0000/rbpcv2_ov5647_a@10";
};
cam_module0_drivernode1: drivernode1 {
pcl_id = "v4l2_lens";
proc-device-tree = "/proc/device-tree/lens_ov5647@RBPCV2/";
};
};
};
};
};

Thanks a lot for your help!

What are the bus number and slave address? Modify the devname below to match your bus number and slave address; it shouldn't be 6-0010.

Thanks for reply!

Did you mean the I2C address? It's 0x36.
The IMX219 devname is "imx219 6-0010". Is the correct devname "ov5647 6-0036"?

Yes, and the bus number must be correct too.
Confirm it with v4l2-ctl --list-devices

Now we can connect to the daemon, but there is no image in the graphics window.

h@h-desktop:~$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw,width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e
Setting pipeline to PAUSED …
Using winsys: x11
Pipeline is live and does not need PREROLL …
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Setting pipeline to PLAYING …
New clock: GstSystemClock
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected…
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 16.000000; Exposure Range min 34000, max 550385000;
GST_ARGUS: Running with following settings:
Camera index = 0
Camera mode = 0
Output Stream W = 1920 H = 1080
seconds to Run = 0
Frame Rate = 29.999999
GST_ARGUS: PowerService: requested_clock_Hz=13608000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.

These are the logs:

Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: === NVIDIA Libargus Camera Service (0.97.3)=== Listening for connections…
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: === gst-launch-1.0[7244]: Connection established (7F912691D0)
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: ---- imager: No override file found. ----
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: LSC: LSC surface is not based on full res!
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: === gst-launch-1.0[7244]: CameraProvider initialized (0x7f8c825e20)
Mar 16 05:22:12 h-desktop nvargus-daemon[5046]: LSC: LSC surface is not based on full res!
Mar 16 05:22:21 h-desktop nvargus-daemon[5046]: SCF: Error Timeout: (propagating from src/services/capture/CaptureServiceEvent.cpp, function wait(), line 59)
Mar 16 05:22:21 h-desktop nvargus-daemon[5046]: Error: Camera HwEvents wait, this may indicate a hardware timeout occured,abort current/incoming cc

It looks like the timeout is caused by failing to capture frames from the sensor. Try flipping discontinuous_clk to the opposite value to check. If it still fails, modify the sensor driver so the gain/exposure/frame-rate control functions are dummy functions, to make sure the settings are the same as with v4l2-ctl.

Making the dummy functions return zero directly gave us the video stream via the Argus source.
Setting discontinuous_clk to YES or NO does not matter.
Thanks!

One last question on this topic:
How should the exposure/gain be set in the driver?
Or should it be set by the application via the Argus library?

Check the programming guide; you will need to consult the sensor vendor on how to implement it.

https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide%2Fcamera_sensor_prog.html%23wwpID0E0MN0HA