v4l2-ctl captures images, but GStreamer does not

Hi,
I ran into a strange issue: I can capture raw data from the camera with v4l2-ctl as follows:

root@blst-pz-desktop:/home/blst_pz# v4l2-ctl --set-fmt-video=width=4000,height=3000,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --set-ctrl=sensor_mode=0 --stream-to=camera.raw -d /dev/video0
<
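
For completeness, the formats and controls the driver exposes over V4L2 can be listed as shown below; a quick sketch, assuming the sensor sits on /dev/video0:

# List the pixel formats and frame sizes the sensor driver exposes
v4l2-ctl -d /dev/video0 --list-formats-ext
# List the available controls (sensor_mode, bypass_mode, gain, exposure, ...)
v4l2-ctl -d /dev/video0 --list-ctrls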

However, when I try to capture from the camera through GStreamer, it fails as follows:

root@blst-pz-desktop:/home/blst_pz# gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)4000,height=(int)3000, framerate=25/1, pixelformat=NV12' ! nvvidconv flip-method=0 ! pngenc ! multifilesink location=%05u.png -e
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. gstnvarguscamerasrc.cpp, execute:780 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.005476002
Setting pipeline to NULL ...
Freeing pipeline ...

The same happens for both still and video capture; the video pipeline gives the following error:

root@blst-pz-desktop:/home/blst_pz# gst-launch-1.0 -v nvarguscamerasrc sensor-mode=1 sensor-id=0  ! nvvidconv ! 'video/x-raw(memory:NVMM), width=4000, height=3000, format=(string)NV12, framerate=25/1' ! nvv4l2h265enc ! h265parse ! qtmux ! filesink location=test_h265.mp4 -e
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. gstnvarguscamerasrc.cpp, execute:780 No cameras available
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12
/GstPipeline:pipeline0/nvv4l2h265enc:nvv4l2h265enc0.GstPad:src: caps = video/x-h265, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)NULL, width=(int)4000, height=(int)3000, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt2020, chroma-site=(string)mpeg2
NvMMLiteOpen : Block : BlockType = 8 
/GstPipeline:pipeline0/GstH265Parse:h265parse0.GstPad:sink: caps = video/x-h265, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)NULL, width=(int)4000, height=(int)3000, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, interlace-mode=(string)progressive, colorimetry=(string)bt2020, chroma-site=(string)mpeg2
Redistribute latency...
===== NvVideo: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8 
/GstPipeline:pipeline0/nvv4l2h265enc:nvv4l2h265enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple, streamheader=(buffer)< 000002446d6f6f760000006c6d76686400000000e36b657fe36b657f00000708000000000001000001000000000000000000000000010000000000000000000000000000000100000000000000000000000000004000000000000000000000000000000000000000000000000000000000000002000001937472616b0000005c746b686400000007e36b657fe36b657f000000010000000000000000000000000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000400000000000000000000000000000f26d646961000000206d64686400000000e36b657fe36b657f000000000000000055c400000000002168646c7200000000000000000000000000000000000000000000000000000000a96d696e660000002168646c720000000064686c72616c6973000000000000000000000000000000002464696e660000001c6472656600000000000000010000000c616c6973000000010000005c7374626c000000107374736400000000000000000000001073747473000000000000000000000010737473630000000000000000000000147374737a000000000000000000000000000000107374636f00000000000000000000003d75647461000000356d657461000000000000002168646c72000000006d686c726d6469720000000000000000000000000000000008696c73740000003d75647461000000356d657461000000000000002168646c72000000006d686c726d6469720000000000000000000000000000000008696c7374 >
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple, streamheader=(buffer)< 000002446d6f6f760000006c6d76686400000000e36b657fe36b657f00000708000000000001000001000000000000000000000000010000000000000000000000000000000100000000000000000000000000004000000000000000000000000000000000000000000000000000000000000002000001937472616b0000005c746b686400000007e36b657fe36b657f000000010000000000000000000000000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000400000000000000000000000000000f26d646961000000206d64686400000000e36b657fe36b657f000000000000000055c400000000002168646c7200000000000000000000000000000000000000000000000000000000a96d696e660000002168646c720000000064686c72616c6973000000000000000000000000000000002464696e660000001c6472656600000000000000010000000c616c6973000000010000005c7374626c000000107374736400000000000000000000001073747473000000000000000000000010737473630000000000000000000000147374737a000000000000000000000000000000107374636f00000000000000000000003d75647461000000356d657461000000000000002168646c72000000006d686c726d6469720000000000000000000000000000000008696c73740000003d75647461000000356d657461000000000000002168646c72000000006d686c726d6469720000000000000000000000000000000008696c7374 >
Got EOS from element "pipeline0".
Execution ended after 0:00:00.091428503
Setting pipeline to NULL ...
Freeing pipeline ...

I was using JetPack 4.6.1 and everything worked fine for a long time. I then upgraded to JetPack 5.1.4 and ported the camera driver, but now it no longer works.
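
For reference, the kernel log can confirm whether the ported driver probed and registered its video node; a minimal sketch, assuming the driver logs under the name imx586:

# Check that the ported sensor driver probed without errors
dmesg | grep -i imx586
# Confirm which /dev/video* nodes were registered and by which driver
v4l2-ctl --list-devices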

I hope I can get your help. How can I solve this problem?

Hello @darius-yuan

Let’s try a simpler pipeline, such as:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)4000,height=(int)3000, framerate=25/1, pixelformat=NV12' ! nvvidconv flip-method=0 ! fakesink silent=false -v
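
It can also help to watch the Argus daemon output while the pipeline runs; a sketch, assuming the stock nvargus-daemon service on JetPack 5.x:

# Stop the background service and run the daemon in the foreground
sudo systemctl stop nvargus-daemon
sudo nvargus-daemon
# Then launch the pipeline again from a second terminal and check the daemon output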

Could you share the sensor driver device tree, especially the tegra-camera-platform node?
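
If it is easier, you can extract the device tree exactly as the running kernel sees it; a sketch, assuming dtc is installed on the board and the node keeps its default name:

# Dump the live device tree from /proc into a source file
dtc -I fs -O dts -o extracted.dts /proc/device-tree
# Or inspect the camera platform node directly
ls /proc/device-tree/tegra-camera-platform/modules/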

Regards!
Eduardo Salazar
Embedded SW Engineer at RidgeRun

Contact us: support@ridgerun.com
Developers wiki: https://developer.ridgerun.com/
Website: www.ridgerun.com

Thank you for your reply.

I tried it, but it still fails with an error.

root@blst-pz-desktop:/home/blst_pz# v4l2-ctl --set-fmt-video=width=4000,height=3000,pixelformat=RG10 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --set-ctrl=sensor_mode=0 --stream-to=camera.raw -d /dev/video1
<
root@blst-pz-desktop:/home/blst_pz# gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM), width=(int)4000,height=(int)3000, framerate=25/1, pixelformat=NV12' ! nvvidconv flip-method=0 ! fakesink silent=false -v
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = event   ******* (fakesink0:sink) E (type: stream-start (10254), GstEventStreamStart, stream-id=(string)ca84d63d2906058d36ae674f946a09ea, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)1;) 0xaaaaee3aa4f0
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. gstnvarguscamerasrc.cpp, execute:780 No cameras available
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, format=(string)NV12, framerate=(fraction)25/1, pixelformat=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, format=(string)NV12, framerate=(fraction)25/1, pixelformat=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, format=(string)NV12, framerate=(fraction)25/1, pixelformat=(string)NV12
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = event   ******* (fakesink0:sink) E (type: caps (12814), GstEventCaps, caps=(GstCaps)"video/x-raw\(memory:NVMM\)\,\ width\=\(int\)4000\,\ height\=\(int\)3000\,\ format\=\(string\)NV12\,\ framerate\=\(fraction\)25/1\,\ pixelformat\=\(string\)NV12";) 0xaaaaee3aa640
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, format=(string)NV12, framerate=(fraction)25/1, pixelformat=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, format=(string)NV12, framerate=(fraction)25/1, pixelformat=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, format=(string)NV12, framerate=(fraction)25/1, pixelformat=(string)NV12
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = event   ******* (fakesink0:sink) E (type: eos (28174), ) 0xaaaaee3aa6b0
Got EOS from element "pipeline0".
Execution ended after 0:00:00.006505195
Setting pipeline to NULL ...
Freeing pipeline ...

Here are the tegra-camera-platform node and mode1:

/{	
	tcp: tegra-camera-platform {
		compatible = "nvidia, tegra-camera-platform";
		
		num_csi_lanes = <8>;
		max_lane_speed = <2144607>;
		min_bits_per_pixel = <10>;
		vi_peak_byte_per_pixel = <5>;
		vi_bw_margin_pct = <2>;
		
		max_pixel_rate = <1715686>;
		isp_peak_byte_per_pixel = <5>;
		isp_bw_margin_pct = <2>;
		

		modules {
			//imx586s
			cam_module0: module0 {
				badge = "imx586_rear_BS";
				position = "rear";
				orientation = "1";
				cam_module0_drivernode0: drivernode0 {
					pcl_id = "v4l2_sensor";
					devname = "imx586 2-001a";
					//proc-device-tree = "/proc/device-tree/cam_i2cmux/i2c@0/imx586_a@34";
					proc-device-tree = "/proc/device-tree/cam_i2cmux/i2c@3180000/imx586_a@34";
				};
				cam_module0_drivernode1: drivernode1 {
					pcl_id = "v4l2_lens";
					proc-device-tree = "/proc/device-tree/lens_imx586@RBPCV2/";
				};
			};
			//imx586_8
			cam_module1: module1 {
				badge = "imx586_front_BS";
				position = "front";
				orientation = "1";
				cam_module1_drivernode0: drivernode0 {
					/* Declare PCL support driver (classically known as guid)  */
					pcl_id = "v4l2_sensor";
					/* Driver v4l2 device name */
					devname = "imx586_8 8-001a";
					/* Declare the device-tree hierarchy to driver instance */
					//proc-device-tree = "/proc/device-tree/cam_i2cmux/i2c@1/imx586_c@34";
					proc-device-tree = "/proc/device-tree/cam_i2cmux/i2c@31e0000/imx586_c@34";
				};
				cam_module1_drivernode1: drivernode1 {
					pcl_id = "v4l2_lens";
					proc-device-tree = "/proc/device-tree/lens_imx586@RBPCV2/";
				};
			};
		};
	};
};
mode1 {/*mode IMX586_MODE_4000X3000_NOR_35FPS*/
					mclk_khz = "24000";
					num_lanes = "4";
					tegra_sinterface = "serial_a";
					phy_mode = "DPHY";
					discontinuous_clk = "no";
					dpcm_enable = "false";
					cil_settletime = "0";
					dynamic_pixel_bit_depth = "10";
					csi_pixel_bit_depth = "10";
					mode_type = "bayer";
					pixel_phase = "rggb";
					
					active_w = "4000";
					active_h = "3000";
					readout_orientation = "0";
					line_length = "7872";
					inherent_gain = "1";
					mclk_multiplier = "625";
					pix_clk_hz = "864000000";
					
					//gain
					gain_factor = "10";
					min_gain_val = "10"; 
					max_gain_val = "360";
					step_gain_val = "1";
					default_gain = "10";

					min_hdr_ratio = "1";
					max_hdr_ratio = "1";

                    //framerate
					framerate_factor = "1000000";
					min_framerate = "1000000"; 
					max_framerate = "35868005";	
					step_framerate = "1";
					default_framerate = "35868005";
					
					//exposure	
					exposure_factor = "1000000";
					min_exp_time = "56"; 
					max_exp_time = "28571";	
					step_exp_time = "2";
					default_exp_time = "28571";

					embedded_metadata_height = "0";
				};

Does JetPack 5.1.4 require any device tree settings to be updated?

All of this works on JetPack 4.6.1; I don't understand why it fails now.

Hello @darius-yuan

Looking at the definitions you shared, I see some inconsistencies.

You are using the I2C camera mux, which enables I2C buses 9 and 10 on the Xavier NX, yet your devname entries reference buses 2 and 8 (you can verify the probed buses on the target as sketched after the reference device tree below). I suggest checking the imx477 device tree in the Xavier NX sources at hardware/nvidia/platform/t19x/jakku/kernel-dts/common/tegra194-camera-rbpcv3-imx477.dtsi. The tegra-camera-platform node for dual-camera support is defined as:

/ {
	tegra-camera-platform {
		compatible = "nvidia, tegra-camera-platform";

		/**
		* Physical settings to calculate max ISO BW
		*
		* num_csi_lanes = <>;
		* Total number of CSI lanes when all cameras are active
		*
		* max_lane_speed = <>;
		* Max lane speed in Kbit/s
		*
		* min_bits_per_pixel = <>;
		* Min bits per pixel
		*
		* vi_peak_byte_per_pixel = <>;
		* Max byte per pixel for the VI ISO case
		*
		* vi_bw_margin_pct = <>;
		* Vi bandwidth margin in percentage
		*
		* max_pixel_rate = <>;
		* Max pixel rate in Kpixel/s for the ISP ISO case
		*
		* isp_peak_byte_per_pixel = <>;
		* Max byte per pixel for the ISP ISO case
		*
		* isp_bw_margin_pct = <>;
		* Isp bandwidth margin in percentage
		*/
		num_csi_lanes = <4>;
		max_lane_speed = <1500000>;
		min_bits_per_pixel = <10>;
		vi_peak_byte_per_pixel = <2>;
		vi_bw_margin_pct = <25>;
		max_pixel_rate = <7500000>;
		isp_peak_byte_per_pixel = <5>;
		isp_bw_margin_pct = <25>;

		/**
		 * The general guideline for naming badge_info contains 3 parts, and is as follows,
		 * The first part is the camera_board_id for the module; if the module is in a FFD
		 * platform, then use the platform name for this part.
		 * The second part contains the position of the module, ex. "rear" or "front".
		 * The third part contains the last 6 characters of a part number which is found
		 * in the module's specsheet from the vendor.
		 */
		modules {
			module0 {
				badge = "jakku_front_IMX477";
				position = "front";
				orientation = "1";
				drivernode0 {
					pcl_id = "v4l2_sensor";
					devname = "imx477 9-001a";
					proc-device-tree = "/proc/device-tree/cam_i2cmux/i2c@0/rbpcv3_imx477_a@1a";
				};
			};
			module1 {
				badge = "jakku_rear_IMX477";
				position = "rear";
				orientation = "1";
				drivernode0 {
					pcl_id = "v4l2_sensor";
					devname = "imx477 10-001a";
					proc-device-tree = "/proc/device-tree/cam_i2cmux/i2c@1/rbpcv3_imx477_c@1a";
				};
			};
		};
	};
};
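
To confirm the mismatch on the target, you can compare the bus-address pairs the kernel actually probed against the devname strings advertised by the platform node; a sketch, assuming the default node paths:

# I2C devices that actually probed (bus-address pairs such as 9-001a)
ls /sys/bus/i2c/devices/
# devname strings the camera platform node advertises to Argus
cat /proc/device-tree/tegra-camera-platform/modules/module0/drivernode0/devname
cat /proc/device-tree/tegra-camera-platform/modules/module1/drivernode0/devname
# The two must match in the form "<driver> <bus>-<address>", e.g. "imx586 9-001a" if the sensor probes on bus 9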

Please use that imx477 device tree as a reference and compare it against yours.
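
Once the devname and proc-device-tree entries match the probed buses and the updated DTB is installed on the board, the minimal pipeline from above should negotiate again; for example:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)4000, height=(int)3000, framerate=(fraction)25/1, format=(string)NV12' ! nvvidconv ! fakesink silent=false -v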

Hope this helps!
Regards!

Thank you for your reply.

That’s the problem.

Now the camera can stream normally.
