HDMI FPGA to "No cameras available" problem in ARGUS

Hello,

I want to display an HDMI-to-MIPI converted signal on the screen through Argus.
I'm using the Jetson NX customer kit with JetPack 4.6.
The HDMI input is 3840x2160 at 30 fps.
The test samples are from the Jetson Linux API Reference: Sample Applications.

I confirmed that the image is displayed through the 12_camera_v4l2_cuda sample, which uses V4L2.

~/jetson_multimedia_api/samples/12_camera_v4l2_cuda$ ./camera_v4l2_cuda -d /dev/video0 -s 3840x2160 -f UYVY -n 300 -c
[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 3840 height 2160
WARN: request_camera_buff(): (line:380) Camera v4l2 buf length is not expected
WARN: request_camera_buff(): (line:380) Camera v4l2 buf length is not expected
WARN: request_camera_buff(): (line:380) Camera v4l2 buf length is not expected
WARN: request_camera_buff(): (line:380) Camera v4l2 buf length is not expected
Quit due to exit command from user!
----------- Element = renderer0 -----------
Total Profiling time = 316.297
Average FPS = 29.975
Total units processed = 9482
Num. of late units = 9184
-------------------------------------
App run was successful

It is also displayed normally when using the v4l2src plugin with GStreamer:

gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw, format=(string)UYVY, width=(int)3840, height=(int)2160" ! nvvidconv ! "video/x-raw(memory:NVMM), format=(string)I420, width=(int)1920,height=(int)1080" ! nvoverlaysink overlay-w=1920 overlay-h=1080 sync=false

However, with Argus the camera is not recognized.

nvidia@nvidia-desktop:~/jetson_multimedia_api/samples/09_camera_jpeg_capture$ ./camera_jpeg_capture --img-res 3840x2160
[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 640 height 480
Error generated. main.cpp, execute:329 No cameras available
nvidia@nvidia-desktop:~$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! "video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, framerate=30/1" ! nvvidconv flip-method=0 ! 'video/x-raw, format=(string)I420' ! nvoverlaysink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:725 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.007371243
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
nvidia@nvidia-desktop:~$

To give a picture of the current problem (see the Camera Architecture Stack diagram): the red path A does not recognize the camera, while the green path B works normally. That is how I understand it.


Are there cases where the camera is not recognized because of the device tree in path C?
Here is the DTS code (by default, I2C is disabled):

/*
 * Copyright (c) 2018-2020, NVIDIA CORPORATION.  All rights reserved.
 *
 * This program is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful, but WITHOUT
 * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
 * FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for
 * more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program.  If not, see <http://www.gnu.org/licenses/>.
 */
#include <dt-bindings/media/camera.h>

#define LT6911_EN "okay"

/ {
	i2c@3180000 {
		clock-frequency = <100000>;

		lt6911uxc_a@2b {
			compatible = "lontium,lt6911uxc";
			reg = <0x2b>;
			status = "okay";
			devnode = "video0";
			/* Reset */
			reset-gpio = <&tegra_main_gpio TEGRA194_MAIN_GPIO(P, 4) GPIO_ACTIVE_LOW>;

			port@0 {
				reg = <0>;
				status = "okay";
				hdmi2csi_lt6911_out0: endpoint {
					status = "okay";
					port-index = <0>;
					bus-width = <4>;
					remote-endpoint = <&hdmi2csi_csi_in0>;
				};
			};
		};
	};

	host1x {
		/* Delete existing VI node to avoid conflicts */
		/delete-node/ vi;
		vi@15c10000 {
			num-channels = <1>;
			status = LT6911_EN;
			ports {
				#address-cells = <1>;
				#size-cells = <0>;
				port@0 {
					status = LT6911_EN;
					reg = <0>;
					hdmi2csi_vi_in0: endpoint {
						status = LT6911_EN;
						port-index = <0>;
						bus-width = <4>;
						remote-endpoint = <&hdmi2csi_csi_out0>;
					};
				};
			};
		};

		nvcsi@15a00000 {
			num-channels = <1>;
			#address-cells = <1>;
			#size-cells = <0>;
			status = LT6911_EN;
			channel@0 {
				status = LT6911_EN;
				reg = <0>;
				discontinuous_clk = "no";
				ports {
					#address-cells = <1>;
					#size-cells = <0>;
					port@0 {
						status = LT6911_EN;
						reg = <0>;
						hdmi2csi_csi_in0: endpoint@0 {
							status = LT6911_EN;
							port-index = <0>;
							bus-width = <4>;
							remote-endpoint = <&hdmi2csi_lt6911_out0>;
						};
					};
					port@1 {
						status = LT6911_EN;
						reg = <1>;
						hdmi2csi_csi_out0: endpoint@1 {
							status = LT6911_EN;
							remote-endpoint = <&hdmi2csi_vi_in0>;
						};
					};
				};
			};
		};
	};

	tegra-camera-platform {
		compatible = "nvidia, tegra-camera-platform";
		/**
		* Physical settings to calculate max ISO BW
		*
		* num_csi_lanes = <>;
		* Total number of CSI lanes when all cameras are active
		*
		* max_lane_speed = <>;
		* Max lane speed in Kbit/s
		*
		* min_bits_per_pixel = <>;
		* Min bits per pixel
		*
		* vi_peak_byte_per_pixel = <>;
		* Max byte per pixel for the VI ISO case
		*
		* vi_bw_margin_pct = <>;
		* Vi bandwidth margin in percentage
		*
		* max_pixel_rate = <>;
		* Max pixel rate in Kpixel/s for the ISP ISO case
		*
		* isp_peak_byte_per_pixel = <>;
		* Max byte per pixel for the ISP ISO case
		*
		* isp_bw_margin_pct = <>;
		* Isp bandwidth margin in percentage
		*/
		num_csi_lanes = <8>;
		max_lane_speed = <2500000>;
		min_bits_per_pixel = <16>;
		vi_peak_byte_per_pixel = <3>;
		vi_bw_margin_pct = <25>;
		isp_peak_byte_per_pixel = <3>;
		isp_bw_margin_pct = <25>;
		status = LT6911_EN;

		/**
		* The general guideline for naming badge_info contains 3 parts, as follows.
		* The first part is the camera_board_id for the module; if the module is in a FFD
		* platform, then use the platform name for this part.
		* The second part contains the position of the module, e.g. "rear" or "front".
		* The third part contains the last 6 characters of the part number found
		* in the module's spec sheet from the vendor.
		*/
		modules {
			module0 {
				status = LT6911_EN;
				badge = "hdmi2csi_left_6911";
				position = "left";
				orientation = "1";
				drivernode0 {
					/* Declare PCL support driver (classically known as guid)  */
					pcl_id = "v4l2_sensor";
					/* Driver v4l2 device name */
					devname = "lt6911uxc 2-002b"; 
					/* Declare the device-tree hierarchy to driver instance */
					proc-device-tree = "/proc/device-tree/i2c@3180000/lt6911uxc_a@2b";
				};
			};
		};
	};
};

Looking at the source where the camera is not recognized (jetson_multimedia_api/samples/09_camera_jpeg_capture/main.cpp):

    /* Create the CameraProvider object and get the core interface */
    UniqueObj<CameraProvider> cameraProvider = UniqueObj<CameraProvider>(CameraProvider::create());
    ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
    if (!iCameraProvider)
        ORIGINATE_ERROR("Failed to create CameraProvider");

    /* Get the camera devices */
    std::vector<CameraDevice*> cameraDevices;
    iCameraProvider->getCameraDevices(&cameraDevices);
    if (cameraDevices.size() == 0)
        ORIGINATE_ERROR("No cameras available");

The camera is not recognized by ICameraProvider. Is source-level debugging possible for this part?
Are there any other references?

Thanks in advance.

Hi,
You are not able to use Argus in this use case. Argus is for MIPI capture of Bayer data, using the hardware ISP engine to do debayering. Your use case is MIPI capture of YUV422 frame data, so it is correct to capture through the V4L2 interface. It is expected that it works when running 12_camera_v4l2_cuda and when using the GStreamer v4l2src plugin.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.