How to fix a camera image that is captured upside down and with a purple tint?

Hello, my platform is a TX1 running R28.2 and the camera is an IMX214. I am currently running the 10_camera_recording sample and can capture images, but the captured image is upside down, so I need to flip it, and the image has a purple color cast. Can you give me some suggestions? Thank you!

Hi,
You can use the NvBufferTransform_Flip options with NvBufferTransform() to rotate or flip the image:

/**
 * Defines video flip methods.
 */
typedef enum
{
  /** Video flip none. */
  NvBufferTransform_None,
  /** Video flip rotate 90 degree clockwise. */
  NvBufferTransform_Rotate90,
  /** Video flip rotate 180 degree clockwise. */
  NvBufferTransform_Rotate180,
  /** Video flip rotate 270 degree clockwise. */
  NvBufferTransform_Rotate270,
  /** Video flip with respect to X-axis. */
  NvBufferTransform_FlipX,
  /** Video flip with respect to Y-axis. */
  NvBufferTransform_FlipY,
  /** Video flip transpose. */
  NvBufferTransform_Transpose,
  /** Video flip inverse transpose. */
  NvBufferTransform_InvTranspose,
} NvBufferTransform_Flip;

For the purple image, you should check whether the device tree is set correctly. Perhaps the pixel format is not correct?

Hi,
Thank you very much for your quick answer. I solved the image inversion problem by setting an IMX214 register, but the purple tint has not been solved. The relevant parts of my device tree are as follows:

i2c@546c0000 {
			avdd_dsi_csi-supply = <&max77620_ldo0>;

			status = "okay";
			#address-cells = <1>;
			#size-cells = <0>;
			imx214@10 {
				compatible = "nvidia,imx214";
				/* I2C device address */
				reg = <0x10>;

				/* V4L2 device node location */
				devnode = "video1";

				clocks = <&tegra_car TEGRA210_CLK_CLK_OUT_3>;
				clock-names = "clk_out_3";
				clock-frequency = <24000000>;
				mclk = "clk_out_3";

				reset-gpios = <&gpio TEGRA_GPIO(S, 5) GPIO_ACTIVE_HIGH>;
				pwdn-gpios = <&gpio TEGRA_GPIO(T, 0) GPIO_ACTIVE_HIGH>;

				/* Physical dimensions of sensor */
				physical_w = "3.674";
				physical_h = "2.738";

				/* Sensor output flip settings */
				vertical-flip = "true";

				/* Define any required hw resources needed by driver */
				/* ie. clocks, io pins, power sources */
				avdd-reg = "vana";
				iovdd-reg = "vif";

				/**
				* A modeX node is required to support v4l2 driver
				* implementation with NVIDIA camera software stack
				*
				* mclk_khz = "";
				* Standard MIPI driving clock, typically 24MHz
				*
				* num_lanes = "";
				* Number of lane channels sensor is programmed to output
				*
				* tegra_sinterface = "";
				* The base tegra serial interface lanes are connected to
				*
				* discontinuous_clk = "";
				* The sensor is programmed to use a discontinuous clock on MIPI lanes
				*
				* dpcm_enable = "true";
				* The sensor is programmed to use a DPCM modes
				*
				* cil_settletime = "";
				* MIPI lane settle time value.
				* A "0" value attempts to autocalibrate based on mclk_multiplier
				*
				*
				*
				*
				* active_w = "";
				* Pixel active region width
				*
				* active_h = "";
				* Pixel active region height
				*
				* pixel_t = "";
				* The sensor readout pixel pattern
				*
				* readout_orientation = "0";
				* Based on camera module orientation.
				* Only change readout_orientation if you specifically
				* Program a different readout order for this mode
				*
				* line_length = "";
				* Pixel line length (width) for sensor mode.
				* This is used to calibrate features in our camera stack.
				*
				* mclk_multiplier = "";
				* Multiplier to MCLK to help time hardware capture sequence
				* TODO: Assign to PLL_Multiplier as well until fixed in core
				*
				* pix_clk_hz = "";
				* Sensor pixel clock used for calculations like exposure and framerate
				*
				*
				*
				*
				* inherent_gain = "";
				* Gain obtained inherently from mode (ie. pixel binning)
				*
				* min_gain_val = ""; (floor to 6 decimal places)
				* max_gain_val = ""; (floor to 6 decimal places)
				* Gain limits for mode
				*
				* min_exp_time = ""; (ceil to integer)
				* max_exp_time = ""; (ceil to integer)
				* Exposure Time limits for mode (us)
				*
				*
				* min_hdr_ratio = "";
				* max_hdr_ratio = "";
				* HDR Ratio limits for mode
				*
				* min_framerate = "";
				* max_framerate = "";
				* Framerate limits for mode (fps)
				*/
				mode0 { // IMX214_MODE_3840X2160
					mclk_khz = "24000";
					num_lanes = "4";
					tegra_sinterface = "serial_c";
					discontinuous_clk = "no";
					dpcm_enable = "false";
					cil_settletime = "0";

					active_w = "3840";
					active_h = "2160";
					dynamic_pixel_bit_depth = "10";
					csi_pixel_bit_depth = "10";
					pixel_t = "bayer_bggr";
					readout_orientation = "90";
					line_length = "4200";
					inherent_gain = "1";
					mclk_multiplier = "16";
					pix_clk_hz = "400000000";

					min_gain_val = "1.0";
					max_gain_val = "22.2";
					min_hdr_ratio = "1";
					max_hdr_ratio = "64";
					min_framerate = "1.462526";
					max_framerate = "30";
					min_exp_time = "16.165";
					max_exp_time = "165770";
				};
									
				ports {
					#address-cells = <1>;
					#size-cells = <0>;

					port@0 {
						reg = <0>;
						e3326_ov5693_out0: endpoint {
							csi-port = <2>;
							bus-width = <4>;
							remote-endpoint = <&e3326_csi_in0>;
						};
					};
				};
			};
		};
	};

	tegra-camera-platform {
		compatible = "nvidia, tegra-camera-platform";
		/**
		* Physical settings to calculate max ISO BW
		*
		* num_csi_lanes = <>;
		* Total number of CSI lanes when all cameras are active
		*
		* max_lane_speed = <>;
		* Max lane speed in Kbit/s
		*
		* min_bits_per_pixel = <>;
		* Min bits per pixel
		*
		* vi_peak_byte_per_pixel = <>;
		* Max byte per pixel for the VI ISO case
		*
		* vi_bw_margin_pct = <>;
		* Vi bandwidth margin in percentage
		*
		* isp_peak_byte_per_pixel = <>;
		* Max byte per pixel for the ISP ISO case
		*
		* isp_bw_margin_pct = <>;
		* Isp bandwidth margin in percentage
		*/
		num_csi_lanes = <8>;
		max_lane_speed = <15000000>;
		min_bits_per_pixel = <10>;
		vi_peak_byte_per_pixel = <2>;
		vi_bw_margin_pct = <25>;
		max_pixel_rate = <750000>;
		isp_peak_byte_per_pixel = <5>;
		isp_bw_margin_pct = <25>;

		/**
		* The general guideline for naming badge_info contains 3 parts, and is as follows,
		* The first part is the camera_board_id for the module; if the module is in a FFD
		* platform, then use the platform name for this part.
		* The second part contains the position of the module, ex. “rear” or “front”.
		* The third part contains the last 6 characters of a part number which is found
		* in the module's spec sheet from the vendor.
		*/
		modules {
			module0 {
				badge = "e3326_front_P5V27C";
				position = "rear";
				orientation = "0";
				drivernode0 {
					/* Declare PCL support driver (classically known as guid)  */
					pcl_id = "v4l2_sensor";
					/* Driver's v4l2 device name */
					devname = "imx214 6-0010";
					/* Declare the device-tree hierarchy to driver instance */
					proc-device-tree = "/proc/device-tree/host1x/i2c@546c0000/imx214@10";
				};
			};
		};
	};
};

I observed that individual colors (such as red) are rendered correctly, but the entire image has an overall purple cast. Thank you!!

hello shuuchao,

  1. could you please share your captured image?
  2. also, please double-check your sensor specification to confirm that the Bayer ordering matches your property setting,
    for example,
pixel_t = "bayer_bggr";

@shuuchao
Also change the string below; do not reuse the reference badge info for a customized sensor.

e3326_front_P5V27C

Thank you very much for your reply:

  1. I confirmed the image format captured by the camera, which is bayer_bggr. I am very sorry that my image upload always fails.
  2. If I leave the badge out or write my own, I get errors at runtime. I don’t know what format my badge should have, or where I should put it. Can you give me a sample, or point me to documentation? Thank you.

hello shuuchao,

the badge property is a unique name that identifies the module; please refer to the Sensor Driver Programming Guide.
you should modify the property following the naming rules, or the ISP will be confused into loading e3326’s configuration files.

badge = "e3326_front_P5V27C";

Hello,

  1. I changed the dts to badge = "imx214_rear_liimx214" and the image is still purple.
  2. I changed the dts to badge = "imx214_front_liimx214", and the image is no longer purple, but it is washed-out white.
    I don’t know how "badge" works. At present my dts has badge = "imx214_front_liimx214". How can I solve the whiteness problem? Thank you!
    Image download link:
    https://pan.baidu.com/s/1lGZ1NFKk4OEcL0DOIoa74Q,
    Extraction code: gm9x.

@shuuchao
The badge name determines which ISP configuration is loaded.
You may need a scaling partner’s help to go through the tuning process and generate the ISP configuration for your module.

Hello, is the problem just that my badge is not set correctly, or do I also need to add a specific configuration file?

hello shuuchao,

the badge property indicates which tuning configuration file needs to be loaded.
so, have you already finished the tuning process? you should generate ISP override files that include all of the tuning parameter settings.
thanks

Hello, there is no imx214_front_liimx214.isp on my TX1, and there are no other *.isp files. I don’t know how to write this file. Can you give an example? How can I get this file? Thank you very much!
Here is the log at startup:

OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
LoadOverridesFile: looking for override file [/Calib/camera_override.isp] 1/16
LoadOverridesFile: looking for override file [/data/nvcam/settings/camera_overrides.isp] 2/16
LoadOverridesFile: looking for override file [/opt/nvidia/nvcam/settings/camera_overrides.isp] 3/16
LoadOverridesFile: looking for override file [/var/nvidia/nvcam/settings/camera_overrides.isp] 4/16
LoadOverridesFile: looking for override file [/data/nvcam/camera_overrides.isp] 5/16
LoadOverridesFile: looking for override file [/data/nvcam/settings/imx214_front_liimx214.isp] 6/16
LoadOverridesFile: looking for override file [/opt/nvidia/nvcam/settings/imx214_front_liimx214.isp] 7/16
LoadOverridesFile: looking for override file [/var/nvidia/nvcam/settings/imx214_front_liimx214.isp] 8/16
---- imager: No override file found. ----

hello shuuchao,

as we mentioned in comments #7 and #9,
please contact a Jetson Preferred Partner if you need tuning support.
thanks

Hello, I asked the manufacturer that provided the camera; they don’t provide an ISP configuration file or any other help. Do I have other ways to solve the color problem? Thank you!

hello shuuchao,

sorry, the only way is to contact a Jetson Preferred Partner if you need camera tuning support.
thanks

Hello, my camera's purple color problem is still not solved. I would like to know the following:

  1. Do you know what the badge information of the IMX214 sensor might be? For example, the badge information of the IMX219 sensor is A815P2.
  2. I want to create the ISP configuration file for the IMX214. Can you give me more detailed creation steps?
  3. I want to know how the badge is used in the software. I have not found any relevant code in the kernel.
    Thank you very much!

The same question! How does libargus use the badge information?

hi shuuchao,

these are all related to the camera tuning process.
we don’t support camera tuning in the discussion thread, hence we suggest you contact a Jetson Preferred Partner for camera-tuning-related solutions.
please contact the sales team if you have further requests,
thanks

Hello, regarding the purple image color: I referred to the userAutoWhiteBalance sample and got some relief. The new problem is that the video edges are dark, the middle is brighter, and the middle has a slight green tint.

  1. I read some information related to LSC (Lens Shading Correction). How can I solve this problem?
  2. I saw the examples in argus/samples, like bayerAverageMap, histogram, openglBox, etc. I don’t fully understand their effect on the video. Are these examples helpful for solving the above problems?
    Thank you!

Hi shuuchao,
We strongly suggest you buy camera boards from our partners. They have many boards on the market, and you can pick one that suits your use case.