Turn off HDMI programmatically

WayneWWW,

the dmesg is shared, per your request.

dmesg.txt (68.5 KB)

igal.kroyter,

There is no error from the kernel side. Please also share the Xorg.0.log.

If both sides show no error, I would suggest checking the timing of your panel.

Also, we use pll_d_out0 as the parent clock for the DSI panel on TX2. Not sure if this affects your result. You could compare your dts with “hardware/nvidia/platform/tegra/common/kernel-dts/panels/panel-s-wuxga-8-0.dtsi”.
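For comparison, the parent clock is selected in the panel’s disp-default-out node. A minimal sketch (the property names follow the dts posted later in this thread; the pll_d_out0 value is taken from the verified wuxga panel, so treat it as an assumption for an FPGA “panel”):

```dts
disp-default-out {
	nvidia,out-type       = <TEGRA_DC_OUT_DSI>;
	/* parent clock used by the verified panel-s-wuxga-8-0 on TX2;
	   confirm this matches what your own tree selects */
	nvidia,out-parent-clk = "pll_d_out0";
};
```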

WayneWWW,

The log is attached (cat /var/log/Xorg.0.log > Xorg.0.log)
I am not sure how exactly the panel affects the NVIDIA data lanes, though.

Regards.
Xorg.0.log (19.3 KB)

Sorry, I just checked the dmesg again. There is no pclk calculation result for your DSI panel.

Could you enable some debug logging in your panel driver? At least then we can know whether the power sequence has been called or not.

WayneWWW,

if you mean a driver like kernel/display/drivers/video/tegra/dc/panel/panel-p-wuxga-10-1.c, then I do not have one. I looked into the functions, which only require activation of the panel (a sample driver: panel-null-dsi.c) and support the following functions, which my “panel” (the FPGA) does not require:

  • dsi_null_panel_enable
  • dsi_null_panel_disable
  • dsi_null_panel_postsuspend
  • dsi_null_panel_hotpluginit

I assumed that filling in the information in the device tree is enough, as I’ve provided the signal properties there. I think I need to verify that the device tree is actually being parsed and configures the correct HW registers. If this is the correct approach, could you provide a pointer to where I should look?
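One quick way to verify that the panel node survived dtb compilation and was seen by the kernel is to inspect the live device tree on the target. A sketch, assuming the /host1x/dsi path used by the dts in this thread (check your own node names):

```shell
# Inspect the live device tree on the target board.
# The node path is an assumption based on the dts in this thread.
DT=/proc/device-tree/host1x/dsi
if [ -d "$DT" ]; then
	ls "$DT"                 # should list the panel sub-node(s)
	cat "$DT/status"; echo   # expect "okay"
else
	echo "no /host1x/dsi node in the live device tree"
fi
```

If the node or a property is missing here, the kernel never saw it, so there is no point looking at the driver yet.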

Mo

igal.kroyter,

Sorry, I have no idea how to make it work. We have no experience enabling an FPGA through the DSI interface.

Have you already modified your panel-null-dsi.c? I would suggest you trace tegradc and find out why “tegra_dc_program_mode” is not called.
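One way to check this without adding printk calls is the kernel function tracer. A sketch, assuming the kernel was built with CONFIG_FUNCTION_TRACER and debugfs is mounted (run as root on the target):

```shell
# Trace whether tegra_dc_program_mode is ever invoked.
T=/sys/kernel/debug/tracing
if [ -w "$T/set_ftrace_filter" ]; then
	echo tegra_dc_program_mode > "$T/set_ftrace_filter"
	echo function > "$T/current_tracer"
	echo 1 > "$T/tracing_on"
	sleep 5                    # trigger a modeset meanwhile
	echo 0 > "$T/tracing_on"
	grep -c tegra_dc_program_mode "$T/trace" || echo "not called"
else
	echo "ftrace not available (mount debugfs / run as root)"
fi
```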

WayneWWW, thanks for the pointer.

I shall follow the function’s trace and try to figure out whether the tegra_dc_program_mode function is invoked or not.

I was trying to follow the configuration steps for “dsi,1080p”, which actually does not have any driver attached (per board-panel.c:available_internal_panel_select), and I tried to add “dsi,720p”. Is this “panel” something NVIDIA supports?

Regards.

There is a lot of background story behind each DSI panel dts. For example, some of them may come from the old TK1 board.

The only one verified as working on TX2 is “panel-s-wuxga-8-0”.

WayneWWW, hi,

I have added a dummy driver per your advice, and now DSI channel A is outputting a DSI signal and clock.

Could you advise how to activate a second DSI channel (I guess C)? I need the channel to be independent of the first one, i.e. have a different resolution.

I have gone through the forum and I am confused, as some statements say that only a gang arrangement can work, while others state that a non-gang arrangement can work.

Could you please advise how I should configure the device tree? Should I add two panels under /host1x/dsi?

Sorry that this is out of the support scope so I cannot share much.

This user seems to have enabled it successfully. You could take it as a reference.
https://devtalk.nvidia.com/default/topic/1037632/jetson-tx2/two-dsi-panels-displays-on-tx2-with-jetpack-3-2-1-/

WayneWWW,

This is one of the links that I have read, but I could not get much out of it.

Though I am a little confused: the TX2 supports two DSI output channels, but NVIDIA won’t share how to activate the second one? This is why we decided to go with the TX2 to begin with.

Regards.

igal.kroyter,

TX2 indeed has the hardware capability to support two DSI outputs, just as it supports many other display features for HDMI/DP. However, we don’t have every one of them enabled in sw.

Especially since you are using DSI panels, which have many kinds of configurations and differ between vendors. Unlike HDMI/DP, there is no standard way to support such a use case; they are all handled case by case.

WayneWWW, hi,

Thanks for the reply.
Regarding the activation of the DSI: I just need to know how to configure the device tree to have both DSI channels active, with whatever panel there is. I think that once that works I’ll be able to replace the panel driver with my own.
So how do you think I should approach this investigation?

You could just add one more DSI in the dts and see if it works as a first step, if you want to enable two DSIs on two nvdisplay heads.

Please specify a different controller for each panel in the dt:

nvidia,dsi-instance
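A minimal sketch of what that looks like in the dts (the panel node names here are placeholders; the DSI_INSTANCE_* macros come from dt-bindings/display/tegra-panel.h):

```dts
dsi {
	status = "okay";
	/* first panel on the first DSI controller */
	panel-a {
		nvidia,dsi-instance = <DSI_INSTANCE_0>;
		/* ... rest of the panel properties ... */
	};
	/* second panel on the other controller */
	panel-b {
		nvidia,dsi-instance = <DSI_INSTANCE_1>;
		/* ... */
	};
};
```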

WayneWWW, hi,

do you know what might be the reason that CONFIG_TEGRA_NVDISPLAY is set while compiling the kernel, but not while compiling the device tree (make dtbs)?
This definition sets DSI_INSTANCE_1 to 2 (when it is defined, in the kernel) and to 1 (when it is not, in the device tree), so the two values contradict each other during boot.

BTW, this is the way I have configured the dt:

/ {
	host1x {
		sor {
			status = "disabled";
			dp-display {
				status = "disabled";
			};
			hdmi-display {
				status = "disabled";
			};

			panel-s-edp-uhdtv-15-6 {
				smartdimmer {
					status = "disabled";
				};
			};
		};

		dpaux@155c0000 {
			status = "disabled";
		};

		sor1 {
			status = "okay";
			hdmi-display {
				status = "okay";
			};
			dp-display {
				status = "disabled";
			};
		};

		nvdisplay@15200000 {
			status = "okay";
			nvidia,dc-or-node = "/host1x/dsi";
		};		

		nvdisplay@15220000 {
			status = "okay";
			nvidia,dc-or-node = "/host1x/dsi";
		};

		dsi {
			status = "okay";
			compatible = "nvidia,tegra186-dsi";
			nvidia,dsi-controller-vs = <DSI_VS_1>;
			nvidia,enable-hs-clk-in-lp-mode;

			panel-r-fpga-8-0_ch1 {
				status							= "okay";
				compatible						= "r,fpga-8-0";
				nvidia,dsi-instance				= <DSI_INSTANCE_0>;
				nvidia,dsi-n-data-lanes			= <2>;
				nvidia,dsi-pixel-format			= <TEGRA_DSI_PIXEL_FORMAT_24BIT_P>;
				nvidia,dsi-refresh-rate			= <60>;
				nvidia,dsi-video-data-type		= <TEGRA_DSI_VIDEO_TYPE_VIDEO_MODE>;
				nvidia,dsi-video-clock-mode 	= <TEGRA_DSI_VIDEO_CLOCK_CONTINUOUS>;
				nvidia,dsi-video-burst-mode 	= <TEGRA_DSI_VIDEO_NONE_BURST_MODE_WITH_SYNC_END>;
				nvidia,dsi-virtual-channel		= <TEGRA_DSI_VIRTUAL_CHANNEL_0>;
				nvidia,dsi-panel-reset			= <TEGRA_DSI_DISABLE>;
				nvidia,dsi-power-saving-suspend = <TEGRA_DSI_DISABLE>;
				nvidia,dsi-ulpm-not-support		= <TEGRA_DSI_ENABLE>;
				/*nvidia,dsi-init-cmd				=	<TEGRA_DSI_DELAY_MS 160>,
													<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_EXIT_SLEEP_MODE 0x0 0x0>,
													<TEGRA_DSI_SEND_FRAME 5>,
													<TEGRA_DSI_DELAY_MS 20>,
													<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_SET_DISPLAY_ON 0x0 0x0>,
													<TEGRA_DSI_DELAY_MS 20>;
				nvidia,dsi-n-init-cmd			= <6>;
				nvidia,dsi-suspend-cmd			=	<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_SET_DISPLAY_OFF 0x0 0x0>,
													<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_ENTER_SLEEP_MODE 0x0 0x0>,
													<TEGRA_DSI_DELAY_MS 60>;
				nvidia,dsi-n-suspend-cmd		= <3>;*/
				disp-default-out {
					nvidia,out-type				= <TEGRA_DC_OUT_DSI>;
					nvidia,out-width			= <130>;
					nvidia,out-height			= <74>;
					nvidia,out-flags			= <TEGRA_DC_OUT_CONTINUOUS_MODE TEGRA_DC_OUT_INITIALIZED_MODE>;
					nvidia,out-parent-clk		= "plld3";
					nvidia,out-xres				= <1280>;
					nvidia,out-yres				= <720>;
				};
				display-timings {
					1280x720-32 {
						...
					};
				};
				smartdimmer {
					status = "disabled";
					...
				};
				nvdisp-cmu {
					...
				};
			};
			
			panel-r-fpga-8-0_ch2 {
				status							= "okay";
				compatible						= "r,fpga-8-0";
				nvidia,dsi-instance				= <2>;//<DSI_INSTANCE_1>;
				nvidia,dsi-n-data-lanes			= <2>;
				nvidia,dsi-pixel-format			= <TEGRA_DSI_PIXEL_FORMAT_24BIT_P>;
				nvidia,dsi-refresh-rate			= <60>;
				nvidia,dsi-video-data-type		= <TEGRA_DSI_VIDEO_TYPE_VIDEO_MODE>;
				nvidia,dsi-video-clock-mode 	= <TEGRA_DSI_VIDEO_CLOCK_CONTINUOUS>;
				nvidia,dsi-video-burst-mode 	= <TEGRA_DSI_VIDEO_NONE_BURST_MODE_WITH_SYNC_END>;
				nvidia,dsi-virtual-channel		= <TEGRA_DSI_VIRTUAL_CHANNEL_1>;
				nvidia,dsi-panel-reset			= <TEGRA_DSI_DISABLE>;
				nvidia,dsi-power-saving-suspend = <TEGRA_DSI_DISABLE>;
				nvidia,dsi-ulpm-not-support		= <TEGRA_DSI_ENABLE>;
				/*nvidia,dsi-init-cmd				=	<TEGRA_DSI_DELAY_MS 160>,
													<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_EXIT_SLEEP_MODE 0x0 0x0>,
													<TEGRA_DSI_SEND_FRAME 5>,
													<TEGRA_DSI_DELAY_MS 20>,
													<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_SET_DISPLAY_ON 0x0 0x0>,
													<TEGRA_DSI_DELAY_MS 20>;
				nvidia,dsi-n-init-cmd			= <6>;
				nvidia,dsi-suspend-cmd			=	<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_SET_DISPLAY_OFF 0x0 0x0>,
													<0x0 DSI_DCS_WRITE_0_PARAM DSI_DCS_ENTER_SLEEP_MODE 0x0 0x0>,
													<TEGRA_DSI_DELAY_MS 60>;
				nvidia,dsi-n-suspend-cmd		= <3>;*/
				disp-default-out {
					nvidia,out-type				= <TEGRA_DC_OUT_DSI>;
					nvidia,out-width			= <130>;
					nvidia,out-height			= <74>;
					nvidia,out-flags			= <TEGRA_DC_OUT_CONTINUOUS_MODE TEGRA_DC_OUT_INITIALIZED_MODE>;
					nvidia,out-parent-clk		= "plld3";
					nvidia,out-xres				= <1280>;
					nvidia,out-yres				= <720>;
				};
				display-timings {
					1280x720-32 {
						clock-frequency			= <74250000>;//<69946560>;
						hactive					= <1280>;
						vactive					= <720>;
						hfront-porch			= <150>;//<112>;
						hback-porch				= <150>;//<30>;
						hsync-len				= <70>;//<32>;
						vfront-porch			= <10>;//<11>;
						vback-porch				= <10>;//<11>;
						vsync-len				= <10>;//<2>;
						nvidia,h-ref-to-sync	= <1>;
						nvidia,v-ref-to-sync	= <1>;
					};
				};
				smartdimmer {
					status = "disabled";
					...
				};
				nvdisp-cmu {
					...
				};
			};
		};
	};
};

I have set the clock of both DSI channels to plld3, though I’m sure that the second DSI on nvdisplay@15220000 does not have this clock (I’m not sure which clock to select).

The failure that I witnessed on the second DSI was:

[    4.296984] Failed to get pll_d_out1 clk
[    4.301965] tegradc 15220000.nvdisplay: Failed to get DC Parent clock
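One way to see which display PLLs the running kernel actually exposes, before picking a parent clock, is the common-clock debugfs view. A sketch (needs root and a mounted debugfs; the grep pattern is only a guess at the relevant clock names):

```shell
# List display-related clocks known to the running kernel.
CLK=/sys/kernel/debug/clk/clk_summary
if [ -r "$CLK" ]; then
	grep -E "pll_?d" "$CLK"   # pll_d, pll_d2, pll_d_out*, plld3, ...
else
	echo "clk_summary not readable (mount debugfs / run as root)"
fi
```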

Could you assist on this issue?

Regards.

CONFIG_TEGRA_NVDISPLAY is defined when you are using TX2. If you were using TX1, this macro would not be set.

Please check “soc/tegra/kernel-include/dt-bindings/display/tegra-panel.h” for the DSI instance.
You could just set it to the value you want if there is a compilation problem.

Sorry, as I said in a previous comment, I don’t know which clock you should use, either.

WayneWWW,

I am using a TX2i. When compiling the kernel, CONFIG_TEGRA_NVDISPLAY is set, while when compiling the device tree it is not. For now I followed your advice and set it manually to <2>.

Could you please point me to a specialist at NVIDIA who could support me on this issue? Our program is really running behind schedule.

Regards.

If your case is urgent, please contact NVIDIA sales for more attention.

Otherwise, please describe the current status of your panel bring-up.

Hi, WayneWWW.

I have already contacted the local NVIDIA sales and am waiting for their response.

I hope you can assist until they respond.

Attached are the dmesg log, the Xorg.0.log and some xrandr output.

Regards.
Xorg.0.log (20.9 KB)
xrander.log (881 Bytes)
dmesg.txt (74.6 KB)

Could you share your DT?

If you only enable one DSI at a time, do they work separately?