Jetson DSI display burst mode

Does anyone have experience writing a device tree and driver for a DSI display in burst mode?
I have problems setting the pixel clock, which the NVIDIA driver calculates based on the DTB file.
The driver needs a max_panel_freq_khz value for the calculation, but I don't know how to set it via the DTB.

I tried nvidia,dsi-max_panel_freq_khz = <50000>; but this is not transferred to the dsi.c driver.

Kernel log:

[    0.455461] tegradc 15200000.nvdisplay: DSI: max_panel_freq_khzis not set for DSI burst mode.
[    0.455541] tegradc 15200000.nvdisplay: Force clock continuous mode

Thanks for the help.


I am not sure where you got this “nvidia,dsi-max_panel_freq_khz” from.

All compatible items are listed in “kernel-4.4/Documentation/devicetree/bindings/video/nvidia,tegra186-dsi.txt”

It was just a try…
Your NVIDIA driver needs the max_panel_freq_khz value. Could you explain how to set it? I can't find it in the documents.


In fact, I don’t think we have an existing use case that enables DSI burst mode on L4T. I’ll help check.

Maybe other forum users can share their experience at this moment.


I just checked the driver code.

if (!dsi->info.max_panel_freq_khz) {
	dsi->info.max_panel_freq_khz = DEFAULT_MAX_DSI_PHY_CLK_KHZ;
	if (dsi->info.video_burst_mode >
			TEGRA_DSI_VIDEO_NONE_BURST_MODE_WITH_SYNC_END) {
		dev_err(&dsi->dc->ndev->dev, "DSI: max_panel_freq_khz"
				"is not set for DSI burst mode.\n");
		dsi->info.video_burst_mode =
				TEGRA_DSI_VIDEO_NONE_BURST_MODE;
	}
}

Even though you hit the error message, there is no error return, and max_panel_freq_khz is initialized to a default.


Please add the parsing code below to of_dc.c:

diff --git a/drivers/video/tegra/dc/of_dc.c b/drivers/video/tegra/dc/of_dc.c
index e8301e0..2614b07 100644
--- a/drivers/video/tegra/dc/of_dc.c
+++ b/drivers/video/tegra/dc/of_dc.c
@@ -1332,6 +1332,10 @@ static struct device_node *parse_dsi_settings(struct platform_device *ndev,
        if (!of_property_read_u32(np_dsi_panel,
+                       "nvidia,dsi-max-panel-freq-khz", &temp)) {
+               dsi->max_panel_freq_khz = temp;
+       }
+       if (!of_property_read_u32(np_dsi_panel,
                        "nvidia,dsi-pixel-format", &temp)) {
                dsi->pixel_format = (u8)temp;
                if (temp == TEGRA_DSI_PIXEL_FORMAT_16BIT_P)
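With that parsing in place, the property name the driver looks for is nvidia,dsi-max-panel-freq-khz (dashes, not underscores), so a matching entry can go into the panel node of the dts. A minimal sketch — the node name here is illustrative, and your board's actual panel node with its existing properties should be kept as-is:

```
panel-dsi@0 {
	/* existing properties such as nvidia,dsi-pixel-format stay unchanged */
	nvidia,dsi-max-panel-freq-khz = <50000>;	/* value from the original post */
};
```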

OK thanks,
that seems to work.
I still have issues with the initialisation of the display, but that's a different topic.

If you manage to enable a panel with burst mode, please share the dts here for other users to refer to.



How is the burst mode going? Did you enable it?

Still using video mode.
The DSI display works, but burst mode doesn't work yet.


Are you still working on the dsi burst mode?

No, not at the moment.
Still using non-burst mode.