Output data rate = (sensor or deserializer pixel clock in hertz) * (bits per pixel) / (number of CSI lanes)
My serdes_pix_clk_hz setting is “375000000”.
So the output data rate = 375000000 * 10 / 4 = 937.5 Mbps.
The output data rate is less than 1.5 Gbps, so I think deskew calibration is not required.
But the calculated value of CLK_HZ_FOR_DESKEW is 750 MHz, and the output data rate is greater than 750, so according to that calculation I do need deskew calibration. So how do I determine the deskew calibration values?
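For reference, here is a minimal standalone sketch of the arithmetic above. The values (375 MHz pixel clock, 10 bits per pixel, 4 CSI lanes) and the 1.5 Gbps / 750 MHz numbers are the ones quoted in this thread; the snippet only shows the two comparisons side by side and is not the driver code.

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
	/* Values quoted in this thread (assumptions; adjust to your setup). */
	uint64_t serdes_pix_clk_hz = 375000000ULL; /* pixel clock from the device tree */
	uint64_t bits_per_pixel    = 10;           /* e.g. RAW10 */
	uint64_t num_csi_lanes     = 4;

	/* Output data rate = pixel clock * bits per pixel / number of CSI lanes */
	uint64_t data_rate_bps = serdes_pix_clk_hz * bits_per_pixel / num_csi_lanes;

	printf("output data rate = %llu bps (%.1f Mbps)\n",
	       (unsigned long long)data_rate_bps, data_rate_bps / 1e6);

	/* Reading 1: compare the per-lane data rate against 1.5 Gbps. */
	printf("data rate >= 1.5 Gbps ? %s\n",
	       data_rate_bps >= 1500000000ULL ? "yes -> deskew" : "no");

	/* Reading 2: compare the same number against 750 MHz,
	 * the CLK_HZ_FOR_DESKEW value mentioned in this thread. */
	printf("data rate >= 750 MHz ?  %s\n",
	       data_rate_bps >= 750000000ULL ? "yes -> deskew" : "no");

	return 0;
}
```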
According to your description, output data rate = (sensor or deserializer pixel clock in hertz) * (bits per pixel) / (number of CSI lanes), and deskew must satisfy the condition output data rate > 1.5 Gbps. That condition compares rates of the same kind. Why, then, is the CLK_HZ_FOR_DESKEW defined in the code divided by 2?
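To make the question concrete, the check being discussed appears to look roughly like the sketch below. This is a paraphrase based only on what is quoted in this thread (the names CLK_HZ_FOR_DESKEW and pix_clk_hz, and the 750 MHz value), not the exact kernel source.

```c
#include <stdbool.h>

/* Sketch of the check as described in this thread (not the exact kernel source).
 * The 750 MHz value appears to come from dividing 1.5 GHz by 2. */
#define CLK_HZ_FOR_DESKEW (1500000000ULL / 2) /* = 750 MHz */

static bool deskew_needed(unsigned long long pix_clk_hz)
{
	/* Compares the pixel clock in hertz against 750 MHz, rather than a
	 * computed per-lane bit rate against 1.5 Gbps. With
	 * pix_clk_hz = 375000000 this condition is false. */
	return pix_clk_hz >= CLK_HZ_FOR_DESKEW;
}
```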
Judging from the prints, that branch was not entered, so I wonder whether there is a problem with the calculation here. The value of pix_clk_hz here is pix_clk_hz = serdes_pixel_clock = 375000000 Hz, and the judgment should follow output data rate = (sensor or deserializer pixel clock in hertz) * (bits per pixel) / (number of CSI lanes), which gives output data rate = 375000000 * 10 / 4 = 937.5 Mbps. Therefore, the condition pix_clk_hz >= CLK_HZ_FOR_DESKEW should be changed to: pix_clk_hz * (bits per pixel) / (number of CSI lanes) >= 1.5 Gbps.
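If I understand the proposal correctly, the change would look something like the sketch below. Again, this is only an illustration of the proposed condition: bits_per_pixel and num_csi_lanes stand in for however the actual driver obtains those values.

```c
#include <stdbool.h>

/* Sketch of the condition proposed above: trigger deskew when the computed
 * per-lane output data rate reaches 1.5 Gbps, instead of comparing the raw
 * pixel clock against CLK_HZ_FOR_DESKEW. Illustrative only. */
static bool deskew_needed_proposed(unsigned long long pix_clk_hz,
                                   unsigned int bits_per_pixel,
                                   unsigned int num_csi_lanes)
{
	unsigned long long data_rate_bps =
		pix_clk_hz * bits_per_pixel / num_csi_lanes;

	return data_rate_bps >= 1500000000ULL; /* 1.5 Gbps */
}
```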
Yes, I am just trying to determine whether deskew calibration is needed; I cannot get frame data at the moment. Judging from the kernel source code, the condition that triggers deskew is not based on the formula output data rate = (sensor or deserializer pixel clock in hertz) * (bits per pixel) / (number of CSI lanes).