We have an LVDS video output interface whose pixel clock frequency is 10 MHz. Due to a constraint, our CSI-2 IP does not allow the byte clock frequency to go below 20 MHz, so we use a FIFO in the FPGA.
When we get VSYNC, we listen for HSYNC, and once we have HSYNC we store the pixel data in the FIFO. Once all the pixel data of one line (640 pixels) has been acquired, we raise a flag indicating that a line is ready.
On the transmitter side, after reset or boot-up, we send SoF and then wait for the line-acquired flag to go high (we cannot forward the incoming LVDS data directly because the transmitter clock is 20 MHz, so we acquire one full line first and then send all of it; during this wait the link is in the LP-11 state). Once the line-acquired flag is set, the transmitter logic in the FPGA sends one line of data (640 pixels) and then waits again for the next line. When all 480 lines have been sent, we send EoF and the cycle repeats.
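The per-frame sequencing described above can be sketched in Python. This is only a behavioral model of the handshake, not the actual RTL; `acquire_line` and `send_packet` are hypothetical callbacks standing in for the FIFO line-acquired flag and the CSI-2 transmit path:

```python
from collections import deque

LINE_PIXELS = 640
FRAME_LINES = 480

def transmit_frame(acquire_line, send_packet):
    """Sequence one frame: SoF, then each buffered line, then EoF.

    acquire_line(): blocks until one full line (640 pixels) is buffered
                    in the FIFO and returns it (models the line-acquired
                    flag; the link idles in LP-11 during this wait).
    send_packet(p): transmits a packet on the CSI-2 link.
    """
    send_packet("SoF")
    for _ in range(FRAME_LINES):
        line = acquire_line()
        assert len(line) == LINE_PIXELS  # one complete line per transfer
        send_packet(line)
    send_packet("EoF")

# Self-test with a fake line source: 480 pre-filled lines
sent = []
lines = deque([0] * LINE_PIXELS for _ in range(FRAME_LINES))
transmit_frame(lines.popleft, sent.append)
```

Running the self-test, `sent` starts with `"SoF"`, ends with `"EoF"`, and carries 480 line packets in between.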
So coming to your question: yes. We send SoF, wait for one line to be acquired in the FIFO (approximately 640 pixels × 1/10 MHz = 64 microseconds), send one line, wait ~64 microseconds, send the second line, wait ~64 microseconds, ..., send the 480th line, then EoF.
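Writing out that arithmetic (this assumes exactly 640 active pixels per line at the 10 MHz pixel clock; horizontal blanking would lengthen the wait somewhat):

```python
PIXEL_CLK_HZ = 10_000_000   # LVDS pixel clock
LINE_PIXELS = 640
FRAME_LINES = 480

# Time to buffer one line into the FIFO (the LP-11 wait per line)
line_acquire_us = LINE_PIXELS / PIXEL_CLK_HZ * 1e6
print(f"time to buffer one line: {line_acquire_us:.0f} us")   # ~64 us

# Lower bound on frame time: 480 line-acquisition waits back to back
frame_ms = FRAME_LINES * line_acquire_us / 1000
print(f"minimum frame time: {frame_ms:.2f} ms")               # ~30.72 ms
```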
While the transmitter waits those ~64 microseconds, the link sits in the LP-11 state, hence my question: will the Jetson receiver time out if the link stays in LP-11 for ~64 microseconds between lines?