HDMI2CSI - Interlaced video for Jetson TX2

Hi All,

We have integrated an HDMI2CSI board https://blog.zhaw.ch/high-performance/category/hdmi2csi/ with the Jetson TX2. So far we have had success capturing progressive video in various digital video timings, such as 1920x1080p30/50/60, 1240x1028p60/75, etc. This has extended to modifying the EDID block and the Toshiba TC358840 driver to support additional resolutions.

The issue we are facing concerns capturing interlaced video. The following timing was added: 720x576i50. It is successfully detected by the Toshiba TC358840 driver, as seen in the kernel logs, but attempts to display the video through a GStreamer pipeline result in a correctly sized window showing a completely green screen. The following pipelines were attempted (first = green screen, second = failure):

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=720, height=576, framerate=50/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), width=720, height=576, framerate=50/1, format=I420' ! nvoverlaysink sync=false
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=720, height=576, framerate=50/1, format=UYVY, interlace-mode=interleaved' ! queue ! deinterlace ! nvvidconv ! 'video/x-raw(memory:NVMM), width=720, height=576, framerate=50/1, format=I420, interlace-mode=progressive' ! nvoverlaysink sync=false -e -vvv

It’s also worth noting that with the first GStreamer pipeline, the following error message was seen in the kernel logs:

tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11

I believe this is because the GStreamer pipeline expects a full progressive frame but only receives half of the data from the interlaced source (correct me if I’m wrong).
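One data point supporting the "buffer never filled" theory: in limited-range BT.601, a YCbCr triple of all zeros decodes to pure green, which is exactly what a sink shows when it renders a buffer that capture never wrote into. A minimal self-check (standard BT.601 conversion math, nothing Jetson-specific):

```python
def yuv_to_rgb(y, cb, cr):
    """Limited-range BT.601 YCbCr -> RGB, clamped to 0..255."""
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.392 * (cb - 128) - 0.813 * (cr - 128)
    b = 1.164 * (y - 16) + 2.017 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# An all-zero (never written) buffer decodes to green:
print(yuv_to_rgb(0, 0, 0))        # (0, 136, 0)
# Proper video black is Y=16, Cb=Cr=128:
print(yuv_to_rgb(16, 128, 128))   # (0, 0, 0)
```

So a green window would be consistent with the VI never completing the frame (matching the PXL_SOF syncpt timeout above), rather than valid field data merely being misinterpreted.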

I have read through a few posts stating that interlaced video can be captured but that display isn’t supported in L4T 28.1. However, one post stated that interlaced display was supported in an earlier L4T release, 24.2.1. Here are the posts I’m referring to:


My key questions are:

  1. Is it still the case that displaying interlaced video is NOT supported in L4T 28.1?
  2. If displaying interlaced video was supported in L4T 24.2.1, then why was it removed?
  3. What drivers require modification in order to support interlaced video in L4T 28.1?

Any help would be appreciated!


We do not support interlaced capture.

And we are really thankful to the user who shared his solution for r24.2.1.

Hi DaneLLL.
We have the same trouble as the topic author, and we have analyzed the changes the ZHAW team made to enable interlaced capture in L4T 24.2.1 (Jetson TX1). Among their changes, they set bit 12 of the register VI_CSI_0_CSI_IMAGE_DT_0. From the TRM (TX1), chapter 31.7.7, the description of this bit is: “INTERLACED_VIDEO: VI channel interlaced video format enable”.
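For clarity, that part of the change amounts to a read-modify-write that sets the bit. A trivial sketch of the arithmetic (the symbolic name is our own shorthand, not an official define; in the actual driver this goes through the kernel's VI register accessors, not Python):

```python
# Bit 12 of VI_CSI_0_CSI_IMAGE_DT_0 per the TX1 TRM, chapter 31.7.7.
# "INTERLACED_VIDEO" is our shorthand for that bit, not a kernel macro.
INTERLACED_VIDEO = 1 << 12

def enable_interlaced(reg_value):
    """Return the register value with the INTERLACED_VIDEO bit set."""
    return reg_value | INTERLACED_VIDEO

print(hex(enable_interlaced(0x002A)))  # 0x102a
```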

As for the Jetson TX2 and the description of the VI subsystem in the TRM (TX2): chapter 27.3, “VI4 Features”, declares that VI4 supports interlaced capture, but no other information regarding interlacing appears in that chapter…

Could you help us find where such information is in the TRM (TX2)? Or is this simply a mistake in the TRM?

hello ilukaniuk,

we do not support interlaced video-in via MIPI. We suggest you launch capture and encode with the pipeline below.
gst-launch (v4l2src) -> YUV 422 -> convert to YUV 420 -> nvv4l2encoder -> (file dump)
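On the “convert to YUV 420” step of that pipeline: UYVY (4:2:2) stores one chroma pair per two horizontal pixels, while I420 (4:2:0) keeps one per 2x2 block, so the conversion halves vertical chroma resolution. A rough pure-Python sketch of that subsampling, only to illustrate what nvvidconv does on hardware (far too slow for real frames):

```python
def uyvy_to_i420(data, width, height):
    """Convert packed UYVY (4:2:2) to planar I420 (4:2:0).

    Vertical chroma resolution is halved by averaging the chroma of each
    even line with the line below it. width and height must be even, and
    len(data) must be width * height * 2.
    """
    y = bytearray(width * height)
    u = bytearray(width * height // 4)
    v = bytearray(width * height // 4)
    stride = width * 2                      # bytes per UYVY line
    for row in range(height):
        for col in range(0, width, 2):      # one U0 Y0 V0 Y1 macropixel
            base = row * stride + col * 2
            y[row * width + col] = data[base + 1]
            y[row * width + col + 1] = data[base + 3]
            if row % 2 == 0:                # average with the next line
                nbase = base + stride
                ci = (row // 2) * (width // 2) + col // 2
                u[ci] = (data[base] + data[nbase]) // 2
                v[ci] = (data[base + 2] + data[nbase + 2]) // 2
    return bytes(y), bytes(u), bytes(v)
```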

you may also check Topic 1056596; there’s a memory leak when decoding an interlaced H264 stream through nvv4l2decoder.

DaneLLL, do we understand correctly that we will not receive any data from NVCSI in the case of interlaced video-in? Or do you mean that NVCSI has no de-interlacing, and we will receive the odd/even fields of the frame with the mentioned pipeline?

By the way, we are using L4T 28.2.1, and the plugin ‘nvv4l2encoder’ is not present there. Will that pipeline work if we update to a newer L4T?

We do not support interlaced capture in the VI driver. The hardware has the capability, but there is no software implementation in the VI driver. Please run your source in progressive mode and refer to the sensor driver programming guide for device tree programming. You may also consider using camera modules from our partners.

The encoder does not support interlaced frames. The pseudo pipeline is for a V4L2 source in progressive YUV422 format.

Thanks for the response.
Maybe we can implement it ourselves? We have analyzed the TRM (section 27) but could not find anything regarding interlacing. Could you point us to the right place in the documentation to start from?

In our case we can’t run our source in progressive mode, unfortunately. It would be enough for us to receive the even/odd fields of the frame from NVCSI/VI4; in that case we would be able to assemble the full frame within our application.
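If the driver could be convinced to deliver the raw fields, weaving them back into a full frame in the application is straightforward. A sketch assuming top-field-first order and equally sized fields (for 720x576i50 UYVY each field would be 720 x 288 x 2 bytes):

```python
def weave_fields(top, bottom, width, bytes_per_pixel=2):
    """Interleave two half-height fields into one full frame.

    top supplies output lines 0, 2, 4, ...; bottom supplies 1, 3, 5, ...
    Assumes top-field-first; swap the arguments for bottom-field-first.
    """
    stride = width * bytes_per_pixel
    assert len(top) == len(bottom) and len(top) % stride == 0
    frame = bytearray(2 * len(top))
    for i in range(0, len(top), stride):
        frame[2 * i : 2 * i + stride] = top[i : i + stride]
        frame[2 * i + stride : 2 * i + 2 * stride] = bottom[i : i + stride]
    return bytes(frame)

# Tiny demo: 2-pixel-wide, 1-byte-per-pixel fields of two lines each.
print(weave_fields(b"AABB", b"aabb", width=2, bytes_per_pixel=1))  # b'AAaaBBbb'
```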