Video from adv7282

I am working on a custom board using the adv7282-m chip to capture video on an AGX Orin.
I had an issue with video alignment: the resolution is PAL 720x576 and I set the preferred stride to 64, so the actual frame size is now 736x576, with each line padded to the 64-byte alignment.
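For clarity, here is the arithmetic behind those numbers, assuming UYVY at 2 bytes per pixel and that the preferred stride is a byte alignment (both are my working assumptions):

```shell
# Round the 720-pixel PAL line up to the 64-byte stride alignment,
# assuming UYVY at 2 bytes per pixel.
WIDTH=720; HEIGHT=576; BPP=2; ALIGN=64
LINE=$((WIDTH * BPP))                              # 1440 bytes of active data
STRIDE=$(( (LINE + ALIGN - 1) / ALIGN * ALIGN ))   # padded up to 1472 bytes
echo "stride=${STRIDE} width_px=$((STRIDE / BPP)) frame_bytes=$((STRIDE * HEIGHT))"
```

This gives a 736-pixel line (1472 bytes) and an 847872-byte frame, which matches what v4l2-ctl reports for Bytes per Line and Size Image further down.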
I configured the adv7282-m to send a test pattern.
When I try to capture the video, the first frame comes out as expected, but after that the frames are misaligned and I see a blank line that moves across the frames.
The GStreamer pipeline I run is: gst-launch-1.0 v4l2src device=/dev/video2 ! filesink location=test_pattern.raw
If I try to capture the same with v4l2-ctl, at first it does not work, but after I run the GStreamer pipeline it does, and I get a clean video with the pattern and the stride-alignment padding lined up on the right.
I run: v4l2-ctl --device /dev/video2 --stream-mmap --stream-to=frame.raw --stream-count=100
When I try the same with live video instead of the test pattern, the behaviour is much the same, but I cannot get more than a few frames and no continuous video.

I will attach pictures of the frames in both cases and the MIPI low-level trace.

  1. Why can't I capture with v4l2-ctl on the first try?
  2. Why can't I get the same results with GStreamer as with v4l2-ctl?
  3. Why does live video behave differently from the test pattern?


gstreamer_first_run_test_pattern.txt (54.7 KB)

v4l_initial_run_mipi_trace.txt (146.0 KB)

hello tzvi1,

it looks like the surface configuration is still mismatched with the real output.
How did you configure the alignment? There are CID controls for it, e.g. $ v4l2-ctl -d /dev/video2 --set-ctrl preferred_stride=X

Currently I configured it in vi4_registers.h in the kernel (#define TEGRA_STRIDE_ALIGNMENT 64).
Do I need to configure anything else?
I tried the v4l preferred_stride control as well.

hello tzvi1,

may I also know what the sensor format dump shows, i.e. $ v4l2-ctl -d /dev/video0 --list-formats-ext

Type: Video Capture

    [0]: 'UYVY' (UYVY 4:2:2)

hello tzvi1,

doesn't it also report the available resolutions?
for instance,

$ v4l2-ctl -d /dev/video0 --list-formats-ext
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'BG10'
	Name        : 10-bit Bayer BGBG/GRGR
		Size: Discrete 2592x1944
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 2592x1458
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.008s (120.000 fps)

As far as I understand, this device supports only PAL or NTSC resolutions, and that is hard coded in the driver code.
It can work at 720x576 or 720x480, or, if we don't want to use the deinterlacer, at half the height (i.e. 288 or 240).
For live video it should detect the incoming video standard and configure the output accordingly; since we intend to use only PAL, that is what I configured as the default, and I am testing it with the test pattern.
If you think I should add something in the driver or device tree, please let me know.

This is the output of all v4l params:

v4l2-ctl -d /dev/video3 --all
Driver Info:
    Driver name      : tegra-video
    Card type        : vi-output, adv7180 2-0020
    Bus info         : platform:tegra-capture-vi:0
    Driver version   : 5.10.104
    Capabilities     : 0x84200001
        Video Capture
        Extended Pix Format
        Device Capabilities
    Device Caps      : 0x04200001
        Video Capture
        Extended Pix Format
Media Driver Info:
    Driver name      : tegra-camrtc-ca
    Model            : NVIDIA Tegra Video Input Device
    Serial           :
    Bus info         :
    Media version    : 5.10.104
    Hardware revision: 0x00000003 (3)
    Driver version   : 5.10.104
Interface Info:
    ID               : 0x03000047
    Type             : V4L Video
Entity Info:
    ID               : 0x00000045 (69)
    Name             : vi-output, adv7180 2-0020
    Function         : V4L2 I/O
    Pad 0x01000046   : 0: Sink
        Link 0x0200004b: from remote pad 0x100000c of entity '13e40000.host1x:nvcsi@15a00000-': Data, Enabled
Priority: 2
Video input : 0 (Camera 0: ok)
Format Video Capture:
    Width/Height      : 720/576
    Pixel Format      : 'UYVY' (UYVY 4:2:2)
    Field             : None
    Bytes per Line    : 1472
    Size Image        : 847872
    Colorspace        : sRGB
    Transfer Function : Rec. 709
    YCbCr/HSV Encoding: ITU-R 601
    Quantization      : Limited Range
    Flags             :

User Controls

                 brightness 0x00980900 (int)    : min=-128 max=127 step=1 default=0 value=0 flags=slider
                   contrast 0x00980901 (int)    : min=0 max=255 step=1 default=128 value=128 flags=slider
                 saturation 0x00980902 (int)    : min=0 max=255 step=1 default=128 value=128 flags=slider
                        hue 0x00980903 (int)    : min=-127 max=128 step=1 default=0 value=0 flags=slider
             fast_switching 0x00981970 (bool)   : default=0 value=0

Camera Controls

       sensor_configuration 0x009a2032 (u32)    : min=0 max=4294967295 step=1 default=0 [22] flags=read-only, volatile, has-payload
     sensor_mode_i2c_packet 0x009a2033 (u32)    : min=0 max=4294967295 step=1 default=0 [1026] flags=read-only, volatile, has-payload
  sensor_control_i2c_packet 0x009a2034 (u32)    : min=0 max=4294967295 step=1 default=0 [1026] flags=read-only, volatile, has-payload
                bypass_mode 0x009a2064 (intmenu): min=0 max=1 default=0 value=0
                            0: 0 (0x0)
                            1: 1 (0x1)
            override_enable 0x009a2065 (intmenu): min=0 max=1 default=0 value=0
                            0: 0 (0x0)
                            1: 1 (0x1)
               height_align 0x009a2066 (int)    : min=1 max=16 step=1 default=1 value=1
                 size_align 0x009a2067 (intmenu): min=0 max=2 default=0 value=0
                            0: 1 (0x1)
                            1: 65536 (0x10000)
                            2: 131072 (0x20000)
           write_isp_format 0x009a2068 (int)    : min=1 max=1 step=1 default=1 value=1
   sensor_signal_properties 0x009a2069 (u32)    : min=0 max=4294967295 step=1 default=0 [30][18] flags=read-only, has-payload
    sensor_image_properties 0x009a206a (u32)    : min=0 max=4294967295 step=1 default=0 [30][16] flags=read-only, has-payload
  sensor_control_properties 0x009a206b (u32)    : min=0 max=4294967295 step=1 default=0 [30][36] flags=read-only, has-payload
          sensor_dv_timings 0x009a206c (u32)    : min=0 max=4294967295 step=1 default=0 [30][16] flags=read-only, has-payload
           low_latency_mode 0x009a206d (bool)   : default=0 value=0
           preferred_stride 0x009a206e (int)    : min=0 max=65535 step=1 default=0 value=64
               sensor_modes 0x009a2082 (int)    : min=0 max=30 step=1 default=30 value=30 flags=read-only

Image Processing Controls

               test_pattern 0x009f0903 (menu)   : min=0 max=6 default=0 value=1
                            0: Single color
                            1: Color bars
                            2: Luma ramp
                            3: reserved
                            4: reserved
                            5: Boundary box
                            6: Disable

hello tzvi1,

please also try revising the driver to make it 768x576

Just to make sure I understand:
this is not a resolution supported by the adv7282.
Should I change something in the stride settings or leave it at 64, and just modify the driver to be hard coded at 768?
Can you please explain why this specific width?

it's a hack on the driver side to allocate the surface as 768x576, while the adv7282 still outputs PAL to fill the buffer.
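For reference, here is what a 768x576 surface works out to, assuming UYVY at 2 bytes per pixel (my assumption, based on the format listed above):

```shell
# Surface size for the suggested 768x576 hack, assuming UYVY (2 bytes/pixel).
WIDTH=768; HEIGHT=576; BPP=2
STRIDE=$((WIDTH * BPP))    # 1536 bytes per line, already a 64-byte multiple
echo "stride=${STRIDE} frame_bytes=$((STRIDE * HEIGHT))"
```

That is a 1536-byte line and an 884736-byte frame.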

It does not work; I get only 3-4 frames with GStreamer.
v4l2-ctl does not work at all.

Should I provide any other information?

hello tzvi1,

yes, please gather the kernel logs for checking. You may set up a terminal running $ dmesg --follow to keep monitoring kernel-layer output messages.

Hi Jerry,
I was able to capture the test pattern with v4l2-ctl after modifying some configuration in the adv7282 driver.
GStreamer is still not working properly: it expects 720x576 even though it is aware of the stride, that the line is 768, and that the image size is 884736 (I can see it in GST_DEBUG). However, if I ask it to capture 768x576, it reports an internal stream error. If you have an idea how to fix this, or how I can work around it the way the v4l2-ctl capture does, please share.
My main problem now is that when trying to capture real live video I get only a few frames. Can you help with this? If so, tell me what I need to provide.

For now I am just writing the stream from v4l2-ctl into a FIFO and capturing it from there, which works fine, but I would prefer to handle it directly in GStreamer.
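A sketch of what I mean, in case it helps someone else. The device node, the 768-pixel padded width and the 25 fps rate are assumptions from this setup, and rawvideoparse comes from gst-plugins-bad:

```shell
# Named-pipe workaround: v4l2-ctl captures into a FIFO and GStreamer reads it
# back. The capture part only runs when the (assumed) device node exists.
FIFO=/tmp/adv7282.raw
[ -p "$FIFO" ] || mkfifo "$FIFO"

if [ -e /dev/video2 ]; then
    # capture side: stream raw frames into the pipe
    v4l2-ctl --device /dev/video2 --stream-mmap --stream-to="$FIFO" --stream-count=100 &

    # playback side: parse the raw UYVY stream with the 768-pixel padded width
    gst-launch-1.0 filesrc location="$FIFO" \
        ! rawvideoparse format=uyvy width=768 height=576 framerate=25/1 \
        ! videoconvert ! autovideosink
fi
```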

hello tzvi1,

could you please share your pipeline for examination.

a very simple one, for now I just capture the data to a file:

gst-launch-1.0 v4l2src device=/dev/video1 ! filesink location=test.raw

hello tzvi1,

could you please try the pipeline below, which shows the frame rate only, and share the complete messages for checking.
It should stream 150 frames, which is around 5 seconds for a sensor running at 30 fps.
for instance,
$ gst-launch-1.0 v4l2src device=/dev/video2 num-buffers=150 ! 'video/x-raw, width=1920, height=1080, framerate=30/1, format=UYVY' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v

Why 1920x1080?

Anyway, if I adjust the resolution I get nothing with fpsdisplaysink, and with autovideosink it looks the same as with filesink.

hello tzvi1,

it is just a demonstration of a gst pipeline; please revise the resolution to the one supported by your use case.
Since you're unable to access the stream via v4l2src, may I know what error is reported?
Please also gather the kernel-layer failures for reference: $ dmesg > klogs.txt
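For a PAL source, the same demonstration pipeline would presumably look like this (720x576 at 25 fps is an assumption for this module; revise as needed):

```shell
# the demonstration pipeline above, revised for PAL timing (assumed 720x576 @ 25 fps)
gst-launch-1.0 v4l2src device=/dev/video2 num-buffers=150 \
    ! 'video/x-raw, width=720, height=576, framerate=25/1, format=UYVY' \
    ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v
```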