Supporting two video modes

@JerryChang, hi,

I did not understand your question; please elaborate.
Is the driver required to support the TEGRA_CAMERA_CID_SENSOR_MODE_ID ioctl?

hello igal.kroyter,

please check below.
(1) you’ll need an implementation that defines all available sensor modes; I think you have already completed this.
(2) please confirm that all sensor modes also show up with v4l2 commands, $ v4l2-ctl -d /dev/video0 --list-formats-ext
(3) by default, v4l2 chooses the sensor mode by resolution; there’s a CID control to specify the mode index from the command line.

@JerryChang,
(1) OK
(2) This sentence is a puzzle for me. As you can see in https://forums.developer.nvidia.com/t/supporting-two-video-modes/169626/13, only the format (UYVY) is depicted, but nothing about the standard (NTSC or PAL), while something like the following:

Size: Discrete 1280x720
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
is not displayed when I run the --list-formats-ext option. I guess this is one of the problems.
(3) A driver like ov5647.c implements the TEGRA_CAMERA_CID_SENSOR_MODE_ID ctrl while my driver doesn’t. Should I implement it?

hello igal.kroyter,

you’ll only need to add TEGRA_CAMERA_CID_SENSOR_MODE_ID to the general CID list.
for example,
$L4T_Sources/r32.5/Linux_for_Tegra/source/public/kernel/nvidia/drivers/media/i2c/imx185.c

static const u32 ctrl_cid_list[] = {
        TEGRA_CAMERA_CID_GAIN,
        TEGRA_CAMERA_CID_EXPOSURE,
...
        TEGRA_CAMERA_CID_SENSOR_MODE_ID,

after that,
set the device tree property, use_sensor_mode_id, to enable mode index selection.
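For reference, a sketch of what that property looks like in the sensor’s device-tree node; the i2c path and node name here are illustrative and must match your own DT. Note the driver reads it as a string, so the value is "true", not a boolean:

```dts
/* illustrative fragment -- adjust bus path and node name to your DT */
i2c@3180000 {
    adv7180_a@21 {
        /* read via of_property_read_string(), hence a quoted string */
        use_sensor_mode_id = "true";
    };
};
```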
you may also check below kernel sources,
for example,
$L4T_Sources/r32.5/Linux_for_Tegra/source/public/kernel/nvidia/drivers/media/platform/tegra/camera/camera_common.c

int camera_common_parse_general_properties(struct device *dev, struct camera_common_data *s_data)
{
        err = of_property_read_string(np, "use_sensor_mode_id", &str);
...

int camera_common_try_fmt(struct v4l2_subdev *sd, struct v4l2_mbus_framefmt *mf)
{
...

        if (s_data->use_sensor_mode_id &&
...
                s_data->mode = frmfmt[s_data->sensor_mode_id].mode;
                s_data->mode_prop_idx = s_data->sensor_mode_id;

hello igal.kroyter,

we may need to dig into why sensor modes are not displayed when running --list-formats-ext.
could you please reduce the available modes to keep only a single NTSC mode in the driver? thanks

@JerryChang, hi,

I have modified the driver to include an enumeration function, and now I have the standards depicted in the formats (though one of the formats, UYVY, is duplicated). How do I solve it?

v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'UYVY'
Name : UYVY 4:2:2
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Index : 1
Type : Video Capture
Pixel Format: 'NV16'
Name : Y/CbCr 4:2:2
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Index : 2
Type : Video Capture
Pixel Format: 'UYVY'
Name : UYVY 4:2:2
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Index : 3
Type : Video Capture
Pixel Format: 'RGB3' (emulated)
Name : RGB3
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Index : 4
Type : Video Capture
Pixel Format: 'BGR3' (emulated)
Name : BGR3
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Index : 5
Type : Video Capture
Pixel Format: 'YU12' (emulated)
Name : YU12
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Index : 6
Type : Video Capture
Pixel Format: 'YV12' (emulated)
Name : YV12
        Size: Discrete 720x576
                Interval: Discrete 0.040s (25.000 fps)
        Size: Discrete 720x480
                Interval: Discrete 0.033s (30.000 fps)

Moreover, I have noticed that when I provide the correct frame rate for NTSC, which is 30000/1001 (29.97), there is a failure during encoding:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY,width=720,height=480,framerate=(fraction)30000/1001' ! xvimagesink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:01.381554880
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

while when I set the frame rate to 30 I get the following error:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY,width=720,height=480,framerate=(fraction)30' ! xvimagesink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
libv4l2: error turning on stream: Connection timed out
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not read from resource.
Additional debug info:
gstv4l2bufferpool.c(1054): gst_v4l2_buffer_pool_poll (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
poll error 1: Connection timed out (110)
Execution ended after 0:00:01.102283008
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Another thing I have noticed is that when I try to change the mode (from user space) with the following snippet:

struct v4l2_ext_controls ctrls;
struct v4l2_ext_control ctrl;

memset(&ctrls, 0, sizeof(ctrls));
memset(&ctrl, 0, sizeof(ctrl));

ctrls.ctrl_class = V4L2_CTRL_ID2CLASS(TEGRA_CAMERA_CID_SENSOR_MODE_ID);
ctrls.count = 1;
ctrls.controls = &ctrl;

ctrl.id = TEGRA_CAMERA_CID_SENSOR_MODE_ID;
ctrl.value = mode;

ret = ioctl(fd, VIDIOC_S_EXT_CTRLS, &ctrls);

The value that the driver gets does not change, so I am not sure the mode is changing; note that the adv7180_s_ctrl function is invoked only when the value of mode in user space changes (0 to 1 or 1 to 0):

[ 45.691993] *********************adv7180_s_power 1
[ 45.697808] *********************adv7180_set_power 1
[ 45.708273] *********************adv7180_s_ctrl is_new 1
[ 45.713734] *********************adv7180_s_ctrl has_changed 1
[ 45.719579] *********************adv7180_s_ctrl is_private 0
[ 45.725354] *********************adv7180_s_ctrl is_string 0
[ 45.731041] *********************adv7180_s_ctrl is_ptr 0
[ 45.736566] *********************adv7180_s_ctrl is_array 0
[ 45.742178] *********************adv7180_s_ctrl has_volatiles 0
[ 45.748317] *********************adv7180_s_ctrl call_notify 0
[ 45.754101] *********************adv7180_s_ctrl manual_mode_value 0
[ 45.760387] *********************adv7180_s_ctrl name Sensor Mode
[ 45.766434] *********************adv7180_s_ctrl flags 0x20
[ 45.771946] *********************adv7180_s_ctrl val 0
[ 45.777006] *********************adv7180_s_ctrl cur.val 0
[ 45.782429] *********************adv7180_s_ctrl 10100744
[ 45.787765] *********************TEGRA_CAMERA_CID_SENSOR_MODE_ID 0

Moreover, I do not see the dv_timings changing. I am not sure if this is important, as the chip itself is not using it.

Bottom line: at the moment, PAL works, NTSC does not.

any suggestions?

Hi,
Since it shows 30 fps in v4l2-ctl --list-formats-ext, you have to set framerate=30/1 in the gstreamer pipeline. But you mention the source actually generates frames at 29.97 fps. You may check whether you see 29.97 in the fpsdisplaysink prints:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY,width=720,height=480,framerate=(fraction)30/1' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v

@DaneLLL, hi,

there is no data flow, thus no "window" is opened to display any video. Following is the terminal's dump:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY,width=720,height=480,framerate=(fraction)30/1' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
Setting pipeline to PAUSED …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src:
caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src:
caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0:
caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink:
caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink:
caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink:
caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not read from resource.
Additional debug info:
gstv4l2bufferpool.c(1054): gst_v4l2_buffer_pool_poll (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
poll error 1: Invalid argument (22)

Execution ended after 0:00:02.637198642
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Any suggestions?

Hi,
It looks like the sensor driver and device tree are not ready. Make sure you can run the mode with the v4l2-ctl command first, and then with the gstreamer command.

@DaneLLL hi,

as I wrote above, I think the value of the mode does not reach the driver; whether I use my own code or v4l2-ctl, the behaviour is the same (see Supporting two video modes - #27 by igal.kroyter). Could you help me resolve it?

Hi,
Not sure, but I think gstreamer does not support a fractional frame rate like 29.97. So if your sensor outputs at this framerate, you would need to advertise another value such as 29 fps or 30 fps in v4l2-ctl --list-formats-ext. And if you can capture frames through the v4l2-ctl command, it should work fine in the gstreamer command.

@DaneLLL hi,

  1. I am trying to change the mode (an index into an array of formats, 0 or 1). When I send this value to the driver, either by v4l2-ctl or ioctl, I always receive the value 0. Thus I think that the mode never changes. Could you help debug this issue? It has nothing to do with the frame rate (at the moment).
  2. Regarding your note: NTSC is 29.97, so how do others handle this issue (I could not find anyone else on the forum having issues with NTSC)? BTW, I already provide the value 30, see the previous conversations.

Hi,

I was reading an incorrect field through the input pointer in the driver; now that I read the correct field, the mode value reaching the driver is correct.

Though, once I provide PAL video to the A2D and try to capture with mode=0 (PAL) or mode=1 (NTSC), the mode makes no difference: with both options the command
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY,width=720,height=480,framerate=(fraction)30/1' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
prints out 25 fps.
While, when the A2D is provided with an NTSC video, the "Could not read from resource" message appears consistently for both modes.

What am I doing wrong?

@DaneLLL, hi,

it looks like when I start tracing the video encoding (which delays the CPU operation), gstreamer succeeds in sampling a single frame. Following are some terminal and trace dumps:

sudo cat /sys/kernel/debug/tracing/trace
tracer: nop

entries-in-buffer/entries-written: 246/246 #P:6

                          _-----=> irqs-off
                         / _----=> need-resched
                        | / _---=> hardirq/softirq
                        || / _--=> preempt-depth
                        ||| /     delay
       TASK-PID   CPU#  ||||    TIMESTAMP  FUNCTION
          | |       |   ||||       |         |
 kworker/0:2-165   [000] ...1   104.865610: rtos_queue_peek_from_isr_failed: tstamp:4074112986 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   104.865615: rtcpu_start: tstamp:4074113816
 kworker/0:2-165   [000] ...1   105.022312: rtos_queue_peek_from_isr_failed: tstamp:4079113837 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   105.182300: rtos_queue_peek_from_isr_failed: tstamp:4084114317 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   105.338299: rtos_queue_peek_from_isr_failed: tstamp:4089114801 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   105.498299: rtos_queue_peek_from_isr_failed: tstamp:4094115293 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   105.658296: rtos_queue_peek_from_isr_failed: tstamp:4099115787 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   105.818296: rtos_queue_peek_from_isr_failed: tstamp:4104116280 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   105.978302: rtos_queue_peek_from_isr_failed: tstamp:4109116769 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   106.082312: rtcpu_vinotify_handle_msg: tstamp:4111820416 tag:CHANSEL_PXL_SOF channel:0x00 frame:2 vi_tstamp:4111819935 data:0x00000001
 kworker/0:2-165   [000] ...1   106.082315: rtcpu_vinotify_handle_msg: tstamp:4111820576 tag:ATOMP_FS channel:0x00 frame:2 vi_tstamp:4111819965 data:0x00000000
 kworker/0:2-165   [000] ...1   106.082316: rtcpu_vinotify_handle_msg: tstamp:4112261504 tag:CHANSEL_LOAD_FRAMED channel:0x01 frame:2 vi_tstamp:4112261121 data:0x08000000
 kworker/0:2-165   [000] ...1   106.082316: rtcpu_vinotify_handle_msg: tstamp:4112296986 tag:CHANSEL_PXL_EOF channel:0x00 frame:2 vi_tstamp:4112296436 data:0x01df0002
 kworker/0:2-165   [000] ...1   106.082317: rtcpu_vinotify_handle_msg: tstamp:4112297092 tag:CHANSEL_FAULT channel:0x00 frame:2 vi_tstamp:4112296605 data:0x01e00040
 kworker/0:2-165   [000] ...1   106.082318: rtcpu_vinotify_handle_msg: tstamp:4112297778 tag:CHANSEL_LOAD_FRAMED channel:0x01 frame:2 vi_tstamp:4112297203 data:0x08000000
 kworker/0:2-165   [000] ...1   106.082318: rtcpu_vinotify_handle_msg: tstamp:4112297878 tag:CHANSEL_FAULT_FE channel:0x01 frame:2 vi_tstamp:4112297207 data:0x00000001
 kworker/0:2-165   [000] ...1   106.082319: rtcpu_vinotify_handle_msg: tstamp:4112298009 tag:ATOMP_FE channel:0x00 frame:2 vi_tstamp:4112297210 data:0x00000000
 kworker/0:2-165   [000] ...1   106.134341: rtos_queue_peek_from_isr_failed: tstamp:4114117282 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   106.290445: rtos_queue_peek_from_isr_failed: tstamp:4119117762 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   106.446335: rtos_queue_peek_from_isr_failed: tstamp:4124118257 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   106.602303: rtos_queue_peek_from_isr_failed: tstamp:4129118742 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   106.758362: rtos_queue_peek_from_isr_failed: tstamp:4134119235 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   106.914332: rtos_queue_peek_from_isr_failed: tstamp:4139119733 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   107.070365: rtos_queue_peek_from_isr_failed: tstamp:4144120223 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   107.226422: rtos_queue_peek_from_isr_failed: tstamp:4149120716 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   107.434350: rtos_queue_peek_from_isr_failed: tstamp:4154121207 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   107.590360: rtos_queue_peek_from_isr_failed: tstamp:4159121700 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   107.746353: rtos_queue_peek_from_isr_failed: tstamp:4164122196 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   107.902366: rtos_queue_peek_from_isr_failed: tstamp:4169122688 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.058352: rtos_queue_peek_from_isr_failed: tstamp:4174123184 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.214361: rtos_queue_peek_from_isr_failed: tstamp:4179123676 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.370354: rtos_queue_peek_from_isr_failed: tstamp:4184124172 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.526354: rtos_queue_peek_from_isr_failed: tstamp:4189124660 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.682356: rtos_queue_peek_from_isr_failed: tstamp:4194125157 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.838348: rtos_queue_peek_from_isr_failed: tstamp:4199125654 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   108.994349: rtos_queue_peek_from_isr_failed: tstamp:4204126154 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   109.099642: rtos_queue_peek_from_isr_failed: tstamp:4206630481 queue:0x0b4a3c58
 kworker/0:2-165   [000] ...1   115.103258: rtos_queue_peek_from_isr_failed: tstamp:4352473027 queue:0x0b4a3c58

[ 115.237944] video4linux video0: vi_notify_wait In
[ 115.242657] video4linux video0: vi_notify_wait Out
[ 115.247452] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d08c00]
[ 115.256711] video4linux video0: vi_notify_wait In
[ 116.258377] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 116.264815] video4linux video0: vi_notify_wait Out
[ 116.269771] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d0a000]
[ 116.279093] video4linux video0: vi_notify_wait In
[ 117.254363] video4linux video0: MW_ACK_DONE syncpoint time out! err -11 0 5
[ 117.261393] video4linux video0: tegra_channel_release_frame: vi4 got EOF syncpt buf[ffffffc194d08c00]
[ 117.270822] video4linux video0: restart_version 6 5
[ 117.285436] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 117.291914] video4linux video0: vi_notify_wait Out
[ 117.296877] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d08000]
[ 117.306417] video4linux video0: vi_notify_wait In
[ 117.306434] video4linux video0: restart_version 6 5
[ 118.306345] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 118.312777] video4linux video0: vi_notify_wait Out
[ 118.317661] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d08c00]
[ 118.326996] video4linux video0: vi_notify_wait In
[ 119.330367] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 119.336797] video4linux video0: vi_notify_wait Out
[ 119.341738] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d0a000]
[ 120.326368] video4linux video0: MW_ACK_DONE syncpoint time out! err -11 0 8
[ 120.333429] video4linux video0: tegra_channel_release_frame: vi4 got EOF syncpt buf[ffffffc194d08c00]
[ 120.342805] video4linux video0: restart_version 7 6
[ 120.343201] video4linux video0: vi_notify_wait In
[ 121.342346] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 121.348789] video4linux video0: vi_notify_wait Out
[ 121.353748] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d08000]
[ 121.363067] video4linux video0: vi_notify_wait In
[ 122.366359] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 122.372771] video4linux video0: vi_notify_wait Out
[ 122.377654] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d08c00]
[ 123.362360] video4linux video0: MW_ACK_DONE syncpoint time out! err -11 0 10
[ 123.369499] video4linux video0: tegra_channel_release_frame: vi4 got EOF syncpt buf[ffffffc194d08000]
[ 123.378866] video4linux video0: restart_version 8 7
[ 123.379141] video4linux video0: vi_notify_wait In
[ 124.378337] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 124.384756] video4linux video0: vi_notify_wait Out
[ 124.389723] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d0a000]
[ 124.399065] video4linux video0: vi_notify_wait In
[ 125.402318] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 125.408797] video4linux video0: vi_notify_wait Out
[ 125.413672] video4linux video0: tegra_channel_capture_frame: vi4 got SOF syncpt buf[ffffffc194d08000]
[ 126.398315] video4linux video0: MW_ACK_DONE syncpoint time out! err -11 0 12
[ 126.405424] video4linux video0: tegra_channel_release_frame: vi4 got EOF syncpt buf[ffffffc194d0a000]
[ 126.415208] video4linux video0: vi_notify_wait In
[ 126.417431] video4linux video0: restart_version 9 8
[ 127.422312] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11
[ 127.428752] video4linux video0: vi_notify_wait Out

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY,width=720,height=480,framerate=(fraction)30/1' ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0 -v
Setting pipeline to PAUSED …
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw,\ format=(string)UYVY,\ width=(int)720,\ height=(int)480,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ colorimetry=(string)bt601,\ framerate=(fraction)30/1"
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 4, dropped: 0, current: 1.30, average: 1.30
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 7, dropped: 0, current: 0.97, average: 1.14
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 10, dropped: 0, current: 0.98, average: 1.08
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 13, dropped: 0, current: 0.98, average: 1.06
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 16, dropped: 0, current: 0.98, average: 1.04
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 19, dropped: 0, current: 0.98, average: 1.03
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 22, dropped: 0, current: 0.97, average: 1.02
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 25, dropped: 0, current: 0.98, average: 1.02
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 28, dropped: 0, current: 0.98, average: 1.01
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 31, dropped: 0, current: 0.98, average: 1.01
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 34, dropped: 0, current: 0.98, average: 1.01
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 37, dropped: 0, current: 0.98, average: 1.00
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 40, dropped: 0, current: 0.98, average: 1.00
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 43, dropped: 0, current: 0.98, average: 1.00
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 46, dropped: 0, current: 0.98, average: 1.00
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 49, dropped: 0, current: 0.98, average: 1.00
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 52, dropped: 0, current: 0.98, average: 1.00
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:57.209587471
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

While when I remove the tracing, gstreamer does not sample even a single frame. Is it possible that some extra delays would help with the encoding?

Any suggestions?

Hi,
Please check if you can run v4l2-ctl command like:
gst-launch-1.0 capture from 4 cameras simultaneously success, but v4l2-ctl failed - #5 by zmaqidong

Usually if the v4l2-ctl command works and format, width, height, and framerate are set exactly in the gstreamer pipeline, it should run fine. We have the steps in
Jetson Nano FAQ
Q: I have a USB camera. How can I launch it on Jetson Nano?

The v4l2src plugin is a native gstreamer plugin, and generally we try integer framerates. We don’t have much experience running a framerate like framerate=30000/1001. Per your comment this case should work fine; my apologies that we didn’t give precise information about this.

@DaneLLL, hi,

if you meant running the following line:
v4l2-ctl --set-fmt-video=width=1280,height=1069,pixelformat=UYVY --stream-mmap --set-ctrl bypass_mode=0 --stream-count=1000 -d /dev/video0
then it also fails.

  1. Could you help me resolve the issue of the two formats (https://forums.developer.nvidia.com/t/supporting-two-video-modes/169626/27)?
  2. Are you saying that the TX2 cannot support NTSC video input?

Hi,
You would need to modify width, height, and pixelformat according to your source in the v4l2-ctl command, to check whether the sensor driver is ready for the 720x576 and 720x480 modes.