Excess sensor frames on Xavier and Xavier NX

Hello,

We have developed several V4L drivers for Sony and Omnivision sensors. The supported platforms are: Jetson Nano A02, Nano B01, TX2, Xavier, Xavier NX. On non-Xavier platforms we don’t have the problem described below.

On Xavier (L4T 32.3.1) and Xavier NX (L4T 32.4.3) we observe strange behavior with some sensors.
In some cases the V4L driver (v4l2-ctl) and Argus (GStreamer nvarguscamerasrc) receive more frames per second than the sensor actually outputs. We are sure the sensor does not output extra frames, because we count the frames with a hardware signal.
As a result, good frames (exactly the number of frames the sensor output) are mixed with bad frames, which appear to be either old frames from the FIFO or broken frames (two frames mixed into one image).

The problem seems to appear when we save frames to a RAW file (V4L) or store an MP4 file via GStreamer (Argus). Live image display on the Xavier monitor does not show the problem.
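
For reference, these are roughly the commands we use to reproduce it. Treat them as a sketch: the device node, pixel format, resolution, and frame count below are illustrative and depend on the sensor mode.

v4l2-ctl -d /dev/video0 \
    --set-fmt-video=width=1440,height=1080,pixelformat=RG10 \
    --stream-mmap --stream-count=300 --stream-to=frames.raw    # V4L path: dump raw frames to a file

gst-launch-1.0 -e nvarguscamerasrc num-buffers=300 ! \
    'video/x-raw(memory:NVMM),width=1440,height=1080' ! \
    nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4    # Argus path: record an MP4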

Any suggestions on how to fix this problem?

Greetings,
Plamen

Suggest boosting the system to maximum performance and trying again.

sudo nvpmodel -m x    # check /etc/nvpmodel.conf to find which mode gives the best performance
sudo jetson_clocks

Hi ShaneCCC,

Boosting the system performance seems to make the situation worse.
The number of frames per second increases, which means more bad frames are received.

Any other ideas?

How about boosting the clocks below:

sudo su
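# lock the VI, ISP and NVCSI clocks, then pin each one to its maximum rate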
echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate

Hi ShaneCCC,

We may have found a fix for Xavier NX; at least the first several tests seem to work OK.

The tested sensor is an IMX273 at 1440x1080; frame rates (Sony data):
8-bit: 276.0 frames/s, 10-bit: 226.5 frames/s, 12-bit: 165.9 frames/s

Some pixel clock calculations:

  pix_clk_hz = 1440 * 1080 * 276 * 1 = 429235200    (8-bits/pixel)
  pix_clk_hz = 1440 * 1080 * 226 * 2 = 702950400    (16-bits/pixel)
  pix_clk_hz = 1440 * 1080 * 276 * 2 = 858470400    (16-bits/pixel)

BTW: I’m not quite sure how to calculate the correct pixel clock for the device tree.

The problem seems to be fixed when we use a device-tree file with:

  pix_clk_hz = "1100000000";
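
For context, this is roughly where the property sits in our sensor mode node. Apart from pix_clk_hz, active_w and active_h, the node name and remaining properties are placeholders, not our actual device tree:

mode0 {
    active_w = "1440";
    active_h = "1080";
    /* ... other mode properties ... */
    pix_clk_hz = "1100000000";
};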

Greetings,
Plamen

Check this document for the calculation.
Also, you can add serdes_pix_clk_hz and set it much bigger than the pix_clk_hz that you calculate.

https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%2520Linux%2520Driver%2520Package%2520Development%2520Guide%2Fcamera_sensor_prog.html%23wwpID0E0M50HA
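
A rough sketch of what that could look like in the same mode node; the 1200000000 value is only an illustrative number larger than pix_clk_hz, not a recommendation:

mode0 {
    /* ... */
    pix_clk_hz = "1100000000";
    serdes_pix_clk_hz = "1200000000";    /* set larger than pix_clk_hz, as suggested above */
};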