Low-latency camera and software drivers?

[u]What approach is recommended to achieve low camera latency with the Jetson Nano?[/u]
For vision-based robot driving I would like the camera latency to approach 10 ms.

I used the examples at Jetson Nano + Raspberry Pi Camera - JetsonHacks, which are based on GStreamer and the IMX219 camera.

The latency I measured is in the 100-200 ms range (photographing a stopwatch next to a window showing the recorded stopwatch).

I am under the assumption that CSI cameras can reach better latency than USB cameras. As a reference, I know that a Raspberry Pi based vision system using an OV5647 or similar CSI camera (https://limelightvision.io) approaches 25 ms total latency (including OpenCV-based object recognition).

I could not identify options in gst-inspect-1.0 nvarguscamerasrc that lower the latency significantly.

I tested with:

import cv2

cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)

def gstreamer_pipeline(
    capture_width=1920, capture_height=1080,
    display_width=1280, display_height=720,
    framerate=30, exposure_time=5,  # ms
    flip_method=0):

    exposure_time = exposure_time * 1000000  # ms to ns
    exp_time_str = '"' + str(exposure_time) + ' ' + str(exposure_time) + '"'

    return (
        'nvarguscamerasrc '
        'name="NanoCam" '
        'do-timestamp=true '
        'timeout=0 '                               # 0 - 2147483647
        'blocksize=-1 '                            # block size in bytes
        'num-buffers=-1 '                          # -1..2147483647 (-1=unlimited) num buffers before sending EOS
        'sensor-mode=-1 '                          # -1..255; IMX219: 0(3264x2464,21fps), 1(3264x1848,28), 2(1080p,30), 3(720p,60), 4(720p,120)
        'tnr-strength=-1 '                         # -1..1
        'tnr-mode=1 '                              # 0,1,2
#        'ee-mode=0 '                              # 0,1,2
#        'ee-strength=-1 '                         # -1..1
        'aeantibanding=1 '                         # 0..3: off, auto, 50 Hz, 60 Hz
        'bufapi-version=false '                    # new buffer API
        'maxperf=true '                            # max performance
        'silent=true '                             # suppress verbose output
        'saturation=1 '                            # 0..2
        'wbmode=1 '                                # white balance mode, 0..9, 0=off 1=auto
        'awblock=false '                           # auto white balance lock
        'aelock=true '                             # auto exposure lock
        'exposurecompensation=0 '                  # -2..2
        'exposuretimerange=%s '                    # e.g. "13000 683709000"
        'gainrange="1.0 10.625" '                  # "1.0 10.625"
        'ispdigitalgainrange="1 8" '               # "1 8"
        #
        '! video/x-raw(memory:NVMM), '
        'width=(int)%d, '
        'height=(int)%d, '
        'format=(string)NV12, '
        'framerate=(fraction)%d/1 '
        #
        '! nvvidconv flip-method=%d '
        '! video/x-raw, width=(int)%d, height=(int)%d, format=(string)BGRx '
        #
        '! videoconvert '
        '! video/x-raw, format=(string)BGR '
        #
        '! appsink'
        % (
            exp_time_str,
            capture_width,
            capture_height,
            framerate,
            flip_method,
            display_width,
            display_height,
        )
    )
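The appsink end can also add latency if the Python side reads more slowly than the camera delivers, since frames then queue up in the sink. Below is a sketch of the stopwatch-comparison loop (continuing the snippet above) together with a variant of the pipeline tail that limits that queueing; drop, max-buffers and sync are standard GStreamer appsink/basesink properties, but treat this as an untuned sketch, not a measured configuration.

# Variant of the pipeline tail that limits buffering in the appsink:
# drop stale frames, keep at most one queued, don't sync to the clock.
#   '! appsink drop=true max-buffers=1 sync=false'

# Stopwatch-test loop: read frames and display them immediately.
cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow("NanoCam", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()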

@utzinger
Sorry to tell you, the current best latency with the ISP is about 4 frames. You may have to select a high frame rate sensor mode, like the 120 fps mode, to get better performance.
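To put rough numbers on that: assuming the delay really is about 4 frames of buffering, it scales with the frame period, which is why the higher-rate sensor modes help. A back-of-the-envelope check (approximation only, not a measured figure):

# Approximate ISP delay for ~4 frames of buffering at the IMX219 sensor modes
for fps in (21, 30, 60, 120):
    print(f"{fps:>3} fps -> ~{4 * 1000 / fps:.0f} ms")
#  21 fps -> ~190 ms
#  30 fps -> ~133 ms
#  60 fps -> ~67 ms
# 120 fps -> ~33 ms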

@ShaneCCC

We are also facing a similar problem. Is there a way to disable the ISP to get single-frame latency?

Thanks

There’s no support to bypass the ISP for the Argus interface. But v4l2 (v4l2src) is an interface that excludes the ISP; if it is a Bayer sensor, you need to debayer in software yourself.
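For reference, a minimal sketch of that software-debayer step, assuming the raw 10-bit RG10 Bayer frame has already been captured through the V4L2 path and saved to a file; the file name, resolution and Bayer pattern below are placeholders for illustration, not values from this thread.

import cv2
import numpy as np

# Hypothetical raw frame captured without the ISP (file name and size are placeholders)
raw10 = np.fromfile("frame.raw", dtype=np.uint16).reshape(1080, 1920)
raw8 = (raw10 >> 2).astype(np.uint8)             # 10-bit -> 8-bit Bayer
bgr = cv2.cvtColor(raw8, cv2.COLOR_BayerRG2BGR)  # CPU debayer, no ISP involved
cv2.imwrite("frame.png", bgr)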

Hello,
I have a few follow-up questions on the 4-frame ISP lag.

  1. Is this still true as of today (August 2021)?
  2. Is it only related to the Jetson Nano, but not the TX2 or Xavier?
  3. Is it a hardware limitation?
  4. If I set the framerate to 1 fps, does it mean the latency will be 4 seconds?

Thanks

  1. Yes.
  2. All Jetson devices.
  3. It's software related.
  4. Theoretically yes, but there may be a timeout problem with very low frame rate output.

Regarding 3: is it in libargus? The source code for that is not available, I guess.
Regarding 4: is this a bug, or what is the limitation here?

For the lowest latency possible, would you recommend sticking to v4l2? Sorry, I found the answer above.

I’m currently doing a little research on what hardware I need. The idea is to take an Arducam OG02B10 and <jetson-or-rpi?> and FPV stream within the local WiFi.

Thanks for the quick reply.