Orin Nano & IMX477: how to set a framerate below the specified range?

Hi there,

For our use case in our RPA, we require 3x IMX477 cameras to be run through a single USB2.0 hub before interfacing with our companion computer (Orin Nano 8GB).

We are using the official B0278 UVC USB adapter board between the camera and the USB hub.

Currently, we are unable to initialise 3x GStreamer pipelines at full resolution (4032x3040) at the lowest “allowed” framerate of 10 FPS; a lack of USB 2.0 bandwidth was identified as the cause. Running 2x GStreamer pipelines over one USB 2.0 connection and the third pipeline over a separate connection works without issue at 10 FPS, but this is not suitable as we require that second USB 2.0 connection for another device.
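
As a rough illustration of the bandwidth problem (all constants below are assumptions, not measured values): the uvcvideo driver reserves isochronous bandwidth from the device's advertised worst-case frame size rather than the average MJPEG size, so the reservation scales with the requested framerate.

```python
# Rough model of UVC isochronous bandwidth reservation. MAX_FRAME_MB is an
# assumed worst-case MJPEG frame size at 4032x3040; the real value comes from
# the device's dwMaxVideoFrameSize descriptor. USB2_BUDGET_MBPS is an assumed
# practical payload throughput for USB 2.0 high speed.
USB2_BUDGET_MBPS = 35.0
MAX_FRAME_MB = 2.0

def reserved_mbps(fps, max_frame_mb=MAX_FRAME_MB):
    """Bandwidth reserved for one stream, in MB/s, at the given framerate."""
    return fps * max_frame_mb

for fps in (10, 1):
    total = 3 * reserved_mbps(fps)
    fits = total <= USB2_BUDGET_MBPS
    print(f"3 streams @ {fps:>2} FPS reserve ~{total:.0f} MB/s -> fits on one bus: {fits}")
```

Under these assumed numbers, three 10 FPS streams over-subscribe the bus while three 1 FPS streams fit comfortably, which matches the behaviour described above.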

As we do not require high framerates, we could easily get away with a 1 FPS pipeline for all cameras to save bandwidth and allow all cameras to run through a single hub. However, when setting the framerate below 10 FPS we encounter an error, as the requested rate falls outside the driver’s discrete resolution/framerate combinations.
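
For reference, the discrete modes the driver will accept can be enumerated with `v4l2-ctl` (from the `v4l-utils` package); `/dev/video0` here is an assumption, substitute each camera's node:

```shell
# List every pixel format, resolution, and discrete frame interval the
# UVC driver advertises for this device (device node is an assumption).
v4l2-ctl --device=/dev/video0 --list-formats-ext
```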

Our script is as follows:

import time

import cv2
import argparse

def get_args():
    parser = argparse.ArgumentParser(description="Get video feed from camera")
    parser.add_argument("--width", "-W", type=int, default=4056, help="Width of the frame")
    parser.add_argument("--height", "-H", type=int, default=3040, help="Height of the frame")
    return parser.parse_args()

def get_gstreamer_camera(camera):
    # Caps must match one of the adapter's advertised discrete modes,
    # hence the fixed 4032x3040 @ 10/1 MJPEG here.
    return (
        f"v4l2src device=/dev/video{camera} ! "
        "image/jpeg,format=MJPG,width=4032,height=3040,framerate=10/1 ! "
        "nvv4l2decoder mjpeg=1 ! nvvidconv ! video/x-raw,format=BGRx ! "
        "appsink drop=1"
    )

def get_camera(camera, args):
    # cv2.CAP_GSTREAMER tells OpenCV to treat the string as a GStreamer pipeline
    cap = cv2.VideoCapture(get_gstreamer_camera(camera), cv2.CAP_GSTREAMER)
    return cap

def main():
    args = get_args()
    cameras = [get_camera(c, args) for c in [0, 2, 4]]

    while True:
        for index, cap in enumerate(cameras):
            start_time = time.time()
            print(f"Capturing frame {index} at {start_time}")
            ret, frame = cap.read()
            if not ret:
                print(f"cap fail for {index}")
                continue
            cv2.imwrite(f"test_{index}.jpg", frame)
            end_time = time.time()
            print(f"Time taken to capture frame {index}: {end_time - start_time}")


if __name__ == "__main__":
    main()
In short, what method should we use, and how do we go about enabling a custom framerate below 10 FPS? We are happy to explore any avenue to enable this feature.

Many thanks.

hello oliver74,

per my understanding, the IMX477 is a bayer camera sensor that uses CSI; you cannot feed that to USB directly.
may I have more details about your modification?

Hiya, thanks for the reply.

We are using the Arducam B0278 UVC USB adapter board to convert MIPI-CSI to USB2.0.

I hope this helps.

hello oliver74,

you may give it a try to stream via nvarguscamerasrc directly.
I’ve tested locally; it can specify a lower frame-rate through the pipeline.
for instance,
here’s a sample pipeline.
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! 'video/x-raw(memory:NVMM),width=3840, height=2160, framerate=5/1, format=NV12' ! nvvidconv ! fpsdisplaysink text-overlay=0 name=sink_0 video-sink=fakesink sync=0 -v

the test result shows it’s running with the 4K@60 FPS sensor mode but outputting at 5 FPS.
for example,

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 0 
   Output Stream W = 3840 H = 2160 
   seconds to Run    = 0 
   Frame Rate = 59.999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0/GstFakeSink:fakesink0: sync = false
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 4, dropped: 0, current: 6.82, average: 6.82
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 7, dropped: 0, current: 4.99, average: 5.90
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 10, dropped: 0, current: 5.03, average: 5.61
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 13, dropped: 0, current: 5.02, average: 5.46
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 16, dropped: 0, current: 5.00, average: 5.37
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 19, dropped: 0, current: 5.00, average: 5.30
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 22, dropped: 0, current: 5.01, average: 5.26
/GstPipeline:pipeline0/GstFPSDisplaySink:sink_0: last-message = rendered: 25, dropped: 0, current: 4.98, average: 5.23

Hi Jerry,

Thanks for the suggestion!

Unfortunately, while this does output frames at 5 FPS, I believe it still initialises the pipeline at the full bandwidth of the 4K60 sensor mode, which is too high for more than two concurrent streams over USB 2.0, so the third pipeline fails to initialise, citing a lack of bandwidth.

Another issue is that this limits our frame output to 8 MP (3840x2160), while we require the full 12 MP from the sensor.

Adjusting the output stream to 4056x3040 seems simply to upscale the 8 MP stream to the 12 MP width and height, so we gain no additional sensor information.

A bit stumped now!

hello oliver74,

another approach is to lower the output frame-rate on the sensor driver side.
since it’ll also change the sensor timing, please check with the sensor vendor for suggested register modifications.

Hi Jerry,

We ended up reaching out to the vendor about building a custom driver with a lower framerate; unfortunately, the NRE they quoted was outside our budget.

The exploration continues!

hello oliver74,

you may give it a try to record the stream as low frame-rate content.
BTW, you should use software encoding on the Orin Nano, since it does not have the NVENC engine.