Which Jetson Nano-supported 5M camera allows access to its frames during the ISP period?

Hi
The camera I am testing only provides access to its video frames after the ISP, which adds 4 frames (140 ms) of latency at full size (2592x1944). That makes it very difficult to achieve the target auto-focus time of less than 500 ms with full-size video.
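For reference, this is roughly how I capture the post-ISP frames today. It is only a sketch: OpenCV built with GStreamer support is assumed, and the device node and caps are placeholders for whatever my module actually outputs.

# Sketch of my current post-ISP capture path (OpenCV with GStreamer support
# assumed; device node and caps are placeholders for my module's real output).
import cv2

pipeline = (
    "v4l2src device=/dev/video0 ! "
    "video/x-raw, format=UYVY, width=2592, height=1944 ! "
    "videoconvert ! video/x-raw, format=BGR ! "
    "appsink drop=true max-buffers=2"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
ok, frame = cap.read()   # the frame I get here is already about 4 frames old
print("frame shape:", frame.shape if ok else "capture failed")
cap.release()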
Could you please suggest a Jetson Nano-supported 5M camera that can provide the image frame as early as possible, rather than only after the ISP?
Thanks

hello changwen.xie,

you may access the L4T sources package and check Nano’s default supported camera, the Raspberry Pi camera v2 (IMX219).
for example,
$L4T_Sources/r32.4.3/Linux_for_Tegra/source/public/hardware/nvidia/platform/t210/porg/kernel-dts/porg-platforms/tegra210-camera-rbpcv2-imx219.dtsi

may I know your use-case and also your criteria? for example, please share your expected capture latency.

FYI,
there’s a list of cameras supported by Jetson Camera Partners on the Jetson platform,
please also contact the Jetson Preferred Partners for camera solution support.
thanks

Hi JerryChang
Thank you very much for your kind reply.
I don’t know where to get the L4T sources package you mentioned.

My application requires the camera to have:

  1. Jetson Nano support (V4L2, GStreamer, etc.).
  2. Resolution: 5M (2592x1944) or higher.
  3. Video frame rate of 28 to 30 fps.
  4. The camera’s auto-white-balance, auto-exposure, and auto-gain-control features are NOT needed, but these settings should be manually controllable from the Jetson Nano (see the sketch after this list).
  5. The ideal camera latency is 1 frame (at a frame rate of 28 to 30 fps); 2 frames of latency is acceptable.
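For requirement 4, this is the kind of manual control I have in mind, sketched with v4l2-ctl called from Python. The control names and values below are placeholders; the real names would come from running v4l2-ctl -d /dev/video0 -l against the actual driver.

# Sketch of manual exposure/gain control (requirement 4).  Control names
# are driver-specific placeholders; check `v4l2-ctl -d /dev/video0 -l`.
import subprocess

DEV = "/dev/video0"   # placeholder device node

def set_ctrl(name, value):
    # v4l2-ctl ships with the v4l-utils package on L4T
    subprocess.run(["v4l2-ctl", "-d", DEV, f"--set-ctrl={name}={value}"], check=True)

set_ctrl("exposure", 3000)   # placeholder control name and value
set_ctrl("gain", 16)         # placeholder control name and value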

hello changwen.xie,

please download the L4T sources package and extract it to your local machine; you’ll find the source code there for reference.

hello changwen.xie,

may I have more details about the latency?

the rough pipeline is shown below. did you mean the latency from (a) to (b)?

Sensor → CSI → VI → (a) SW issue frame capture → Buffering → (b) Rendering to display
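for reference, you could roughly time the (a)-to-(b) portion on the host side with something like the sketch below. this only bounds the software side; it does not include sensor exposure or CSI/VI time, and the capture source is a placeholder for whatever pipeline you already use.

# rough host-side timing of (a) SW frame capture and (b) rendering
import time
import cv2

cap = cv2.VideoCapture(0)          # placeholder; use your actual capture pipeline
for _ in range(30):
    t_issue = time.monotonic()     # (a) software issues the frame capture
    ok, frame = cap.read()         # returns once a buffered frame is dequeued
    t_buffer = time.monotonic()
    if not ok:
        break
    cv2.imshow("preview", frame)   # (b) hand the frame to the display path
    cv2.waitKey(1)
    t_render = time.monotonic()
    print("dequeue %.1f ms, render %.1f ms"
          % (1e3 * (t_buffer - t_issue), 1e3 * (t_render - t_buffer)))
cap.release()
cv2.destroyAllWindows()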

Hi JerryChang,
The latency for the e-CAM50 camera at resolution 2592x1944 is 140 ms +/- 8 ms (4 frames). This number is from e-con Systems support; I don’t know how it was measured, but it is close to my own test result (4 to 5 frames).
My test method is as follows (a rough code sketch follows the list):

  1. Set the liquid lens control value to make the camera go out of focus.
  2. After 1 second, set the liquid lens control value (through I2C) to its in-focus value.
  3. Immediately capture 10 frames from the camera and store them in pre-allocated memory.
  4. After the 10 frames are captured, save them from memory to a video file.
  5. Manually check the saved video file frame by frame to find which frame is in focus.
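A rough sketch of that procedure is below. The I2C bus number, lens address, register, and the two control values are placeholders for my actual liquid-lens setup, and smbus2/OpenCV are simply what I happen to use.

# Sketch of the focus-latency test above.  Bus/address/register and the two
# lens control values are placeholders for my real liquid-lens parameters.
import time
import cv2
from smbus2 import SMBus

LENS_BUS, LENS_ADDR, LENS_REG = 1, 0x3C, 0x00   # placeholder I2C parameters
DEFOCUS_VAL, FOCUS_VAL = 0x10, 0x80             # placeholder lens control values

cap = cv2.VideoCapture(0)   # placeholder; my real capture pipeline goes here
frames = []
with SMBus(LENS_BUS) as bus:
    bus.write_byte_data(LENS_ADDR, LENS_REG, DEFOCUS_VAL)   # step 1: defocus
    time.sleep(1.0)                                         # step 2: wait 1 s,
    bus.write_byte_data(LENS_ADDR, LENS_REG, FOCUS_VAL)     #         then refocus
    for _ in range(10):                                     # step 3: grab 10 frames
        ok, frame = cap.read()
        if ok:
            frames.append(frame)

h, w = frames[0].shape[:2]                                  # step 4: write them out
out = cv2.VideoWriter("focus_test.avi",
                      cv2.VideoWriter_fourcc(*"MJPG"), 30.0, (w, h))
for f in frames:
    out.write(f)
out.release()
cap.release()
# step 5: open focus_test.avi and check frame by frame which one is in focus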

Based on my test, the 5th frame is changing and the 6th frame is in focus. I also tested (using another method) that the liquid lens response time is about 1 frame.
Therefore, the latency here is not (a) to (b); it is from the sensor to buffering.
Thanks
