I’m currently working on a project with the NVIDIA Jetson Nano and am aiming to achieve high-frame-rate capture of 2000 fps using a CSI camera. My goal is to directly read the sensor data, store the frames in RAM (specifically in grayscale), and capture a short burst of 0.5 seconds, resulting in 1000 frames. Additionally, I would like to do the same with a second CSI camera for stereo imagery.
I have a few questions and would greatly appreciate any insights or experiences you might have:
- Feasibility: Has anyone successfully achieved such high frame rates with a CSI camera on the Jetson Nano? If so, could you share some details about your setup and approach? (I have seen that some people were able to do something similar on a Raspberry Pi.)
- Camera Recommendations: Are there specific CSI camera models or brands known to support 2000 fps at resolutions between 480p and 720p?
- Storing Frames in RAM: I’m looking to bypass storage limitations by directly storing the captured frames in RAM. Has anyone implemented this using Direct Memory Access (DMA) or any other method on the Jetson Nano?
- Accessing Raw Sensor Data: I want to access the raw sensor data directly from the camera. Could anyone provide guidance or resources on how to achieve this? Are there specific drivers or software tools that can assist with this?
I understand that achieving 2000 fps at good resolutions is quite challenging, especially considering the data throughput and processing requirements. However, any guidance, experiences, or resources you can share would be immensely helpful.
Thank you in advance for your assistance and looking forward to your insights!
Best regards, Felix
This doesn’t look supported; we’ve never tested frame rates above 120 fps. I did see a use case with 240 fps, but I cannot find that topic right now…
In any case, the data rate at 2000 fps would also be way beyond the bandwidth limitation.
You can follow the steps below to evaluate the CSI data rate; let’s assume 640x480, 10-bit, 2000 fps.
CSI data rate = 640 * 480 * 10 * 2000 * 1.15 (15% overhead) = 7,065,600,000 ≈ 7 Gbps
According to the [Jetson Nano Product Design Guide], each data lane has a peak bandwidth of up to 1.5 Gbps, so this is already above the maximum CSI bandwidth of a 4-lane configuration (i.e. 1.5 Gbps * 4 = 6 Gbps).
This would also be challenging even with a lower sensor resolution to reduce the bandwidth: such a high frame rate means each buffer transfer must complete within 1/2000 s = 0.5 ms.
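The arithmetic above can be condensed into a quick sanity check (the 1.5 Gbps per-lane figure is the one quoted from the Product Design Guide):

```python
# Quick sanity check of the CSI bandwidth estimate above.
width, height = 640, 480
bits_per_pixel = 10
fps = 2000
overhead = 1.15  # ~15% CSI protocol overhead

data_rate_bps = width * height * bits_per_pixel * fps * overhead
# ≈ 7.07e9 bits/s, i.e. ~7 Gbps required

lane_gbps = 1.5                      # per-lane peak on Jetson Nano
max_4lane_bps = lane_gbps * 4 * 1e9  # 6 Gbps across all 4 lanes

frame_budget_ms = 1000 / fps         # 0.5 ms to transfer each frame

print(f"required: {data_rate_bps / 1e9:.2f} Gbps, "
      f"available: {max_4lane_bps / 1e9:.1f} Gbps, "
      f"per-frame budget: {frame_budget_ms} ms")
```

The required rate exceeds the 4-lane maximum, so the configuration is infeasible regardless of software.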
Thanks for the helpful response.
My approach to this problem would now be the following:
I would try to use two OV9281 sensors with an external trigger and alternate between them, achieving nearly 1000 fps. This might be enough for my purposes. However, I also couldn’t find any tutorial on using this camera with an external trigger on the Jetson Nano.
Also, do you think it is possible to switch between them quickly enough to achieve that frame rate?
I don’t quite understand this.
Could you please give an example, or a more detailed description of your assumption?
Let me try to rephrase my goal. Some people were able to produce a camera output of 450 fps with this camera (CAM-MIPIOV9281-V2) on a Raspberry Pi. This camera uses an external trigger to achieve that frame rate.
Using the GPIO interface of the Jetson Nano I would do the same, and as there are drivers available for this camera, I don’t see any problem. The tricky part will be managing two cameras at the same time. By synchronising them via the external trigger, I could let camera 1 shoot, then camera 2, achieving nearly 1000 fps.
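Roughly, the interleaving I have in mind looks like this (just a timing sketch; the 500 fps per-camera rate is my assumption, and on real hardware each event would pulse a trigger pin, e.g. via the Jetson.GPIO library — no GPIO is touched here):

```python
# Timing sketch only: model alternating external triggers for two
# cameras, each running at 500 fps (assumed), to get an effective
# 1000 fps combined stream.

CAM_FPS = 500            # per-camera trigger rate (assumption)
PERIOD = 1.0 / CAM_FPS   # 2 ms between triggers on one camera

def trigger_schedule(n_frames):
    """Return (time_s, camera_id) pairs for an interleaved burst."""
    events = []
    for i in range(n_frames):
        cam = i % 2                                # alternate 0, 1, 0, 1...
        t = (i // 2) * PERIOD + cam * PERIOD / 2   # camera 1 offset by 1 ms
        events.append((t, cam))
    return events

events = trigger_schedule(8)
deltas = [b[0] - a[0] for a, b in zip(events, events[1:])]
# every gap between consecutive frames (across both cameras) is 1 ms
```

Whether the sensors and the CSI receiver can actually sustain this alternation is exactly what I’m unsure about.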
Do you think this is possible?
I cannot give you a solid answer since we’ve never tested such a high frame rate locally.
Usually the FSYNC (frame-sync) hardware pin is used to connect the two camera modules and achieve hardware-level synchronization.
There’s a software example at Argus/samples/syncSensor; please also check Topic 1070823 for reference.