I am wondering whether a USB3 camera module would consume significantly more system resources than a MIPI camera. We are evaluating some modules and have seen promising ones that communicate over USB3 at ~300fps, which would be ideal. USB cameras give us a little more flexibility for the custom circuitry we are designing around them, but the system load is a worry. What happens if 2/3/4 or more cameras are connected to the same Jetson?
Thank you in advance.
For MIPI cameras, do you use Bayer sensors, which output in Bayer format and use the Xavier ISP engine? Or YUV sensors, which output in YUV422 format?
For USB cameras, if you can use tegra_multimedia_api, it can achieve better performance than gstreamer.
In gstreamer, v4l2src is used, which requires an extra memcpy from CPU buffers to NVMM buffers:
v4l2src ! video/x-raw ! nvvidconv ! video/x-raw(memory:NVMM) ! ...
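The pipeline above can be expanded into a runnable gst-launch-1.0 command. This is a minimal sketch; the device node, resolution, framerate, and sink are assumptions — adjust them for your camera:

```shell
# Capture from a USB camera with v4l2src (CPU buffers), then let
# nvvidconv copy into NVMM (hardware) buffers for the rest of the
# pipeline. /dev/video0, 1280x720 and 30fps are placeholder values.
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  'video/x-raw, width=1280, height=720, framerate=30/1' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM)' ! \
  nvoverlaysink
```

The memcpy happens in the v4l2src → nvvidconv step: v4l2src delivers frames in CPU memory, and nvvidconv copies them into NVMM before the hardware blocks can use them.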
In tegra_multimedia_api, we have the 12_camera_v4l2_cuda sample, which demonstrates capturing directly into NVMM buffers, saving the memcpy.
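A rough sketch of building and running that sample, assuming the default JetPack install path and the sample's usual options (the path and flags are assumptions — check the sample's usage output on your system):

```shell
# Build and run 12_camera_v4l2_cuda. Path and options are assumed
# from a default JetPack install; verify with the sample's README
# and its usage/help output.
cd /usr/src/tegra_multimedia_api/samples/12_camera_v4l2_cuda
make
# -d: V4L2 device node, -s: capture resolution, -f: pixel format
./camera_v4l2_cuda -d /dev/video0 -s 1280x720 -f YUYV
```

Because the sample captures into NVMM buffers, the captured frames can be handed to the hardware converter/encoder without the extra CPU-to-NVMM copy that the gstreamer v4l2src path requires.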