I’m currently trying to build optimized GStreamer pipelines on both the Jetson TX2 and the Jetson Nano (in order to use the CSI camera directly). However, I get confused whenever I look at the flowchart below, and I can’t find an explanation that is satisfying enough.
One way to think of the architecture is that there are two paths for getting video data from the hardware to your application. One path is the typical Linux video path, v4l2. The other path is Jetson-specific and processes data through the ISP. USB cameras often support v4l2, although other software interfaces exist (e.g. USB3 Vision). CSI cameras on Jetson let you pass the data either through v4l2 or through the ISP (argus) path. The ISP performs tasks such as debayering, color correction, and more, so for CSI cameras it is very beneficial to use the ISP path. In GStreamer terms, the two paths correspond to different source elements, as the sketch below shows.
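A minimal sketch of the two paths, assuming Python with the PyGObject GStreamer bindings, a USB camera at /dev/video0 for the v4l2 path, and a CSI sensor at sensor-id=0 for the argus path (the device index, sensor-id, and caps here are all assumptions; adjust them for your hardware):

```python
#!/usr/bin/env python3
# Sketch: the two capture paths expressed as GStreamer source elements.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Path 1: plain V4L2 (typical for USB cameras; no ISP involvement).
# /dev/video0 and the caps are assumptions for illustration.
v4l2_pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,width=1280,height=720 ! videoconvert ! fakesink"
)

# Path 2: ISP (argus) path for a CSI sensor; debayering and color
# correction happen in hardware before the frames reach GStreamer.
argus_pipeline = Gst.parse_launch(
    "nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "fakesink"
)

# Run each path briefly to confirm it negotiates on your board.
for pipeline in (v4l2_pipeline, argus_pipeline):
    pipeline.set_state(Gst.State.PLAYING)
    pipeline.get_bus().timed_pop_filtered(2 * Gst.SECOND, Gst.MessageType.ERROR)
    pipeline.set_state(Gst.State.NULL)
```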
At the application level you can process the data however you like; GStreamer is very popular. If your video data passes through the ISP path, you can use nvarguscamerasrc to easily get it into a GStreamer application.
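A minimal sketch of such an application, again assuming a CSI sensor at sensor-id=0 (resolution and framerate are placeholders); nvvidconv copies the frames out of NVMM device memory so that appsink can read them on the CPU:

```python
#!/usr/bin/env python3
# Sketch: ISP-processed frames delivered to application code via appsink.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    "nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "nvvidconv ! video/x-raw,format=BGRx ! "
    "appsink name=sink emit-signals=true max-buffers=1 drop=true"
)

def on_sample(sink):
    # Each sample holds one ISP-processed frame, now in CPU memory.
    sample = sink.emit("pull-sample")
    buf = sample.get_buffer()
    print("frame: %d bytes, pts=%s" % (buf.get_size(), buf.pts))
    return Gst.FlowReturn.OK

pipeline.get_by_name("sink").connect("new-sample", on_sample)
pipeline.set_state(Gst.State.PLAYING)
try:
    GLib.MainLoop().run()
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)
```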
Thanks for the explanation, but I wonder how the ISP is utilized in the flow you described. Is it used through [camera core] → [tegra drivers] → [ISP]?
Hi Doruk898,
The flow of frames from the sensor to user space, when processed through the ISP, is as follows:
[Sensor] → [VI-Bypass] → [Tegra Drivers] → [ISP] → [Tegra Drivers] → [Camera Core] → [libArgus] → [Application].
Note that the internal blocks in this flow are NVIDIA proprietary and are enclosed within the shared libraries. In practice this means the ISP’s processing is reachable only through the controls those libraries expose, as sketched below.
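For example, from GStreamer the ISP stages are controlled through the properties that nvarguscamerasrc exposes on top of libArgus. A sketch, assuming property names from recent JetPack releases and placeholder values (verify both on your L4T version with gst-inspect-1.0 nvarguscamerasrc):

```python
#!/usr/bin/env python3
# Sketch: tuning ISP processing via nvarguscamerasrc properties.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "nvarguscamerasrc name=src sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "fakesink"
)

src = pipeline.get_by_name("src")
src.set_property("wbmode", 1)        # auto white balance (an ISP stage)
src.set_property("saturation", 1.2)  # ISP color processing
src.set_property("exposuretimerange", "34000 358733000")  # ns, "min max"
src.set_property("gainrange", "1 8")                      # "min max"

# Run briefly; frames delivered downstream reflect the ISP settings.
pipeline.set_state(Gst.State.PLAYING)
pipeline.get_bus().timed_pop_filtered(3 * Gst.SECOND, Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```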