I have a single sensor that can simultaneously output multiple image streams over a MIPI CSI-2 interface using Virtual Channels.
Is it possible to connect that sensor directly to a Jetson, without using GMSL or any SerDes, and have each Virtual Channel recognized as a separate device (for example, /dev/video0, /dev/video1, and so on)?
Note that I can already connect this sensor directly to the Jetson and capture a single image stream.
In the Developer Guide, the chapter "Jetson Virtual Channel with GMSL Camera Framework" gives an example that uses GMSL, but there is no description of a configuration that does not use GMSL.
Please let me know if this is technically possible.
I would also like to ask one more question on this matter, just to be sure.
Multiple frame streams from one sensor module are output over a single MIPI CSI-2 link using Virtual Channels.
Is there any problem if the data size (resolution), data type, and frame rate differ between these streams?
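To make the first question concrete, here is a rough sketch of the kind of device tree fragment I have in mind, modeled loosely on the IMX390 GMSL example in the Developer Guide. The node layout, property names (vc_id, active_w, active_h), and values are my assumptions for a directly connected sensor; please correct me if a non-GMSL Virtual Channel configuration would look different.

```dts
/* Hypothetical fragment (my assumption, not a verified configuration):
 * one directly connected sensor exposing two Virtual Channels on the
 * same CSI port, with different resolutions and frame rates per VC.
 * Property names follow the IMX390 GMSL example in the Developer Guide;
 * all values are placeholders. */
sensor@1a {
    mode0 {
        /* Stream on Virtual Channel 0, e.g. 1920x1080 @ 30 fps */
        vc_id    = "0";
        active_w = "1920";
        active_h = "1080";
    };
    mode1 {
        /* Stream on Virtual Channel 1, e.g. 1280x720 @ 60 fps */
        vc_id    = "1";
        active_w = "1280";
        active_h = "720";
    };
};
```

My expectation is that each vc_id would surface as its own /dev/videoN node, analogous to the GMSL case; I would like to confirm whether that holds without a SerDes in the path.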