We are trying to develop a new system using the AGX Xavier or Xavier NX that has 8 or 10 cameras, all streaming at once. These are relatively low resolution, so the aggregate throughput should come in well under the 30 Gbps limit. Our understanding is that we will need to use virtual channels to make this work, since the Xavier devices provide only 6 physical CSI connections. We have been trying to find references that specify exactly what needs to be done to implement this architecture, but the best we could find is that we should use the IMX390 device tree as a guide. That looks very promising, but we have some lingering questions:
Is it possible to use virtual channels to interleave data streams from multiple cameras into the MIPI CSI ports on a Xavier, and have the stock NVIDIA Xavier L4T device trees and drivers deinterleave that data into separate camera streams? Or do special camera drivers need to be written to enable this functionality?
Is there documentation that describes how to use the virtual channel feature of the Xavier in L4T, specifically how to set up the device tree and drivers? As far as we could find, it is not covered in the Sensor Programming Guide in the current L4T R32.3.1 documentation.
Is there anything that needs to be done in hardware to make this work, such as frame-synchronizing the cameras? Is there any document that describes how the hardware would need to be set up to support virtual channels? Or is it as simple as connecting two cameras to a multiplexer and that's it?
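For reference, this is the kind of device-tree fragment we have been studying: a sketch modeled on our reading of the IMX390 example (`tegra194-camera-imx390-a00.dtsi`) shipped with L4T. The property names (`serdes-csi-link`, `vc-id`, etc.) come from that file, but the node names and values below are illustrative, not a tested configuration:

```dts
/* Sketch based on the IMX390 reference device tree. Two sensors behind
 * one deserializer share a single CSI brick and are distinguished only
 * by vc-id. Regulators, mode tables, etc. are elided. */
imx390_a@1b {
    /* ... sensor modes, clocks, regulators ... */
    gmsl-link {
        src-csi-port = "b";
        dst-csi-port = "a";
        serdes-csi-link = "a";      /* both cameras land on SerDes link "a" */
        csi-mode = "1x4";
        st-vc = <0>;
        vc-id = <0>;                /* virtual channel 0 */
        num-lanes = <2>;
        streams = "ued-u1", "raw12";
    };
};

imx390_b@1c {
    /* ... sensor modes, clocks, regulators ... */
    gmsl-link {
        src-csi-port = "b";
        dst-csi-port = "a";
        serdes-csi-link = "a";      /* same link as the first camera */
        csi-mode = "1x4";
        st-vc = <0>;
        vc-id = <1>;                /* virtual channel 1 on the same brick */
        num-lanes = <2>;
        streams = "ued-u1", "raw12";
    };
};
```

Is this `vc-id` property the whole story on the software side, or is there additional driver plumbing required beyond what the stock IMX390 driver provides?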
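To make the bandwidth claim in the opening paragraph concrete, here is the back-of-the-envelope arithmetic we did. The resolution, frame rate, and bit depth below are assumptions for illustration (our actual sensor figures are similar), and the 20% overhead factor is a rough allowance for blanking and CSI-2 protocol framing:

```python
# Back-of-the-envelope aggregate CSI bandwidth estimate.
# Assumed figures: 10 cameras, 1280x720 @ 30 fps, RAW12 (12 bits/pixel).
def csi_bandwidth_gbps(num_cams, width, height, fps, bits_per_pixel,
                       overhead=1.2):
    """Aggregate bandwidth in Gbps, with ~20% blanking/protocol overhead."""
    raw_bits_per_sec = num_cams * width * height * fps * bits_per_pixel
    return raw_bits_per_sec * overhead / 1e9

total = csi_bandwidth_gbps(10, 1280, 720, 30, 12)
print(f"{total:.2f} Gbps")  # prints "3.98 Gbps" -- well under 30 Gbps
```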