Question about CSI-MIPI genlock and cameras

I want to do stereo vision with a TX2 devkit.
I’d love to use the built-in CSI-MIPI for that, rather than having to go with external USB cameras, or even worse, external digitizers.
If I can get ahold of the right camera modules, I could make an enclosure for them, and characterize them (create calibration files for OpenCV, measure baseline, etc) myself.

However, I need to know some things about the available CSI interface:

  • are there multiple physical CSI ports?
  • do they use CSI flex cable connectors, or something else?
  • how does genlock work in CSI? Can it be driven by the board, or am I at the mercy of the controller chip on the camera module?
  • does anyone know of any global shutter camera modules with CSI-MIPI? CCD sensors, for example?

I don’t need high resolution (VGA is plenty!) but I do need bright, sharp, non-motion-blurred images.
However, I can find approximately zero documentation on how to best go about this :-(

I will appreciate any pointers on how to navigate here. As far as I can tell, no great option exists?

I’m sure some of this is just me not knowing exactly how CSI works.
It seems like I2C is brought out as part of it?
Does every camera respond to the same I2C signals, or is some kind of driver necessary for each brand/model of camera?
I notice there is a master clock; does this also control frame capture (genlock) or is it a free-running clock?

Most current sensors use I2C to initialize the sensor and change settings. I2C is only a control interface for communicating with the camera sensor. MCLK is the sensor's operating clock; its frequency depends on the sensor.

Thank you for the comment! That’s approximately as I thought.
Unfortunately, that doesn’t tell me how genlock (capturing frames in sync across multiple cameras) is supposed to be achieved.

Folks at Leopard have suggested I could use a sensor with global shutter and external trigger, and use the external trigger for both sensors.
This will reduce my frame rate, but that’s actually OK – I can live with 30 frames per second, as long as each frame is captured during a very short interval. The reason I need a short exposure time is to stay reasonably sharp even when moving.
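To put numbers on "short exposure vs. blur": a quick back-of-envelope estimate is blur = image-plane speed × exposure time. The sketch below is illustrative only; the object speed, distance, focal length, and pixel pitch are all assumed values, not specs of any particular module.

```python
# Back-of-envelope motion blur estimate (all numbers are assumptions).
# Image-plane speed = object_speed * focal_length / distance (thin-lens
# approximation); blur in pixels = that speed * exposure / pixel pitch.

def motion_blur_px(object_speed_m_s, distance_m, focal_length_mm,
                   pixel_pitch_um, exposure_s):
    """Blur in pixels for an object moving perpendicular to the optical axis."""
    # Image-plane speed in mm/s: v_img = v_obj * f / Z
    image_speed_mm_s = (object_speed_m_s * 1000.0 * focal_length_mm
                        / (distance_m * 1000.0))
    blur_mm = image_speed_mm_s * exposure_s
    return blur_mm * 1000.0 / pixel_pitch_um  # mm -> um -> pixels

# Example: 2 m/s object at 3 m, 4 mm lens, 6 um pixels (VGA-class sensor)
print(motion_blur_px(2.0, 3.0, 4.0, 6.0, 1e-3))   # 1 ms exposure
print(motion_blur_px(2.0, 3.0, 4.0, 6.0, 1/30))   # full 33 ms frame time
```

With these assumed numbers, a 1 ms exposure keeps blur under half a pixel, while exposing for the whole 30 fps frame period smears motion across many pixels — which is why exposure time matters far more here than frame rate.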

If you are able to get libargus working for your sensors there are code examples for synchronizing multiple cameras. I personally haven’t tried this but am aware of the sample code existing in:

/home/ubuntu/tegra_multimedia_api/argus/samples/syncSensors

You will need two CSI MIPI camera modules developed for and certified for the Jetson TX2 board(s). The electrical engineering to support high speed signaling is non-trivial and a device that works with one SOC may not (in fact, probably will not) work with a different SOC and probably won’t even work with the same SOC on a different board. It’s not nearly as forgiving as USB.

Most camera modules are controlled by I2C, and they offer either (or both) CSI and parallel output. The I2C control is the easy part, and triggering is also easy, either through I2C or through a GPIO on the sensors. However, the I2C settings are generally hard-coded in a driver, so you will need to do kernel driver work if you are working with an SOC/sensor combination that is not already supported and configured for your MIPI bus settings.

In other words … don’t try unless you can do driver work, have a 20 GS/s scope, have experience engineering high-speed differential signal pairs, and have a lot of money for custom board and flex PCB runs.
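For the GPIO-triggering route mentioned above, a minimal user-space sketch might look like the following. The pin number (398) is a placeholder — check your carrier board’s pinmux — and this uses the legacy sysfs GPIO interface that L4T kernels of this era expose:

```python
# Hedged sketch: pulse one GPIO as a shared trigger line for two
# global-shutter sensors wired to the same pin. gpio398 is a placeholder
# pin number, not a real TX2 assignment. Uses the legacy sysfs GPIO API.
import os
import time

GPIO_BASE = "/sys/class/gpio"

def gpio_path(pin, node=""):
    """Build a sysfs path for the given GPIO number."""
    p = os.path.join(GPIO_BASE, "gpio%d" % pin)
    return os.path.join(p, node) if node else p

def trigger_pulse(pin, width_s=0.001):
    """Export the pin if needed, then emit one active-high trigger pulse."""
    if not os.path.isdir(gpio_path(pin)):
        with open(os.path.join(GPIO_BASE, "export"), "w") as f:
            f.write(str(pin))
    with open(gpio_path(pin, "direction"), "w") as f:
        f.write("out")
    with open(gpio_path(pin, "value"), "w") as f:
        f.write("1")
    time.sleep(width_s)          # trigger pulse width
    with open(gpio_path(pin, "value"), "w") as f:
        f.write("0")
```

Because both sensors see the same electrical edge, this gives hardware-level genlock without any per-sensor driver cooperation — assuming, of course, the sensor actually supports external trigger.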

I wonder though … can the MIPI-DSI on the Jetson TX2 devkit be re-configured to support another device identical to the one provided?

Your advice is sound. I do have some of the necessary experience – exotic boards and driver development, for example – but not a 20 Gs/s scope :-)
However, I’m not looking to build my own flex/sensor/lens/adapter assembly; I’m looking to buy something that works.

So far, that mainly points at Leopard Imaging. They have 3-camera and 6-camera adapter boards, and a number of qualified cameras based on Sony sensors.
They are also somewhat pricey, presumably because of the rather specialized, low-volume market they sell into.

Unfortunately, those sensors don’t quite fulfill my requirements – no genlock, rolling shutter, and a 60 Hz max rate seem to be the main problems.
I spoke to a gentleman (Bill Pu) at Leopard who suggested their V034 sensor. It’s not currently listed on their web site as CSI2/MIPI, but it’s supposed to also exist in that version. And it has an external trigger, which makes GPIO-based genlock simple. According to Bill, the driver is “in development.”
(Honestly, I’d almost be willing to suggest I can develop the driver based on whatever sample code and data sheets they have, for free in exchange for a pair of the cameras. This is assuming electrical development is already done ;-)

I can’t find any specifications on the camera adapter board that comes with the TX2 devkit – does it provide access to additional CSI2 board connectors, or do I also need to get a connector board with the cameras? (Such as the kit: https://www.leopardimaging.com/LI-TX1-KIT-IMX185CS-D.html)

The kit is only specified at 30 Hz. The reason I need a high frame rate is really just that I need a short exposure time. I’m not sure I’ll even be able to analyze 30 stereo frames per second, but I want a short exposure time to minimize blur and skew from movement. I haven’t yet gotten very good answers about the exposure time of these units.
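The "skew" part is specific to rolling shutter: the frame is read out line by line, so the bottom rows are captured later than the top rows, and lateral motion during readout shears the image. A rough estimate, with made-up illustrative numbers:

```python
# Rough rolling-shutter skew estimate (numbers are illustrative assumptions).
# The bottom of the frame is captured readout_time later than the top, so
# lateral motion during that window appears as a horizontal shear.

def rolling_shutter_skew_px(readout_time_s, image_speed_px_s):
    """Horizontal shear (pixels) between the first and last row of a frame."""
    return readout_time_s * image_speed_px_s

# Example: ~15 ms full-height readout, object crossing at 500 px/s
print(rolling_shutter_skew_px(0.015, 500))
```

Note that, unlike motion blur, this skew is unaffected by exposure time — only a global shutter (or a faster readout) reduces it, which is why the global-shutter requirement keeps coming up.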

DSI is the display half. CSI2 is the camera part. (That, and GMSL, if you want more, slower, cameras.)

That being said, finding DSI displays “ready to plug” is pretty much impossible.
Anyone trying to do the same for a Raspberry Pi will also have found this problem.
All the TFT LCDs are 40-pin parallel.

(BTW: Good job on the hardware designers matching up the Jetson 40-pin GPIO header to the Raspberry Pi pinout ;-)

Hi snarky,

Please have a look at our 6-camera adapter board for TX1/TX2, which has a built-in synchronization option.

https://www.e-consystems.com/multiple-csi-cameras-for-nvidia-jetson-tx2.asp

The camera can do 60 fps as well, but that requires driver customization because it is programmed to run at 30 fps by default in multiple resolutions. If you send us your requirements (resolution, framerates etc.) we can customize the product based on your requirements. Please contact us at camerasolutions@e-consystems.com

I’ve done some more research.

The answer is that, no, the Jetson camera adapter that comes with the kit is a single camera, wired to the board, with no external CSI adapter.

Various vendors, including Auvidea, Leopard, and e-con, provide boards with some variety of CSI/MIPI input ports. Configurations I’ve found (I’m sure there are more) include:

  • 6xCSI/MIPI 2-lane (e-con, Leopard)
  • 3xCSI/MIPI 4-lane (Leopard)
  • 2xCSI/MIPI 4-lane, 2xCSI/MIPI 2-lane (Auvidea)

There are also other options like GMSL which I won’t worry about.

Additionally, the I2C interface to the camera sensors is not well standardized, and public datasheets are not available, so in addition to the physical board contacts (there are at least two standards, one with coarse-pitch 15-pin connectors and one with fine-pitch 40-pin connectors) we also need a driver.
And, because of said non-public data sheets, vendors typically only provide drivers in binary form, which means you may become stranded when you upgrade the kernel. This is non-good, but seems unavoidable.

It’d be great if the camera setup work could be moved to user-land, and shim through a thin generic I2C interface somehow, but that’s an architecture change I can’t do anything about myself. (Nvidia guys: Hint, Hint :-)
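For what it’s worth, Linux does already expose a generic user-land I2C path through `/dev/i2c-N` and the `I2C_SLAVE` ioctl, so a user-space sensor setup could in principle look like the sketch below. The bus number, slave address, and register values here are placeholders, not real sensor data (which is exactly the NDA problem):

```python
# Hedged sketch of user-land sensor setup over the generic Linux I2C
# character device (/dev/i2c-N). Bus, address, and register table below
# are placeholders -- real values come from the (non-public) datasheet.
import fcntl
import os

I2C_SLAVE = 0x0703  # ioctl number from <linux/i2c-dev.h>

def reg16_write_bytes(reg, value):
    """Pack a 16-bit register address + 8-bit value, big-endian, the
    layout most MIPI sensors expect on the control bus."""
    return bytes([(reg >> 8) & 0xFF, reg & 0xFF, value & 0xFF])

def init_sensor(bus=0, addr=0x36, regs=((0x0103, 0x01), (0x0100, 0x01))):
    """Write an init table to a sensor. Requires real hardware to run."""
    fd = os.open("/dev/i2c-%d" % bus, os.O_RDWR)
    try:
        fcntl.ioctl(fd, I2C_SLAVE, addr)    # select the sensor's address
        for reg, val in regs:
            os.write(fd, reg16_write_bytes(reg, val))
    finally:
        os.close(fd)
```

The mechanism is there; what’s missing is the register tables and the CSI receiver/device-tree configuration, which is the part vendors keep in the binary driver.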

My current hope is that Leopard will finish the drivers for their VGA resolution color CSI cameras with external trigger and global shutter (v034) and make them available for purchase for the Jetson.
If that doesn’t work out, I may go with their 1080p cameras, although they have rolling shutter and need software-sync for triggering. Software-sync seems to be okay-ish, so that may not be so bad.
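One common flavor of software-sync is simply pairing frames from two free-running cameras by nearest capture timestamp, rejecting pairs that are further apart than some tolerance (e.g. half a frame period). A sketch, with made-up timestamps:

```python
# Sketch of "software sync": pair frames from two free-running cameras by
# nearest capture timestamp. Timestamps below are made up for illustration.

def pair_frames(ts_left, ts_right, max_skew_s):
    """Greedy nearest-timestamp matching of two sorted timestamp lists.
    Returns a list of (i_left, i_right) index pairs."""
    pairs, j = [], 0
    for i, t in enumerate(ts_left):
        # Advance the right-hand pointer while the next frame is closer.
        while (j + 1 < len(ts_right)
               and abs(ts_right[j + 1] - t) <= abs(ts_right[j] - t)):
            j += 1
        if j < len(ts_right) and abs(ts_right[j] - t) <= max_skew_s:
            pairs.append((i, j))
    return pairs

# Two 30 fps cameras, right camera offset by ~3 ms:
left  = [0.000, 0.0333, 0.0667, 0.1000]
right = [0.003, 0.0363, 0.0697, 0.1030]
print(pair_frames(left, right, max_skew_s=0.0167))
```

This bounds the stereo time skew at half a frame period rather than eliminating it, which is why it’s only okay-ish: with rolling shutter and motion, a few milliseconds of skew between the two views still shows up as disparity error.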

Hi,

If you sign an NDA with the sensor vendor, you can get the sensor’s datasheet and create your own driver; the driver will work with V4L2, nvcamerasrc, libargus, etc. There are several examples in the kernel of how to create such a driver. RidgeRun also offers services to create a driver for any camera you pick, and we give you the source code. We also provide the driver for the IMX219 (the Raspberry Pi 2 camera) for free so customers can prototype.

Regards,
-David

In a previous life, I did a lot of that! I’m just hoping to be able to stay on the padded side of the syscall interface for now :-)

Speaking of which, I do have a couple of RPi2 cameras lying around – where can I find said driver?
(I’d still have to get an appropriate CSI board.)

Hi Snarky,

Great, send me an email at david.soto@ridgerun.com and I will provide you the tarball and the instructions. We are currently porting the driver to TX2 because several things changed in the device tree, but it works correctly on TX1.

-David