I would like to be able to use any CSI MIPI camera with my TX2.
Currently I am using libargus with the camera that comes with the dev-kit.
However, when I plugged in a Raspberry Pi Camera V1, libargus was not able to work with that camera.
Searching further on the net, I found that for a CSI MIPI camera to work with the TX2 it must have a driver (this information might be incorrect; I would like to verify whether that is actually the case).
So my questions are:
1 - Does a CSI MIPI camera need a driver if we use OpenCV + GStreamer?
2 - Does a CSI MIPI camera need a driver if we use V4L2? If not, how can we use it? Example code?
3 - How can I get libargus to work with any generic CSI MIPI camera?
4 - How can we write a generic CSI MIPI driver that will work for any generic CSI MIPI camera? And does this driver already exist?
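To make questions 1 and 2 concrete: with both OpenCV + GStreamer and plain V4L2, capture still goes through a `/dev/video*` node (or the Argus stack), so a kernel driver is required in either case. A minimal sketch of the two pipeline strings, assuming the default devkit camera driver is present (the element names are the standard JetPack GStreamer elements; the resolutions and frame rates are placeholders):

```python
# Sketch only: both paths below assume a kernel driver has already
# registered the camera (as the Argus stack or as /dev/video0).

def argus_pipeline(width=1920, height=1080, fps=30):
    """GStreamer pipeline using the Argus/ISP path (nvarguscamerasrc).
    This path requires a sensor driver plus device-tree support."""
    return (
        f"nvarguscamerasrc ! "
        f"video/x-raw(memory:NVMM),width={width},height={height},"
        f"framerate={fps}/1,format=NV12 ! "
        f"nvvidconv ! video/x-raw,format=BGRx ! "
        f"videoconvert ! video/x-raw,format=BGR ! appsink"
    )

def v4l2_pipeline(device="/dev/video0", width=1280, height=720):
    """Plain V4L2 path (v4l2src). Bypasses the ISP, but the kernel
    still needs a driver that registers the /dev/video* node."""
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height} ! "
        f"videoconvert ! appsink"
    )

# On the Jetson, either string could be opened with OpenCV, e.g.:
#   import cv2
#   cap = cv2.VideoCapture(argus_pipeline(), cv2.CAP_GSTREAMER)
```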
I am using the JetPack R32.1 release.
As for the use case: we are working on creating a camera test bench using the Jetson TX2.
The way it works: plug any CSI MIPI compatible camera into the Jetson, then have the Jetson TX2 stream some frames from that camera without requiring any driver/firmware updates or changes (that is, we do not install a new/different driver for each camera). We then evaluate the quality of the collected frames and store the results in a database.
Is this possible, considering we are targeting CSI MIPI cameras, which in theory should all be compatible and not need any camera-specific driver?
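For the evaluation step of a test bench like this, one simple frame-quality measure is the variance of a Laplacian (a common focus/sharpness proxy). The metric choice here is an assumption for illustration, not something from this thread; a sketch in plain NumPy:

```python
import numpy as np

def sharpness_score(gray):
    """Variance of a 4-neighbour Laplacian over the image interior.
    `gray` is a 2-D array of pixel intensities; higher = more detail."""
    g = gray.astype(np.float64)
    # 4-neighbour Laplacian, computed on the interior to avoid padding
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return float(lap.var())

# A flat frame scores 0.0; a textured frame scores higher, so frames
# captured from each camera under test can be scored before the results
# are written to the database.
```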
Please refer to the Using Plugin Manager section, which does exactly this: dynamic device registration during kernel initialization.
However, your camera module board must have an onboard EEPROM for the plugin manager to work.
You might also refer to the JetPack 4.2 built-in kernel sources and the device tree, as below:
Any MIPI camera that you plug into the Jetson TX2 will require a driver to configure the camera sensor according to the resolution, pixel format, and MIPI lanes needed. This driver also registers a video device to allow capture from the sensor.
It is also necessary to update the device tree to add the proper hardware configuration for the connections between the Jetson board and the camera: for example, defining the correct bus (I2C, SPI) used to communicate with the sensor, the CSI channels and MIPI lanes, etc.
For your use case, a generic driver could be developed to allow streaming from different cameras. The idea is that this driver only registers the video device but does not configure the cameras. The video device will then be available for getting data from the specified CSI channel(s), but you must configure the camera manually before capturing, for example by running a script that initializes sensor streaming with the desired configuration.
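A sketch of what such a "configure before capture" script might compose, assuming the sensor is programmed over I2C (via `i2cset` from i2c-tools) and frames are grabbed with `v4l2-ctl`. The bus number, sensor address, and register values below are hypothetical placeholders; the real values come from each sensor's datasheet:

```python
# Hypothetical example: the generic driver only exposes /dev/videoN,
# and this script programs the sensor over I2C before capturing.

I2C_BUS = 30          # hypothetical I2C bus the camera sits on
SENSOR_ADDR = 0x36    # hypothetical sensor slave address

def i2cset_cmd(reg, value, bus=I2C_BUS, addr=SENSOR_ADDR):
    """Build an i2cset command line writing one sensor register."""
    return ["i2cset", "-y", str(bus), hex(addr), hex(reg), hex(value)]

def capture_cmd(device="/dev/video0", count=10, out="frames.raw"):
    """Build a v4l2-ctl command line grabbing raw frames."""
    return ["v4l2-ctl", "-d", device,
            "--stream-mmap",
            f"--stream-count={count}",
            f"--stream-to={out}"]

# Hypothetical init sequence (placeholder register): stream-on, then capture.
init_sequence = [i2cset_cmd(0x0100, 0x01)]
# In the real script, each command list would be run with subprocess.run(cmd).
```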
The cameras will need to be compatible with each other: they must all use the same CSI channel, the same number of lanes, and the same pixel format. The generic driver also needs to support the resolutions of all the cameras.
In short, your use case will require implementing a driver according to your needs. You will then need to compile the L4T sources and flash your Jetson TX2 with a new image. You can find information about how to compile the Jetson source code here:
Did you end up implementing the driver mentioned by @EnriqueR that was marked as the preferred solution? I would be interested in your experience since I’m attempting to do something very similar.