Isaac ROS custom camera support

Dear Experts,

Could you please tell me the steps to add a custom camera to the list of cameras supported by Isaac ROS?

To be clear, we design two kinds of stereo cameras: one is an Ethernet-based stereo camera, and the other will be a pair of sensors (Sony IMX296) connected to the Jetson modules via MIPI-CSI2 interfaces.

Thanks in advance and best regards,
Khang Le Truong

Isaac ROS mostly supports the cameras in the Jetson camera ecosystem listed here. I’ve passed along your information to the Jetson embedded ecosystem partners group and will update you when I hear back.

Dear @hemals ,

I found your comment in another discussion about a non-GMSL (i.e., custom) stereo camera, which would be similar to our own Jetson-based stereo camera with a synchronized pair of Sony IMX296 sensors/modules (also similar to the HemiStereo NX - World’s Most Powerful AI-enabled Stereo Vision Camera). We could even use a pair of Onsemi AR0234CS sensors, similar to the LI-AR0234CS-STEREO-GMSL2-30 officially supported by Isaac ROS.

In that comment you said:

libArgus relies on drivers developed by vendors that ship with Jetpack for specific, supported cameras over GMSL2

As a camera maker, we would like to know what specifics stereo camera vendors should add to their drivers so that their cameras can be used with libArgus, and eventually with the Isaac ROS Argus Camera package/module. And by driver, do you mean the ROS/ROS 2 driver?

You might find it more helpful to write your own ROS node that manages the two streams and publishes the image pairs yourself, rather than going through Argus.

Can you confirm that if we write our own ROS/ROS 2 node that merges the two streams from the individual sensors/modules and publishes the image pairs as stereo frames, we could use our camera as input to the Isaac ROS Argus Camera?

Thanks in advance,
Khang Le Truong

Hi @hemals,

Have you had any update on this? We would expect something similar to what is described in the following discussion: Isaac ROS Argus Camera on Non-GMSL Stereocamera - #3 by hoermandinger

Thanks and best regards,
Khang Le Truong

Just to recap from that conversation for my own understanding: if you have two monocular imagers that are Argus-compatible with your own timing trigger pulse to synchronize their captures, libArgus will still see this as two separate imagers, not one stereo camera.

You can then develop a ROS node that pairs these images up using an approximate time synchronization policy and adds the stereo rectification parameters to CameraInfo, so that the rest of the ROS graph sees this as a stereo camera.
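As a sketch of the pairing step described above: the following is a minimal, self-contained illustration of an approximate-time matching policy, analogous to (but not actually using) the `ApproximateTimeSynchronizer` from the ROS 2 `message_filters` package. The class name, the `slop` value, and the timestamps are invented for the example; a real node would match full image messages and republish each pair together with `CameraInfo` messages carrying the stereo rectification parameters.

```python
from collections import deque

class ApproxTimePairer:
    """Pair left/right timestamps whose difference is within `slop` seconds.

    A simplified stand-in for message_filters' ApproximateTimeSynchronizer;
    it buffers timestamps per stream and drops frames that miss their partner.
    """

    def __init__(self, slop=0.005, queue_size=10):
        self.slop = slop
        self.left = deque(maxlen=queue_size)
        self.right = deque(maxlen=queue_size)
        self.pairs = []  # matched (left_stamp, right_stamp) tuples

    def _try_match(self):
        while self.left and self.right:
            t_l, t_r = self.left[0], self.right[0]
            if abs(t_l - t_r) <= self.slop:
                # Close enough in time: emit as one stereo pair.
                self.pairs.append((self.left.popleft(), self.right.popleft()))
            elif t_l < t_r:
                self.left.popleft()   # left frame has no partner, drop it
            else:
                self.right.popleft()  # right frame has no partner, drop it

    def add_left(self, stamp):
        self.left.append(stamp)
        self._try_match()

    def add_right(self, stamp):
        self.right.append(stamp)
        self._try_match()


# Example: a hardware-triggered pair with ~1 ms jitter, where the right
# frame near t=0.033 was lost; the unmatched left frame is discarded.
pairer = ApproxTimePairer(slop=0.005)
for t in (0.000, 0.033, 0.066):
    pairer.add_left(t)
for t in (0.001, 0.067):
    pairer.add_right(t)
# pairer.pairs -> [(0.000, 0.001), (0.066, 0.067)]
```

In a real ROS 2 node you would instead create two `message_filters.Subscriber` instances (one per Argus image topic) and register a callback on an `ApproximateTimeSynchronizer`; the logic above is what that policy does conceptually.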

Without an evaluation of this stereo camera from the Jetson embedded ecosystem partners group to add the camera to the list of approved Argus stereo cameras, this multi-imager rig would not be recognized as an Argus-compatible stereo camera.