I am using a CSI IMX219-based camera and I have been doing some tests with the ISAAC argus_camera sample on a Jetson Nano board. I have read that the camera should have its intrinsic parameters stored in its EEPROM and that the argus-based driver should load them.
I have two questions:
Is there an “easy” way (using v4l2-ctl, nvgstcapture or ISAAC Sight) to check what parameters are being used?
I have done my own calibration. Can I use my own intrinsic parameters with ISAAC? For instance, by combining isaac.ArgusCsiCamera, isaac.perception.ImageWarp and isaac.message_generators.CameraIntrinsicsGenerator, synchronizing them on the output of the ArgusCsiCamera?
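For reference, the wiring I have in mind looks roughly like the fragment below. This is only a sketch: the component types, the channel names ("image", "intrinsics", "input_image", "input_intrinsics") and the CameraIntrinsicsGenerator parameter names are my assumptions based on the component names above, so they may differ in your ISAAC release, and the numeric values are placeholders rather than a real calibration.

```json
{
  "graph": {
    "nodes": [
      {
        "name": "camera",
        "components": [
          { "name": "message_ledger", "type": "isaac::alice::MessageLedger" },
          { "name": "isaac.ArgusCsiCamera", "type": "isaac::ArgusCsiCamera" }
        ]
      },
      {
        "name": "intrinsics",
        "components": [
          { "name": "message_ledger", "type": "isaac::alice::MessageLedger" },
          { "name": "isaac.message_generators.CameraIntrinsicsGenerator",
            "type": "isaac::message_generators::CameraIntrinsicsGenerator" }
        ]
      },
      {
        "name": "warp",
        "components": [
          { "name": "message_ledger", "type": "isaac::alice::MessageLedger" },
          { "name": "isaac.perception.ImageWarp", "type": "isaac::perception::ImageWarp" }
        ]
      }
    ],
    "edges": [
      { "source": "camera/isaac.ArgusCsiCamera/image",
        "target": "intrinsics/isaac.message_generators.CameraIntrinsicsGenerator/image" },
      { "source": "camera/isaac.ArgusCsiCamera/image",
        "target": "warp/isaac.perception.ImageWarp/input_image" },
      { "source": "intrinsics/isaac.message_generators.CameraIntrinsicsGenerator/intrinsics",
        "target": "warp/isaac.perception.ImageWarp/input_intrinsics" }
    ]
  },
  "config": {
    "intrinsics": {
      "isaac.message_generators.CameraIntrinsicsGenerator": {
        "focal_length": [700.0, 700.0],
        "optical_center": [360.0, 640.0]
      }
    }
  }
}
```

The idea is that every camera image triggers the generator, so the downstream warp always receives an intrinsics message with the same acquisition time as the image.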
Regarding my second question, I have just finished adapting the “april_tags_python” ISAAC sample to use the CSI camera through argus. I also calibrated the camera and published the camera parameters using the CameraIntrinsicsGenerator, triggered by the argus camera image. According to ISAAC Sight, I get around 13.5 FPS for the AprilTag detection on a Jetson Nano with the camera running at 1280x720@60 FPS.
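In case it helps anyone following along: what the intrinsics message carries is just the pinhole parameters from calibration (focal lengths and principal point). A minimal, ISAAC-independent sketch of what they mean; the numeric values are hypothetical, not my actual IMX219 calibration:

```python
# Pinhole camera model: project a 3-D point in the camera frame to pixels.
# Hypothetical intrinsics for a 1280x720 image (NOT my real calibration).
fx, fy = 700.0, 700.0   # focal lengths in pixels
cx, cy = 640.0, 360.0   # principal point in pixels

def project(x, y, z):
    """Project a camera-frame point (metres, z forward) to pixel coordinates."""
    return (fx * x / z + cx, fy * y / z + cy)

# A point 10 cm to the right of the optical axis, 1 m away:
u, v = project(0.1, 0.0, 1.0)
print(u, v)  # -> 710.0 360.0
```

These four numbers (plus distortion coefficients, if any) are exactly what a checkerboard calibration produces and what the generator republishes.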
I am still wondering, does anyone know the answer to the first question?
Thanks for the reply. I changed the code of argus_syncstereo to provide output from a single camera, but the “if (iSyncSensorCalibrationData)” check returns false: I guess that the camera I am using does not have any intrinsic parameters stored in its EEPROM.
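For anyone hitting the same check, this is the fragment I mean. It is not a complete application: the interface and the cast are taken from the syncstereo sample, and any further getters should be verified against the Ext::SyncSensorCalibrationData header shipped with your L4T release.

```cpp
// Query the per-module calibration data that argus reads from the camera EEPROM.
// cameraDevice comes from ICameraProvider::getCameraDevices(), as in the sample.
Argus::Ext::ISyncSensorCalibrationData* iSyncSensorCalibrationData =
    Argus::interface_cast<Argus::Ext::ISyncSensorCalibrationData>(cameraDevice);
if (!iSyncSensorCalibrationData)
{
    // No calibration data in EEPROM (my case with this IMX219 module):
    // fall back to externally supplied intrinsics, e.g. my own calibration
    // published through the CameraIntrinsicsGenerator.
    printf("Sensor exposes no EEPROM calibration data\n");
}
else
{
    // EEPROM calibration is present; read and print the intrinsics here
    // using the getters declared in the Ext::SyncSensorCalibrationData header.
}
```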
Although one can create such an application from this sample, it would be nice to have a sample application included that simply prints this information for a given camera.