We need to take two synchronized images of small objects. The cameras are 3-5 MP, 55 fps, connected to the NVIDIA Xavier through a USB 3.0 hub. I have seen examples of taking synchronized pictures; however, I need some detailed pointers or examples of how to connect the two cameras (they are FLIR cameras and support synchronization) through GPIO, and which pins to connect to on the NVIDIA Xavier side. I'm using a Xavier dev kit.
you'll need both hardware and software approaches to achieve camera synchronization.
from the hardware side, these cameras should expose a frame-sync pin to align the start-of-frame signaling.
from the software side, there's the syncSensor sample, which duplicates a single capture request across the two camera sensors.
please also check the similar discussion thread, Topic 111355, for more details on dual camera synchronization.
Thank you. Do you have any pointers to the exact Xavier GPIO pins that we need to use? Per the guidance from the camera vendor (quoted below), we need a hardware trigger that provides a 3.3 V or 5 V square-wave TTL signal. So, for the two (secondary) cameras, would we connect Pin 2 and Pin 4 (5 V each) of the NVIDIA Xavier GPIO header to the input voltage pins of the secondary cameras, and also connect the GND pins of the Xavier to the GND pins of the secondary cameras?
“An alternative method of synchronized capture is to have all cameras triggered by an external hardware trigger (for example, a function generator). Any hardware trigger that provides a 3.3 or 5 V square wave TTL signal can trigger the cameras. This application note does not explicitly cover the configuration of an external hardware trigger, but users who want to use an external hardware trigger can act as if the external hardware trigger is the primary camera, and follow the physical layout section mentioned in the article (if a pull-up resistor is implemented, this can be ignored, as external hardware triggers don’t need it).”
please refer to the Configuring the 40-Pin Expansion Header section; you can use the Jetson-IO Python tool to configure the header's pin functions, and the Jetson.GPIO library to control a GPIO pin.
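As a starting point, here is a minimal sketch of generating a square-wave trigger pulse train from a 40-pin header GPIO with the Jetson.GPIO library. The pin number (BOARD pin 7) and the pulse count are assumptions for illustration; adjust them to your wiring, and check the header's logic level against the camera's trigger-input requirement. Note that a software loop like this will have timing jitter; a function generator or hardware PWM gives a more precise trigger.

```python
# Sketch: software-generated trigger pulses for externally triggered cameras.
# Assumptions (not from the thread): BOARD pin 7 as the trigger output,
# 55 Hz rate matching the cameras' frame rate, 50% duty cycle.

FRAME_RATE_HZ = 55.0


def trigger_timing(frame_rate_hz: float, duty: float = 0.5):
    """Return (high_time_s, low_time_s) for one square-wave period."""
    period = 1.0 / frame_rate_hz
    return period * duty, period * (1.0 - duty)


try:
    import time
    import Jetson.GPIO as GPIO  # only available on a Jetson board

    TRIGGER_PIN = 7  # BOARD numbering; hypothetical choice, adjust to your wiring

    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(TRIGGER_PIN, GPIO.OUT, initial=GPIO.LOW)
    high_s, low_s = trigger_timing(FRAME_RATE_HZ)
    try:
        for _ in range(55):  # one second of trigger pulses, for demonstration
            GPIO.output(TRIGGER_PIN, GPIO.HIGH)
            time.sleep(high_s)
            GPIO.output(TRIGGER_PIN, GPIO.LOW)
            time.sleep(low_s)
    finally:
        GPIO.cleanup()
except ImportError:
    pass  # not running on a Jetson; timing helper above is still usable
```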
With the FLIR USB cameras, syncSensor is not working. I can see the two cameras in the usb-devices output, and I am able to use the vendor's tools (C++ samples or SpinView) to acquire images.
Executing Argus Sample: argus_syncsensor
Argus Version: 0.97.3 (multi-process)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/syncSensor/main.cpp, execute:347 Must have at least 2 sensors available
please note that the Argus samples only support Bayer sensors.
you may also refer to the Camera Architecture Stack; USB cameras can only be captured through v4l2src.
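To illustrate the v4l2src path, here is a sketch that builds a GStreamer pipeline string for a USB camera. The device path and resolution are assumptions (check `ls /dev/video*`), and this only applies if the FLIR cameras enumerate as standard UVC devices; if they only speak the Spinnaker protocol, capture must go through the Spinnaker SDK instead.

```python
# Sketch: capturing a USB (UVC) camera through v4l2src with GStreamer.
# The device path "/dev/video0" and the caps below are assumptions.

def v4l2_pipeline(device: str, width: int = 1280, height: int = 720,
                  fps: int = 55) -> str:
    """Build a gst-launch-style pipeline string for a UVC camera."""
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "videoconvert ! appsink"
    )


# The same pipeline can be tested from a shell (with a display sink):
#   gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink
# or handed to OpenCV, assuming OpenCV was built with GStreamer support:
#   import cv2
#   cap = cv2.VideoCapture(v4l2_pipeline("/dev/video0"), cv2.CAP_GSTREAMER)

print(v4l2_pipeline("/dev/video0"))
```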