I am using a Xavier NX to capture from 6 MIPI cameras (2616 x 1964 @ 30 fps, synchronized) for image stitching.
I can get correct capture results from 2 cameras by modifying the Argus/samples/syncSensor sample.
When I increase the number of cameras by modifying the sample, the stock system library only supports up to 4 cameras. After updating the Argus libs (see the topic "Xavier NX use argus sample syncSensor capture 6 camera error"), 6 cameras can be supported, but with either 4 or 6 cameras all of the output files contain the same image, from camera 0.
Developer kit version: JetPack 4.6.1 (L4T 32.7.1).
Similar problems: "Xavier AGX Synchronized Capture with 4 Cameras" and "Argus: virtual cameradevice joining two cameras to one buffer".
My questions are:
Is there any error in my modification? main.cpp (17.1 KB)
There are obvious differences between the images captured through multiple CaptureSessions, so how can I synchronize the ISP parameters of one channel to all of the others via Argus?
Please see also the similar topic, Topic 111355, for dual-camera frame synchronization.
You'll need to use the multiple-sensors-per-single-capture-session method: a single capture request is duplicated to all of the camera sensors, so they share one set of ISP settings.
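As a minimal sketch (names follow the sample; the six-device count and the omitted error checks are assumptions), that means creating one session from all devices and one request for all of them:

std::vector<CameraDevice*> allCameras;
for (uint32_t i = 0; i < 6; i++)
    allCameras.push_back(cameraDevices[i]); // AC/ISP settings follow the 1st device
// One capture session for all sensors; each request submitted to it is
// duplicated to every sensor in the list.
UniqueObj<CaptureSession> captureSession(
    iCameraProvider->createCaptureSession(allCameras));
ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
UniqueObj<Request> request(iCaptureSession->createRequest());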
Do you mean the syncSensor sample? When I modify the sample to use 6 cameras, the captured images all come from camera 0.
I ran my modified sample and printed the log. Does that indicate a problem with my program, or with the hardware?
Could you please share the code snippets you use to enable all six cameras?
As you can see, the sample code uses the left camera as the master to create the capture session; there is only a single capture request for the two left/right cameras.
std::vector <CameraDevice*> lrCameras;
lrCameras.push_back(cameraDevices[0]); // Left Camera (the 1st camera will be used for AC)
lrCameras.push_back(cameraDevices[1]); // Right Camera
// Create the capture session, AutoControl will be based on what the 1st device sees.
UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(lrCameras));
ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
...
UniqueObj<Request> request(iCaptureSession->createRequest());
IRequest *iRequest = interface_cast<IRequest>(request);
// Enable both streams in the request.
iRequest->enableOutputStream(streamLeft.get());
iRequest->enableOutputStream(streamRight.get());
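To extend that pattern to six cameras, each output stream has to be bound to its own device with setCameraDevice() before the stream is created. A rough sketch (assuming cameraDevices holds all six enumerated devices; error checks omitted):

UniqueObj<OutputStreamSettings> streamSettings(
    iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
IOutputStreamSettings *iStreamSettings =
    interface_cast<IOutputStreamSettings>(streamSettings);
UniqueObj<OutputStream> streams[6];
for (uint32_t i = 0; i < 6; i++)
{
    // If this call is copied without updating the index, every stream is
    // fed from the same device, which would match the "all outputs show
    // camera 0" symptom.
    iStreamSettings->setCameraDevice(cameraDevices[i]);
    streams[i].reset(iCaptureSession->createOutputStream(streamSettings.get()));
    iRequest->enableOutputStream(streams[i].get());
}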
The bug is probably that I forgot to modify the code after copying it.
I printed cameraDevices.size() and confirmed that 6 devices had been created.
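For reference, the check was just this (a sketch, with cameraProvider created as in the sample):

ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
std::vector<CameraDevice*> cameraDevices;
iCameraProvider->getCameraDevices(&cameraDevices);
printf("cameraDevices.size() = %zu\n", cameraDevices.size()); // prints 6 here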
Could you please refer to Topic 217401? That looks like an issue with syncSensor.
We also need to check whether this is bandwidth related. Please also try enabling four or six cameras with a lower resolution and frame rate.
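For instance, both can be lowered per request via the source settings; a sketch (the mode index and the 15 fps value are just examples, not values from your setup):

ICameraProperties *iProperties = interface_cast<ICameraProperties>(cameraDevices[0]);
std::vector<SensorMode*> sensorModes;
iProperties->getAllSensorModes(&sensorModes);
ISourceSettings *iSourceSettings =
    interface_cast<ISourceSettings>(iRequest->getSourceSettings());
iSourceSettings->setSensorMode(sensorModes[1]);   // assumed lower-resolution mode
iSourceSettings->setFrameDurationRange(
    Range<uint64_t>(1000000000ULL / 15));         // ~15 fps instead of 30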
Thanks for sharing. Could you please also test with four cameras?
To be honest, we have never tested the syncSensor sample application with more than two sensors in a single capture session.
There is a bug fix; I've included the change and built pre-built binaries for both the r35.2.1 and r32.7.3 releases.
Please download Topic241908_Feb14.7z (3.3 MB) and use it to replace /usr/lib/aarch64-linux-gnu/tegra/libnvscf.so. You may reboot the system for the change to take effect.
I've revised my previous comment to attach pre-built binaries for both the r35.2.1 and r32.7.3 versions.
I'm not sure whether it works with r32.7.3 since there were some code conflicts. Please give it a try, thanks.