We have stereo cameras (IMX185) and are using Argus to retrieve frames, OpenCV to remap (rectify) them, and the VisionWorks nvxSemiGlobalMatchingNode to compute the disparity.
This works great when the scene is stationary, but as soon as there is movement the disparity fails. If you stand in front of the cameras and move side to side, you can see yourself flashing black and white. If I'm not mistaken, this sort of flashing is caused by a time delta between the left and right captures. However, the timestamps for both Argus captures (retrieved using the method in the syncSensor sample) are identical.
Here is a video of our output:
In the beginning you can see one of my coworkers moving through the frame in the background; then I move the camera fixture side to side, and then up and down, which causes the disparity to break down completely.
Does anyone have an idea of what is going on? Is it simply a camera sync issue, or could it be something else? If it is just a camera sync issue, is it not possible to compute disparity with software sync alone?