Our company works in the field of virtual reality. We shoot video with 5-camera and 8-camera rigs and stitch it into 360-degree panoramic video. For a long time we used AutoPano for stitching, but then we learned about VRWorks 360 and tried it. The results were highly impressive, so we decided to switch to your tool. We use version 2.1 for Windows.
To calibrate our camera rigs, we use your nvcalib_sample application, first filling in the image_input.xml and rig_spec.xml files with our data. For the 8-camera Titan rig we have reached a calibration quality of 0.93, and for different frames with good calibration (0.92-0.93) we get close parameter values in the calibration file. We then stitch our frames with your nvss_sample application and get proper stitching for many frames.
For the 5-camera Sony rig we reached an even higher quality of 0.94-0.946, but the calibration parameters jump from one well-calibrated frame to another, so the calibration results look completely unreliable.
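To show concretely what we mean by "jump", here is a minimal sketch of how the per-camera parameters of two independently calibrated frames can be compared. The parameter names and values below are purely illustrative stand-ins, not the actual contents or schema of the SDK's calibration files:

```python
# Sketch: quantify how much calibration parameters vary between two
# independently calibrated frames of the same rig. All names/values
# here are synthetic examples, not real nvcalib_sample output.

def parameter_spread(calibrations):
    """Return, for each parameter, the maximum relative deviation from
    the mean across a list of per-frame calibration dicts."""
    spread = {}
    for key in calibrations[0]:
        vals = [c[key] for c in calibrations]
        mean = sum(vals) / len(vals)
        spread[key] = max(abs(v - mean) for v in vals) / (abs(mean) or 1.0)
    return spread

# Two hypothetical calibration runs of the same camera:
frame_a = {"yaw_deg": 72.1, "pitch_deg": 0.4, "focal_px": 1180.0}
frame_b = {"yaw_deg": 71.9, "pitch_deg": 0.5, "focal_px": 1182.0}
print(parameter_spread([frame_a, frame_b]))
```

With the 8-camera Titan rig the spreads computed this way are small for every parameter; with the 5-camera Sony rig some of them are large even though the reported calibration quality is higher.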
Could you please help us identify where the problem is? Perhaps the 5-camera case requires some kind of preprocessing?
A while ago we sent you a letter about the "moving seams" option of nvss_sample. In your document describing this feature (https://devblogs.nvidia.com/vrworks-360-video-sdk-2-0-adds-features-turing-support/), you show an image of a man on a bike in which the seam_offset option shifts the seam without breaking the alignment. When we try this option, the image stitched with an offset seam is very different from the image stitched with the default seam: the bigger the offset, the larger the difference, and the difference cannot be removed by any affine transform. Could you please tell us how to use this exceptionally good and attractive seam_offset option without breaking the alignment?
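For reference, the way we measured "very different" can be sketched as a simple per-pixel comparison of the two stitched outputs. The tiny arrays below are synthetic placeholders; in practice the two images would be the frames produced by nvss_sample with the default seam and with seam_offset applied:

```python
# Sketch: mean absolute per-pixel difference between two equally sized
# grayscale images given as nested lists. The data below is synthetic,
# standing in for two stitched frames from nvss_sample.

def mean_abs_diff(img_a, img_b):
    """Average absolute difference over all pixels of two images."""
    total, count = 0, 0
    for row_a, row_b in zip(img_a, img_b):
        for p, q in zip(row_a, row_b):
            total += abs(p - q)
            count += 1
    return total / count

default_seam = [[10, 10], [10, 10]]   # stitch with default seam
offset_seam  = [[10, 12], [14, 10]]   # stitch with seam_offset set
print(mean_abs_diff(default_seam, offset_seam))  # → 1.5
```

In our tests this difference grows with the seam offset instead of staying near zero, which is what we mean by the alignment being broken.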