Hi,
I’m trying to stitch the inputs of two CSI cameras (Raspberry Pi v2 cameras) into one panorama with VisionWorks, but I can only get frames from one of the cameras.
Changing the source URI in nvx_sample_nvgstcamera_capture between “device:///nvcamera?index=0” and “device:///nvcamera?index=1” makes no difference: I get frames from the same camera either way.
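For reference, this is roughly what I’m trying, adapted from the capture sample (the ovxio namespace and OVX/ header paths are the ones I see in VisionWorks 1.6; older releases name these slightly differently, so treat it as a sketch rather than exact code):

#include <memory>
#include <NVX/nvx.h>
#include <OVX/FrameSourceOVX.hpp>
#include <OVX/UtilityOVX.hpp>

int main()
{
    ovxio::ContextGuard context;

    // One frame source per CSI camera; the ?index= part is what I hoped
    // would select the sensor, but both URIs give me the same camera.
    std::unique_ptr<ovxio::FrameSource> left(
        ovxio::createDefaultFrameSource(context, "device:///nvcamera?index=0"));
    std::unique_ptr<ovxio::FrameSource> right(
        ovxio::createDefaultFrameSource(context, "device:///nvcamera?index=1"));

    if (!left || !left->open() || !right || !right->open())
        return 1;

    ovxio::FrameSource::Parameters cfg = left->getConfiguration();
    vx_image leftFrame  = vxCreateImage(context, cfg.frameWidth, cfg.frameHeight, cfg.format);
    vx_image rightFrame = vxCreateImage(context, cfg.frameWidth, cfg.frameHeight, cfg.format);

    // Grab one frame from each camera; the stitching graph would consume these.
    ovxio::FrameSource::FrameStatus ls = left->fetch(leftFrame);
    ovxio::FrameSource::FrameStatus rs = right->fetch(rightFrame);
    (void)ls; (void)rs;

    vxReleaseImage(&leftFrame);
    vxReleaseImage(&rightFrame);
    return 0;
}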
At the GStreamer level both cameras work: gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! nv3dsink and the same pipeline with sensor-id=1 both show live images.
What is a recommended way to connect two camera sources (nvarguscamerasrc backend) with VisionWorks?
Should I try to modify GStreamerNvCameraFrameSourceImpl to support two camera sources? If that is the right approach, any tips on how to go about it would be appreciated; I’ve put a rough sketch of what I have in mind below.
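To make the question more concrete, this is the kind of per-sensor pipeline description I imagine such a modified frame source building internally and handing to gst_parse_launch(). The helper name, the appsink name and the caps values (1920x1080@30, NV12/RGBA) are just my guesses for illustration, not what the NVXIO code actually contains:

#include <sstream>
#include <string>

// Hypothetical helper: build a capture pipeline description for one CSI sensor.
// A two-camera frame source could create one pipeline per sensor-id and reuse
// the existing appsink handling for each of them.
static std::string buildCsiPipeline(int sensorId, int width = 1920,
                                    int height = 1080, int fps = 30)
{
    std::ostringstream pipeline;
    pipeline << "nvarguscamerasrc sensor-id=" << sensorId
             << " ! video/x-raw(memory:NVMM), width=" << width
             << ", height=" << height
             << ", framerate=" << fps << "/1, format=NV12"
             << " ! nvvidconv"
             << " ! video/x-raw, format=RGBA"
             << " ! appsink name=nvxsink";   // appsink name is made up here
    return pipeline.str();
}

What I don’t know is whether two instances of the frame source can coexist, or whether something in the implementation assumes a single camera.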
Side note: I want to use VisionWorks on the Nano because the relevant functions in the current VPI release do not run on the Nano GPU. I’ve read that VPI will support the GPU in a future release, but I can’t wait for that.
I hope VPI will have an “easy” way of interfacing with multiple camera sources.