I am trying to use MIPI cameras and USB UVC cameras at the same time. To test, I'm running a simple program that streams both cameras to an OpenCV GUI window. I noticed, though, that there is considerable lag on the camera connected to the MIPI port, while the UVC camera seems to be operating properly.
My GStreamer pipeline for the MIPI camera is:
nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, framerate=(fraction)60/1, format=(string)NV12 ! nvvidconv flip-method=0 ! video/x-raw, width=(int)1280, height=(int)720, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink
I just pass that string to the cv::VideoCapture constructor.
I open the UVC camera with:
cv::VideoCapture cap("/dev/video1", cv::CAP_V4L2);
Even if I turn the MIPI camera down to its lowest resolution, I can still see noticeable lag on fast-moving objects. I'm hoping to minimize that performance difference. Is there a way to optimize the GStreamer pipeline, or something else I should do to get decent performance when using MIPI and USB UVC cameras together?
Also, not sure if this is important, but I get a GStreamer warning that it cannot query the video position. I was able to remove that warning for the UVC camera by adding the cv::CAP_V4L2 option, but I don't know what to do for the MIPI camera.