GStreamer not opening camera in OpenCV

System info:

  • Tegra K1
  • Ubuntu 16.04 - Linux for Tegra
  • OpenCV 3.1.0 built from source with GStreamer support (-DWITH_GSTREAMER=ON -DWITH_GSTREAMER_0_10=ON)
  • Camera is OV5640 connected to the board via CSI
  • Code to replicate problem

    #include <cstdio>
    #include <opencv2/opencv.hpp>

    int main()
    {
        // Various pipelines I have tried; this one is currently commented out:
        // const std::string gst_pipeline = "v4l2src ! queue ! 'video/x-raw-yuv,format=(fourcc)Y444,width=1920,height=1080,framerate=25/1' ! queue ! appsink";
        const std::string gst_pipeline = "v4l2src ! videoconvert ! appsink";
        cv::VideoCapture cap(gst_pipeline, cv::CAP_GSTREAMER);
        if (!cap.isOpened()) {
            printf("Camera not opened\n");
            return -1;
        }

        // View video
        cv::Mat frame;
        for (int i = 0; i < 1500; i++) {
            cap >> frame;  // Get a new frame from the camera
        }
        return 0;
    }
    

    Doing this I get

    Camera not opened
    

    printed to the terminal.
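
    For reference, the kind of pipeline I would eventually like to get working, with explicit caps and a conversion to BGR before the appsink, looks roughly like the sketch below. The /dev/video0 device path and the 1920x1080 @ 25 fps caps are assumptions on my part; I have not confirmed them for this sensor.

    #include <opencv2/opencv.hpp>

    int main()
    {
        // Sketch only: pin the source caps, then convert to BGR for the appsink.
        // The device path and the 1920x1080 @ 25 fps caps are assumed, not verified.
        const std::string gst_pipeline =
            "v4l2src device=/dev/video0 ! "
            "video/x-raw,width=1920,height=1080,framerate=25/1 ! "
            "videoconvert ! video/x-raw,format=BGR ! "
            "appsink";
        cv::VideoCapture cap(gst_pipeline, cv::CAP_GSTREAMER);
        return cap.isOpened() ? 0 : -1;
    }

    (With GStreamer 1.0 the caps use video/x-raw rather than the 0.10-style video/x-raw-yuv from the commented-out pipeline above.)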

    If I try to use the default camera, i.e.

    cv::VideoCapture cap(0);
    

    the camera opens, but the output is not anything I can use. The right-hand side of the linked image shows the output from the default camera: https://drive.google.com/file/d/1D7OxTj2UecsbtiihJPzMdOkJBEjdsvwk/view?usp=sharing
    This image is not a Bayer image; it is a 3-channel image, and none of the colorspace conversions help.
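
    To narrow down what the default backend is actually handing back, my plan is to dump the FOURCC it reports, along the lines of the sketch below. I have not verified that CAP_PROP_FOURCC is populated by this backend/driver combination.

    #include <cstdio>
    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap(0);  // default backend, as above
        if (!cap.isOpened()) return -1;

        // Print the FOURCC the backend reports, to see which pixel format it delivers.
        int fourcc = static_cast<int>(cap.get(cv::CAP_PROP_FOURCC));
        printf("FOURCC: %c%c%c%c\n",
               fourcc & 0xFF, (fourcc >> 8) & 0xFF,
               (fourcc >> 16) & 0xFF, (fourcc >> 24) & 0xFF);
        return 0;
    }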

    My Questions

  • Any advice on opening the camera and getting usable image data from it? Ideally with GStreamer, so I can control the resolution and the colorspace.
  • Should I try to rebuild OpenCV with the V4L flag (-DWITH_V4L=ON) turned on?