GStreamer not opening camera in OpenCV

System info:

  • Tegra K1
  • Ubuntu 16.04 - Linux for Tegra
  • OpenCV 3.1.0 built from source with GStreamer support (-DWITH_GSTREAMER=ON -DWITH_GSTREAMER_0_10=ON)
  • Camera is OV5640 connected to the board via CSI
  • Code to replicate problem

    // Various pipelines I have tried
    //const std::string gst_pipeline = "v4l2src ! queue ! 'video/x-raw-yuv,format=(fourcc)Y444,width=1920,height=1080,framerate=25/1' ! queue ! appsink";
    const std::string gst_pipeline = "v4l2src ! videoconvert ! appsink";
    cv::VideoCapture cap(gst_pipeline, cv::CAP_GSTREAMER);
    if (!cap.isOpened()) {
        printf("Camera not opened \n");
        return -1;
    }

    // View video
    cv::Mat frame;
    for (int i = 0; i < 1500; i++) {
        cap >> frame;  // Get a new frame from camera
    }
    

    Doing this I get

    Camera not opened
    

    printed to the terminal.

    If I try to use the default camera, i.e.

    cv::VideoCapture cap(0);
    

    the camera opens, but the output is not anything I can use. The linked image is the output from using the default camera: https://drive.google.com/file/d/1D7OxTj2UecsbtiihJPzMdOkJBEjdsvwk/view?usp=sharing
    The image is not a Bayer image; it is a 3-channel image, and none of the colorspace conversions help.

    My Questions

  • Any advice on opening the camera and getting usable image data from it? Ideally with GStreamer, so I can control the resolution and the colorspace.
  • Should I try rebuilding OpenCV with the V4L flag turned on?
  • First, verify with the command below that your sensor hardware and driver are working:

    gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw, width=640, height=480, format=(string)I420" ! nvhdmioverlaysink -e

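    Before debugging the pipeline itself, it can help to list what the driver actually exposes. A diagnostic sketch using v4l2-ctl (from the v4l-utils package; /dev/video0 is an assumption):

```shell
# List the pixel formats, resolutions, and frame rates the driver exposes.
# Requires v4l-utils; adjust the device node if your camera is not video0.
v4l2-ctl --device=/dev/video0 --list-formats-ext
```

    The formats listed here are the only caps v4l2src can negotiate, so the caps string in any gst-launch or OpenCV pipeline must match one of them.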
    I should have mentioned that I have the camera working with these pipelines:

    gst-launch-0.10 v4l2src queue-size=1 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=1920,height=1080' ! xvimagesink
    

    and

    gst-launch-0.10 v4l2src ! queue ! 'video/x-raw-yuv,format=(fourcc)Y444,width=1920,height=1080,framerate=25/1' ! queue ! xvimagesink
    

    When I try the command you listed I get this error:

    Setting pipeline to PAUSED ...
    Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingERROR: Pipeline doesn't want to pause.
    ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Cannot identify device 'dev/video0'.
    Additional debug info:
    v4l2_calls.c(568): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
    system error: No such file or directory
    Setting pipeline to NULL ...
    Freeing pipeline ...
    

    You may try:

    gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw, width=640, height=480, format=(string)I420" ! nvvidconv ! "video/x-raw(memory:NVMM), width=640, height=480, format=(string)I420" ! nvhdmioverlaysink -e
    

    If it works, I’d also suggest rebuilding OpenCV with GStreamer 1.0 support but without GStreamer 0.10 support. I’m not sure about that, however…
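    For reference, a rebuild along those lines could be configured roughly as follows (a sketch only; it assumes the usual out-of-source build layout, run from a build directory next to the OpenCV sources, and includes the V4L flag from the earlier question):

```shell
# OpenCV 3.1 CMake configuration sketch: GStreamer 1.0 only, plus V4L capture.
# Assumed layout: this is run from an empty build directory beside the sources.
cmake -DWITH_GSTREAMER=ON \
      -DWITH_GSTREAMER_0_10=OFF \
      -DWITH_V4L=ON \
      ..
```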

    Trying that pipeline I receive this error:

    Setting pipeline to PAUSED ...
    Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
    Additional debug info:
    gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
    streaming task paused, reason not-negotiated (-4)
    EOS on shutdown enabled -- waiting for EOS after Error
    Waiting for EOS...
    

    I can always try rebuilding OpenCV with GStreamer 1.0 support and without 0.10 support, but as I understand it, they are two different programs and should not interfere with each other in any way.

    Sorry, I missed that your camera outputs in UYVY format. Does this work?

    gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw,format=UYVY,width=1920,height=1080" ! xvimagesink
    

    If yes, you would use the following GStreamer pipeline in OpenCV:

    const std::string gst = "v4l2src device=/dev/video0 ! video/x-raw,format=UYVY,width=1920,height=1080 ! videoconvert ! video/x-raw,format=BGR ! appsink";
    cv::VideoCapture cap(gst, cv::CAP_GSTREAMER);
    

    The problem with having both GStreamer 0.10 and 1.0 support in OpenCV is: which one will be used? That’s why I’d advise having only one.

    Sorry for the late reply, but yes, I tried the first command:

    gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw,format=UYVY,width=1920,height=1080" ! xvimagesink
    

    And that works just fine. I can see the camera stream clearly.

    When I tried implementing it in OpenCV, I got the same error: the camera was not opened.
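    When gst-launch works but cv::VideoCapture does not, GStreamer's own debug log usually shows where negotiation fails inside the application. A diagnostic sketch (the binary name is a placeholder for the compiled test program):

```shell
# GST_DEBUG raises GStreamer's log verbosity (2 = warnings, 3 = info).
# "./camera_test" is a placeholder for your compiled OpenCV program.
GST_DEBUG=2 ./camera_test
```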