OpenCV VideoCapture USB Camera

Has anyone succeeded in acquiring a video stream from a USB camera (or from the TX1 CSI camera module) through OpenCV's VideoCapture() function?

Is there some sample code to follow? I've read that a possible workaround is using GStreamer, but I have never used it, and its documentation in relation to OpenCV is poor at best.

I also read that a possible solution is to use OpenCV 3, but I would prefer to use OpenCV4Tegra since it is optimized.

Hi Anacleto86,

Actually, the current version of OpenCV4Tegra is not well optimized for camera support on Jetson TX1, which causes the video capturing (live streaming) problem.
We are working on this issue and will have an updated version in a coming release soon.

Instead of using OpenCV4Tegra, you could try to compile and install OpenCV 3.1.0 by referring to the thread below:
https://devtalk.nvidia.com/default/topic/917386/jetson-tx1/usb-3-0-port-unstable-on-jetson-tx1-/post/4835793/#4835793

Some users have already confirmed that it works as a temporary solution until the next version of OpenCV4Tegra.

Thanks

So it wasn’t soon. It was months (maybe). This is so disappointing.

Hi Anacleto86,

The next release will be coming soon, please stay tuned.

Thanks

Kayccc,

it has been two months since the last update saying “coming soon”. Do you have a better estimate?

Thanx

Hi gpetilli,

The updated OpenCV4Tegra 2.4.13 was released with JetPack 2.2.

It should work well with USB camera.

Download the latest Jetpack bundle — https://developer.nvidia.com/embedded/jetpack

Access L4T components/docs — https://developer.nvidia.com/embedded/linux-tegra

Please see the Release Notes — http://developer.nvidia.com/embedded/dlc/l4t-release-notes-24-1

Thanks

Kayccc

I'm confused. In your 4/11 post you said that the current OpenCV4Tegra did not support video streaming and that we should use OpenCV 3.1, which does not have CUDA acceleration. When I use the USB camera, I need to use video/x-raw instead of video/x-raw(memory:NVMM), which I believe means it is running in CPU memory, not GPU memory, which is consistent with not using OpenCV4Tegra.

So my question is: is there a new version of OpenCV4Tegra in the works that will support video streaming and color-space conversion (Bayer to I420) for CSI and/or USB cameras, and what month should we plan on it being available?

We are all trying to work with you to demonstrate that Jetson is much more than an expensive Raspberry Pi. As recently as yesterday you recommended using OpenCV 3.1 without acceleration as a temporary workaround. It is important to know the schedule so we can decide where to focus our development effort.

Gene

Hello, Gene:
There are 2 kinds of cameras used on the Jetson TX1 platform.

  1. USB camera. This is a standard V4L2 device, and it may generate RGB or YUV data. Any version of OpenCV can open this kind of device.
  2. On-board camera (OV5693). This is a raw sensor, and the V4L2 interface can ONLY get raw (Bayer) data. To get RGB/YUV data, you need to call the GStreamer plugin ‘nvcamerasrc’. For OpenCV to support a GStreamer pipeline as the capture device, the version should be higher than 3.0.
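As a rough sketch of the on-board-camera path described above (the element names ‘nvcamerasrc’ and ‘nvvidconv’ come from this thread; the resolution, framerate, and caps values are placeholder assumptions, not verified settings), an OpenCV build newer than 3.0 compiled with GStreamer support can be handed a pipeline string instead of a device index:

```python
def onboard_camera_pipeline(width=1280, height=720, fps=30):
    """Build a GStreamer pipeline string for the TX1 on-board OV5693 sensor.

    'nvcamerasrc' delivers frames in NVMM (GPU) memory; 'nvvidconv'
    copies them into ordinary CPU memory so OpenCV's appsink can read
    them as plain BGR frames.
    """
    return (
        "nvcamerasrc ! "
        "video/x-raw(memory:NVMM), "
        "width={w}, height={h}, format=I420, framerate={f}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! "
        "appsink"
    ).format(w=width, h=height, f=fps)


if __name__ == "__main__":
    # Requires an OpenCV >= 3.0 build compiled with GStreamer support.
    import cv2
    cap = cv2.VideoCapture(onboard_camera_pipeline())
    print("opened:", cap.isOpened())
```

The nvvidconv step is what moves frames out of NVMM memory into CPU memory, which is why OpenCV can only consume the plain video/x-raw caps at the end of the pipeline.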

Back to your original question: with a USB camera you can use OpenCV4Tegra, and there are many demo samples of the OpenCV capture functions.

br
ChenJian

jachen,

Could you provide links to some of the USB camera demo samples? I am trying to get a Logitech C270 webcam working with my TX1 and am having trouble both with GStreamer and with VideoCapture. I have not been able to find any good USB camera demos.

Hello, Bcssd1234:
What problem have you met? Please provide more details.
Generally, a USB camera should work as a standard V4L2 device.

  1. For OpenCV, the following code clip should work (there are many similar samples on the web):

    #include <opencv2/opencv.hpp>
    #include <iostream>

    using namespace cv;
    using namespace std;

    int main()
    {
        VideoCapture cap(0); // open the default camera
        if (!cap.isOpened()) { // check if we succeeded
            cerr << "Fail to open camera" << endl;
            return -1;
        }

        for (;;) {
            Mat frame;
            cap >> frame; // get a new frame from the camera
            imshow("original", frame);
            waitKey(1);
        }
        // the camera will be deinitialized automatically in the VideoCapture destructor
        return 0;
    }

  2. For GStreamer, just use v4l2src.
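A minimal sketch of the v4l2src route mentioned above (the element names are standard GStreamer; the device path and caps values are placeholder assumptions, and your camera may advertise a different raw format):

```python
def usb_camera_pipeline(device="/dev/video0", width=640, height=480):
    """Build a GStreamer pipeline string for a plain V4L2 USB camera.

    v4l2src reads the camera directly; videoconvert turns the camera's
    raw output (commonly YUYV on cheap webcams) into BGR for appsink.
    """
    return (
        "v4l2src device={dev} ! "
        "video/x-raw, width={w}, height={h} ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    ).format(dev=device, w=width, h=height)


if __name__ == "__main__":
    # Requires an OpenCV build compiled with GStreamer support.
    import cv2
    cap = cv2.VideoCapture(usb_camera_pipeline())
    print("opened:", cap.isOpened())
```

Because the whole pipeline stays in ordinary CPU memory (no NVMM caps), this works whether or not the OpenCV build uses the Tegra-accelerated libraries.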

br
ChenJian

Hi jachen,

When I run cv::VideoCapture(0), with or without the camera plugged in, I get the error:
HIGHGUI ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV

VideoCapture(0) actually works on two other boards for me, but not on this particular board.

Hello, Bcssd1234:
It seems that your boards have different environments.
You can check the dev node for the video device and make sure that OpenCV opens the correct USB camera.
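One way to follow that advice is a small probe script (a sketch only; device-node numbering can differ per board and per boot, and opening by index assumes OpenCV maps index N to /dev/videoN, which is the usual V4L2 behavior):

```python
import glob


def list_video_nodes():
    """Return the V4L2 device nodes currently present, e.g. ['/dev/video0']."""
    return sorted(glob.glob("/dev/video*"))


if __name__ == "__main__":
    import cv2
    for node in list_video_nodes():
        # Try the index that conventionally corresponds to this node.
        index = int(node.replace("/dev/video", ""))
        cap = cv2.VideoCapture(index)
        print(node, "->", "opened" if cap.isOpened() else "failed")
        cap.release()
```

If the node you expect is missing, the camera was not enumerated by the kernel at all, which points at a hardware or driver problem rather than an OpenCV one.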

br
ChenJian

Any updates on this? I’m on a TX2 and having the same issue with the onboard camera.

I am running a TX1 and downloaded and flashed the latest JetPack on Feb 4th, 2018. I have a very simple Python script using OpenCV that will not capture an image at any time after a reboot until I first run:

       /usr/bin/nvgstcapture

it then functions.

import cv2

cap = cv2.VideoCapture(0)
if not cap.isOpened():
    print("Cannot open camera\n")
    exit(1)

while True:
    ret, img = cap.read()
    if not ret:
        print("No image was captured by 'cap.read()'\n") #### DIES HERE ####
        break

    cv2.imshow('img', img)

    k = cv2.waitKey(1) & 0xff
    if k == 27:
        break

cap.release()
cv2.destroyAllWindows()

The camera is a Logitech C170 USB webcam (money-strapped FIRST robotics team).
Can anyone tell me what is going on, or how I might most quickly find the problem?
I’m a coder, but I have exactly two days of experience with the TX1 and video processing.