Has anyone succeeded in acquiring a video stream from a USB camera (or from the TX1 CSI camera module) through OpenCV's VideoCapture() function?
Is there some sample code to follow? I've read that a possible workaround is to use GStreamer, but I have never used it, and its documentation in relation to OpenCV is poor at best.
I have also read that a possible solution is to use OpenCV 3, but I would prefer OpenCV4Tegra since it is optimized for the platform.
Actually, the current version of OpenCV4Tegra is not well optimized for camera support on the Jetson TX1, which causes the video capture (live streaming) problem.
We are working on this issue and plan to have an updated version in a coming release soon.
I'm confused. In your 4/11 post you said that the current OpenCV4Tegra does not support video streaming and that we should use OpenCV 3.1, which does not have CUDA acceleration. When I use the USB camera, I need to use video/x-raw instead of video/x-raw(memory:NVMM), which I believe means it is running in CPU memory, not GPU memory; that is consistent with not using OpenCV4Tegra.
So my question is: is there a new version of OpenCV4Tegra in the works that will support video streaming and color space conversion (Bayer to I420) for CSI and/or USB cameras, and in what month should we plan on it being available?
We are all trying to work with you to demonstrate that the Jetson is much more than an expensive Raspberry Pi. As recently as yesterday you recommended using OpenCV 3.1 without acceleration as a temporary workaround. It is important to know the schedule so we can decide where to focus our development effort.
Hello, Gene:
There are two kinds of cameras used on the Jetson TX1 platform:
1. USB camera. This is a standard V4L2 device, and it may generate RGB or YUV data. Any version of OpenCV can open this kind of device.
2. On-board camera (OV5693). This is a raw sensor, so the V4L2 interface can ONLY get raw (Bayer) data. To get RGB/YUV data, you need to use the GStreamer plugin 'nvcamerasrc'. For OpenCV to open a GStreamer pipeline as a capture device, the version must be 3.0 or higher.
Back to your original question: with a USB camera you can use OpenCV4Tegra, and there are a lot of demo samples for the OpenCV capture functions.
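To make the nvcamerasrc route concrete, here is a minimal sketch of the kind of GStreamer pipeline string that an OpenCV VideoCapture (version 3.0 or higher, built with GStreamer support) can open for the on-board camera. The 1280x720 @ 30 fps caps and the helper name tx1_csi_pipeline are illustrative assumptions, not from the posts above:

```python
def tx1_csi_pipeline(width=1280, height=720, fps=30):
    """Build a GStreamer pipeline string for cv2.VideoCapture.

    nvcamerasrc delivers I420 frames in NVMM (GPU) memory; nvvidconv
    copies/converts them to CPU memory, and videoconvert produces the
    BGR layout OpenCV expects. The resolution and frame rate here are
    assumptions -- adjust them to what your sensor mode supports.
    """
    return (
        "nvcamerasrc ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"format=I420, framerate={fps}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! "
        "appsink"
    )

# Usage (requires an OpenCV build with GStreamer enabled):
# import cv2
# cap = cv2.VideoCapture(tx1_csi_pipeline())
```

If VideoCapture fails to open the pipeline, testing the same string with gst-launch-1.0 first helps separate GStreamer problems from OpenCV build problems.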
Could you provide links to some of the USB camera demo samples? I am trying to get a Logitech C270 webcam working with my TX1 and am having trouble both with GStreamer and with VideoCapture. I have not been able to find any good USB camera demos.
Hello, Bcssd1234:
What problem have you met? Please provide more details.
Generally, a USB camera should work as a standard V4L2 device.
For OpenCV, the following code snippet should work (there are many similar samples on the web):
#include <opencv2/opencv.hpp>
#include <iostream>

using namespace cv;

int main()
{
    VideoCapture cap(0);      // open the default camera
    if (!cap.isOpened()) {    // check if we succeeded
        std::cerr << "Fail to open camera" << std::endl;
        return -1;
    }
    for (;;)
    {
        Mat frame;
        cap >> frame;         // get a new frame from the camera
        if (frame.empty())
            break;
        imshow("original", frame);
        if (waitKey(1) == 27) // press Esc to quit
            break;
    }
    // the camera is deinitialized automatically in the VideoCapture destructor
    cap.release();
    return 0;
}
When I run cv::VideoCapture(0), with or without the camera plugged in, I get the error:
HIGHGUI ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV
Hello, Bcssd1234:
It seems that your board has a different environment.
You can check the /dev/video* device nodes and make sure that OpenCV is opening the correct USB camera.
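Checking the device nodes can be scripted; this is a small sketch that enumerates /dev/videoN entries so you know which index to pass to VideoCapture (index 0 maps to /dev/video0, and so on). The helper name and its dev_dir parameter are hypothetical, added only to make the check easy to run and test:

```python
import glob
import os
import re


def list_video_devices(dev_dir="/dev"):
    """Return the sorted capture indices found as videoN nodes in dev_dir.

    On a TX1, the on-board CSI camera and any USB webcams each get their
    own /dev/videoN node; the N is the index cv2.VideoCapture(N) opens.
    """
    indices = []
    for node in glob.glob(os.path.join(dev_dir, "video*")):
        m = re.search(r"video(\d+)$", node)
        if m:
            indices.append(int(m.group(1)))
    return sorted(indices)


# Example: print what is available before guessing at VideoCapture(0).
# print(list_video_devices())
```

Running `v4l2-ctl --list-devices` (from the v4l-utils package) gives the same information plus the camera names, if you prefer a ready-made tool.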
I am running a TX1 and downloaded and flashed the latest JetPack (Feb 4th, 2018). I have a very simple Python script using OpenCV that will not capture an image at any time after a reboot until I first run:
/usr/bin/nvgstcapture
It then functions.
import cv2

cap = cv2.VideoCapture(0)
if not cap.isOpened():
    print("Cannot open camera\n")
    exit(1)
while True:
    ret, img = cap.read()
    if not ret:
        print("No image was captured by 'cap.read()'\n")  #### DIES HERE ####
        break
    cv2.imshow('img', img)
    k = cv2.waitKey(1) & 0xff
    if k == 27:
        break
cap.release()
cv2.destroyAllWindows()
The camera is a Logitech C170 USB webcam (we are a money-strapped FIRST robotics team).
Can anyone tell me what is going on, or how I might most quickly find the problem?
I'm a coder, but I have exactly two days of experience with the TX1 and video processing.
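One thing worth trying, given the earlier "pixel format of incoming image is unsupported" error in this thread: OpenCV lets you request a specific pixel format from a UVC webcam via the CAP_PROP_FOURCC property, and MJPG is widely supported by Logitech cameras. This is a sketch, not a confirmed fix for the C170; the fourcc helper below just packs a four-character code the same way cv2.VideoWriter_fourcc does:

```python
def fourcc(code):
    """Pack a 4-character code (e.g. 'MJPG') into the 32-bit integer
    that V4L2 and OpenCV use to identify a pixel format."""
    assert len(code) == 4
    return sum(ord(c) << (8 * i) for i, c in enumerate(code))


# Usage sketch (assumes an OpenCV build that honors CAP_PROP_FOURCC):
# import cv2
# cap = cv2.VideoCapture(0)
# cap.set(cv2.CAP_PROP_FOURCC, fourcc("MJPG"))
# ret, img = cap.read()
```

If that does not help, `v4l2-ctl --list-formats-ext -d /dev/video0` shows exactly which formats and resolutions the camera advertises, which narrows down whether the problem is the camera, the driver, or the OpenCV build.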