Hi,
I am trying to do some image processing using the Jetson Xavier and some USB cameras.
I am using a Kayeton camera (as shown here), which supports both MJPG and YUYV formats.
Here is the output when I run v4l2-ctl -d /dev/video0 --list-formats-ext:
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'MJPG' (compressed)
	Name        : Motion-JPEG
		Size: Discrete 640x480
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1920x1080
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 1280x720
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 320x240
			Interval: Discrete 0.033s (30.000 fps)

	Index       : 1
	Type        : Video Capture
	Pixel Format: 'YUYV'
	Name        : YUYV 4:2:2
		Size: Discrete 640x480
			Interval: Discrete 0.033s (30.000 fps)
		Size: Discrete 320x240
			Interval: Discrete 0.033s (30.000 fps)
Now, I want to use the 1920x1080 MJPG format and simply display the camera feed on the screen.
Doing this on the Jetson Xavier has been extremely frustrating. Because the OpenCV build includes GStreamer, I cannot simply call cv2.VideoCapture(0) and set the desired resolution.
I have managed to get some sort of output using the following gstreamer command:
cap_string = 'v4l2src device=/dev/video0 io-mode=2 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! nvv4l2decoder mjpeg=1 ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink drop=1'
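In my Python script I assemble that string with plain string formatting, so the device, resolution and frame rate are easy to change (the helper name and defaults here are mine):

```python
def mjpg_pipeline(device="/dev/video0", width=1920, height=1080, fps=30):
    """Build the MJPG capture pipeline string shown above."""
    return (
        f"v4l2src device={device} io-mode=2 ! "
        f"image/jpeg,width={width},height={height},framerate={fps}/1 ! "
        "nvv4l2decoder mjpeg=1 ! nvvidconv ! "
        "video/x-raw,format=BGRx ! videoconvert ! "
        "video/x-raw,format=BGR ! appsink drop=1"
    )

print(mjpg_pipeline())
```

The resulting string is what I pass to cv2.VideoCapture with the cv2.CAP_GSTREAMER backend.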
This will show the video feed on the screen, however there is a huge delay (roughly half a second) between what is actually in front of the camera and what is displayed, and the feed is extremely choppy. Furthermore, the frames I read back are 1920x1088 instead of 1920x1080.
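I can at least work around the 1088 height by cropping in numpy. My understanding (an assumption on my part) is that the hardware decoder pads the frame height up to a multiple of 16, so the last 8 rows are just padding:

```python
import numpy as np

# Simulate a decoded frame: 1080 rows padded up to 1088 (next multiple of 16)
frame = np.zeros((1088, 1920, 3), dtype=np.uint8)

# Keep only the first 1080 rows, dropping the assumed padding at the bottom
cropped = frame[:1080, :, :]
print(cropped.shape)  # (1080, 1920, 3)
```

In the real capture loop I apply the same slice to each frame returned by cap.read(), but that does nothing for the latency or choppiness.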
This problem does not occur when I use the YUY2 format with gstreamer using the following string:
cap_string = 'v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! nvvidconv ! videoconvert ! video/x-raw,format=(string)BGR ! appsink'
However, this frame size is wrong for my application, and changing the dimensions in the above string to 1920x1080 gives me the error:
'GST_IS_ELEMENT (element)' failed
Now, when I use my desktop Ubuntu machine that has OpenCV without gstreamer I can simply do:
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*'MJPG'))
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)
And I can view the camera feed in real time without issues. However, I wish to use the cameras on the Jetson, as I will need to capture footage from a mobile setup. If I run the above on the Jetson I get the same error as earlier:
gst_element_get_state: assertion 'GST_IS_ELEMENT (element)' failed
My question is in two parts:
- Is it possible to overcome this problem on the Jetson without removing GStreamer support and installing a different build of OpenCV? I have tried just about every combination of GStreamer pipelines I can find on this forum and the issue persists.
- Failing that, how do I go about installing OpenCV without GStreamer support? In all honesty this seems preferable, as GStreamer is causing more problems than it solves.
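If I do end up rebuilding or reinstalling OpenCV, I would like a quick way to confirm what the resulting cv2 was compiled with. A small check like this is what I have in mind (the has_gstreamer helper is mine; it just string-matches the "GStreamer:" line that cv2.getBuildInformation() prints):

```python
import re

def has_gstreamer(build_info: str) -> bool:
    """Return True if the OpenCV build information reports GStreamer support."""
    m = re.search(r"GStreamer:\s*(\S+)", build_info)
    return bool(m) and m.group(1).upper() == "YES"

# Examples with lines in the format cv2.getBuildInformation() produces:
print(has_gstreamer("  GStreamer:                   YES (1.14.5)"))  # True
print(has_gstreamer("  GStreamer:                   NO"))            # False
```

On a real install I would call has_gstreamer(cv2.getBuildInformation()) after importing cv2.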
Please let me know if you require any further information.