How to capture and display camera video with Python on Jetson TX2

I’d like to share a Python script I wrote to capture and display live camera video on the Jetson TX2, supporting IP cameras, USB webcams, and the Jetson onboard camera. This sample script could serve as a starting point for developing your own OpenCV or deep learning applications. I think this code should work on the Jetson TX1 as well.

Please refer to my blog post and my GitHub Gist for details:

[url]https://jkjung-avt.github.io/tx2-camera-with-python/[/url]
[url]https://gist.github.com/jkjung-avt/86b60a7723b97da19f7bfa3cb7d2690e[/url]
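
In a nutshell, the script builds a GStreamer pipeline string for the selected camera source and hands it to cv2.VideoCapture(). Here is a simplified sketch of the idea (not the exact code from the Gist; open_cam_onboard and the pipeline strings below, e.g. nvcamerasrc for the onboard camera, are my assumptions here and may need tweaking for your setup, so please refer to the Gist for the real thing):

import cv2

def open_cam_usb(dev, width, height):
    # USB webcam through v4l2src; requires OpenCV built with GStreamer support
    gst_str = ('v4l2src device=/dev/video{} ! '
               'video/x-raw, width=(int){}, height=(int){}, format=(string)RGB ! '
               'videoconvert ! appsink').format(dev, width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)

def open_cam_onboard(width, height):
    # TX2 onboard (CSI) camera through nvcamerasrc, scaled down by nvvidconv
    gst_str = ('nvcamerasrc ! '
               'video/x-raw(memory:NVMM), width=(int)2592, height=(int)1458, '
               'format=(string)I420, framerate=(fraction)30/1 ! '
               'nvvidconv ! video/x-raw, width=(int){}, height=(int){}, '
               'format=(string)BGRx ! videoconvert ! appsink').format(width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)

cap = open_cam_usb(1, 640, 480)    # or open_cam_onboard(1280, 720)
while True:
    ret, frame = cap.read()        # grab the next frame as a numpy array
    if not ret:
        break
    cv2.imshow('Camera Demo', frame)
    if cv2.waitKey(1) == 27:       # press ESC to quit
        break
cap.release()
cv2.destroyAllWindows()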

Here’s a screenshot of live IP CAM video on Jetson TX2.



Very cool, thanks for sharing.

I further extended the Python script so that it could feed live camera images into a Caffe image classification pipeline for real-time inference. This code can be used to quickly verify image classification models trained with NVIDIA DIGITS or Caffe. It should also serve as a good baseline for building quick demos on the Jetson TX2.

I’m happy to share the code with fellow developers. For details, please check out my blog post: [url]https://jkjung-avt.github.io/tx2-camera-caffe/[/url]
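
The gist of it is to grab a frame from the camera and push it through a Caffe forward pass inside the same display loop. A much-simplified sketch of the idea follows (deploy.prototxt and model.caffemodel are placeholders for your own model files, and mean subtraction/normalization is omitted for brevity; please see the blog post for the real script):

import cv2
import numpy as np
import caffe

caffe.set_mode_gpu()
# Placeholder model files -- substitute the model trained with DIGITS or Caffe
net = caffe.Net('deploy.prototxt', 'model.caffemodel', caffe.TEST)

cap = cv2.VideoCapture(1)  # or a GStreamer pipeline string as in tegra-cam.py
while True:
    ret, frame = cap.read()
    if not ret:
        break
    # Resize to the network input size and reorder HWC (BGR) -> CHW
    _, _, h, w = net.blobs['data'].data.shape
    img = cv2.resize(frame, (w, h)).astype(np.float32)
    net.blobs['data'].data[0] = img.transpose((2, 0, 1))
    out = net.forward()
    cls = out[net.outputs[0]][0].argmax()  # top-1 class index
    cv2.putText(frame, 'class {}'.format(cls), (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow('Caffe Classification Demo', frame)
    if cv2.waitKey(1) == 27:  # ESC to quit
        break
cap.release()
cv2.destroyAllWindows()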

I am getting this error.

nvidia@tegra-ubuntu:~/Desktop$ python tegra-cam.py --usb --width 300 --height 300
Called with args:
Namespace(image_height=300, image_width=300, rtsp_latency=200, rtsp_uri=None, use_rtsp=False, use_usb=True, video_dev=1)
OpenCV version: 2.4.13.1
Traceback (most recent call last):
File "tegra-cam.py", line 102, in <module>
cap = open_cam_usb(args.video_dev, args.image_width, args.image_height)
File "tegra-cam.py", line 57, in open_cam_usb
return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
AttributeError: 'module' object has no attribute 'CAP_GSTREAMER'

@RaviKiranK, from the error message it seems you are using the stock OpenCV4Tegra (2.4.13.1) installed by JetPack-3.1. That version of OpenCV was not built with GStreamer support.

In my case, I built and installed OpenCV 3.3.0 (replacing OpenCV4Tegra) on the TX2 in order for the tegra-cam.py script to work.
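
By the way, here is a quick way to check whether your cv2 module was built with GStreamer support (just a sanity check I find handy, not part of tegra-cam.py):

import cv2

print(cv2.__version__)
# OpenCV 3.x reports a "GStreamer: YES/NO" line in its build information
for line in cv2.getBuildInformation().split('\n'):
    if 'GStreamer' in line:
        print(line.strip())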

Hi Jung,

I have a Raspberry Pi sending out a UDP stream. I’ve heard UDP is the fastest option?

I tried replacing the RTSP pipeline in your code with a UDP one, but I got some errors (the pipeline doesn’t start). I have a TX2 with OpenCV built with GStreamer.

Running gst-launch-1.0 on the command line works fine, though.

I was wondering if you could show a small modification for handling an incoming Raspberry Pi stream on the Jetson?

Thank you very much for the code already!

@fredmiller123, please provide more details about the problem you’re trying to solve:

  • What is the gst-launch-1.0 command you used on Jetson TX2 (you said this worked fine)?
  • What is the gstreamer pipeline string you used to call cv2.VideoCapture()?
  • What is the error message thrown out by OpenCV?

@jkjung13

The command line on the Raspberry Pi was:

raspivid -fps 15 -b 400000 -t 0 -n -w 640 -h 480 -o - | tee | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 ! udpsink host=192.168.1.111 port=5000

On the Jetson side, it was received and displayed using:

gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp, encoding-name=H264, payload=96' ! rtph264depay ! h264parse ! avdec_h264 ! xvimagesink sync=false

I used cv2.VideoCapture like this:

cap = cv2.VideoCapture("udpsrc port=5000 ! 'application/x-rtp, encoding-name=H264, payload=96' ! rtph264depay ! h264parse ! avdec_h264 ! ! appsink", cv2.CAP_GSTREAMER)

Generally I get this error:

(python:3608): GStreamer-CRITICAL **: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
OpenCV Error: Unspecified error (GStreamer: unable to start pipeline
) in cvCaptureFromCAM_GStreamer, file /home/nvidia/opencv/modules/videoio/src/cap_gstreamer.cpp, line 881
VIDEOIO(cvCreateCapture_GStreamer (CV_CAP_GSTREAMER_FILE, filename)): raised OpenCV exception:

/home/nvidia/opencv/modules/videoio/src/cap_gstreamer.cpp:881: error: (-2) GStreamer: unable to start pipeline
in function cvCaptureFromCAM_GStreamer

Of note: I have tried cv2.VideoCapture('movie.avi') to load a local file, and it works. It's just the GStreamer pipeline that fails.

Also: I read a post saying I need to replace avdec_h264 with ffdec_h264 in scripts, but I don’t see either of these in your GitHub code?

Thanks for any ideas!

You may try converting to BGR with videoconvert before appsink.
I’m not familiar with the Python API, but I’m not sure quoted caps are accepted in the pipeline string.

You may try something like:

cap = cv2.VideoCapture("udpsrc port=5000 ! application/x-rtp, encoding-name=H264, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw, format=(string)BGR ! appsink", cv2.CAP_GSTREAMER)

or

cap = cv2.VideoCapture("udpsrc port=5000 ! application/x-rtp, encoding-name=H264, payload=96 ! rtph264depay ! h264parse ! omxh264dec ! videoconvert ! video/x-raw, format=(string)BGR ! appsink", cv2.CAP_GSTREAMER)

@Honey_Patouceul

Wow, this actually works. Amazing. Thank you SO MUCH!!!

I find it rather frustrating that these pipelines have so many important parts, yet there are so few examples of the Python API.

I suppose it’s fair to say the pipeline is pretty much code in itself and takes some understanding?

Also, @jkjung13, if you had an implementation like your previous examples, I’d be interested to learn from it too!

Thank you!

Fred

@fredmiller123, I wrote a follow-up post illustrating how to do real-time “object detection” with Faster R-CNN on JTX2. Be sure to check it out.

[url]https://jkjung-avt.github.io/faster-rcnn/[/url]

I wrote another follow-up post about how to do real-time object detection with the Single Shot MultiBox Detector (SSD) on JTX2. Please refer to the following blog post.

Hello everyone,
I need to live stream the feed from a camera connected to a Jetson Nano to a PC on the same network.
I am using SSH to connect the Nano and the PC. The camera on the Jetson is giving the feed, but I want the camera output on my remotely connected PC.

Hoping for an answer soon…

@sourabhsh55, did you get an answer to your question? I am in the same boat as you. Just curious.

Hi all,
Does anyone have code for live streaming a FLIR Grasshopper (Point Grey) camera using the Jetson Nano Developer Kit?
I’m really facing a number of issues with it…
Could someone please help?

@jkjung13, I got a Hikvision IR camera interfacing well with the Jetson Nano using this code. Thank you a million!!

Great Scott!

That’s brilliant! Would this work on a Jetson Nano?

@zaiene.mehdi Yes.