Hi, I am trying to use my Jetson TX2 DevKit camera (the CSI-2 P5V27C Sunny Optical camera) on my Xavier.
Failed attempt:
I installed opencv via the instructions here: Build OpenCV 3.4 on NVIDIA Jetson AGX Xavier Developer Kit - JetsonHacks
In Python I ran the following code (based on the answer here: python - Using tx2 dev-kit CSI camera on the Jetson xavier in Python3 - Stack Overflow)
import cv2
gst_str = "nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1024, height=768, framerate=120/1, format=NV12' ! nvvidconv flip-method=0 ! appsink"
cap = cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
ret, frame = cap.read()
However, cap.read() just returns (False, None)
:-(
Any help would be appreciated!
One possible cause is that your OpenCV version has no GStreamer support (the one in JetPack has neither GStreamer nor CUDA support). You may read this topic.
If that turns out to be the case, you would just build and install your own OpenCV version. This script should do the job. Note that the OpenCV sources and build will require a few GBs of available disk space.
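To check whether your current cv2 build has GStreamer support before rebuilding, a small helper like this may be enough (my own sketch, not from the thread; it just scans the text returned by cv2.getBuildInformation()):

```python
def gstreamer_enabled(build_info):
    # The "Video I/O" section of cv2.getBuildInformation() contains a line
    # like "GStreamer:  YES (1.14.5)" or "GStreamer:  NO".
    for line in build_info.splitlines():
        if "GStreamer" in line:
            return "YES" in line
    return False

# On a real system you would call it with the actual build summary:
#   import cv2
#   print(gstreamer_enabled(cv2.getBuildInformation()))
print(gstreamer_enabled("Video I/O:\n  GStreamer:  YES (1.14.5)"))
```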
Do you get video when running nvgstcapture-1.0? If so, then your camera should be working OK.
You would also need to change your pipeline (assuming you’re running R28.2.1) to:
nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1280, height=720, framerate=120/1 ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink
if you want to use 120 fps (not sure what OpenCV will be able to do with this rate; you may adjust it). This would be a typical pipeline for OpenCV processing in BGR format.
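If you build that pipeline string in Python, it can be convenient to parameterize it. This is my own sketch (the parameter names, defaults, and the flip-method knob are assumptions, not part of the original post), producing the BGR pipeline above:

```python
def bgr_pipeline(width=1280, height=720, fps=120, flip=0):
    # Assembles the BGR capture pipeline quoted above:
    # nvarguscamerasrc (NVMM NV12) -> nvvidconv -> BGRx -> videoconvert -> BGR.
    return (
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), format=NV12, "
        "width={w}, height={h}, framerate={f}/1 ! "
        "nvvidconv flip-method={m} ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    ).format(w=width, h=height, f=fps, m=flip)

# With a GStreamer-enabled cv2 build you would then open:
#   cap = cv2.VideoCapture(bgr_pipeline(), cv2.CAP_GSTREAMER)
print(bgr_pipeline())
```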
If your OpenCV processing can be done in NV12 format, you may try this pipeline:
nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1280, height=720, framerate=120/1 ! nvvidconv ! video/x-raw, format=NV12, width=1024, height=768 ! appsink
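If you do keep NV12 all the way to the appsink, keep the frame layout in mind: a full-resolution Y plane followed by a half-resolution interleaved UV plane, i.e. 12 bits per pixel. A quick sanity check for expected buffer sizes (my own sketch, not from the thread):

```python
def nv12_frame_bytes(width, height):
    # NV12 = width*height luma bytes + width*height/2 chroma bytes,
    # i.e. 1.5 bytes per pixel.
    return width * height * 3 // 2

print(nv12_frame_bytes(1024, 768))  # 1179648 bytes per frame
```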
With this sensor you may use:
I. cpp code from source thread
II. terminal execution:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1024, height=768, framerate=120/1, format=NV12' ! nvvidconv flip-method=0 ! nvegltransform ! nveglglessink -e
III. the Python code [rename tegra-cam.txt to tegra-cam.py] from the reference thread
The latter uses:
def open_cam_onboard(width, height):
    # On versions of L4T prior to 28.1, add 'flip-method=2' into gst_str
    # Use Jetson onboard camera
    # gst_str = ("nvcamerasrc ! "
    gst_str = ("nvarguscamerasrc ! "
               "video/x-raw(memory:NVMM), width=(int)2592, height=(int)1458, "
               "format=(string)I420, framerate=(fraction)30/1 ! "
               "nvvidconv ! video/x-raw, width=(int){}, height=(int){}, "
               "format=(string)BGRx ! "
               "videoconvert ! appsink").format(width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
source
Thus, for installing OpenCV you would rather:
- not select the OpenCV component when flashing from JetPack
- get opencv installed e.g. with
curl -s https://raw.githubusercontent.com/AastaNV/JEP/master/script/install_opencv3.4.0_Xavier.sh | sudo bash
sudo ldconfig -v
- set up Python 3 and run
python3 tegra-cam.py
- compile the cpp file and run
g++ -o simple_opencv -Wall -std=c++11 simple_opencv.cpp $(pkg-config --libs opencv)
./simple_opencv
simple_opencv.cpp (611 Bytes)
tegra-cam.txt (6.18 KB)
Thanks guys. I was able to get the video working using nvgstcapture-1.0. Also, I realized the cv2 version I was using was not built with GStreamer support. This can be checked within Python by running cv2.getBuildInformation().
The issue was that the default instructions from Build OpenCV 3.4 on NVIDIA Jetson AGX Xavier Developer Kit - JetsonHacks install for Python 2. To install for Python 3 I incorrectly ran sudo apt install python3-opencv, which does not include GStreamer support.
To modify the above instructions for Python 3, add the following option to the cmake call in buildOpenCV.sh at line 161:
-D PYTHON_DEFAULT_EXECUTABLE=/usr/bin/python3 \
Then create a link to cv2 if using a virtual env:
ln /usr/local/lib/python3.6/site-packages/cv2.cpython-36m-aarch64-linux-gnu.so .venv/lib/python3.6/site-packages/cv2.so
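The exact .so filename depends on your interpreter's ABI tag, so if you are unsure which file to link, this snippet (my own sketch) derives the expected name for the running Python:

```python
import sysconfig

def cv2_so_name():
    # SOABI is e.g. "cpython-36m-aarch64-linux-gnu" on JetPack's Python 3.6,
    # giving cv2.cpython-36m-aarch64-linux-gnu.so as the built extension name.
    return "cv2.{}.so".format(sysconfig.get_config_var("SOABI"))

print(cv2_so_name())
```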
Could someone refine the Python code for opening the devkit onboard camera and post a ‘minimal’ excerpt that will pop up a camera window and show the stream, please?
I have not come up with anything better than using the existing tegra-cam file, but when I dissect fragments of it to reduce the number of lines of code, it always fails.
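A minimal excerpt along those lines might look like the sketch below, distilled from the pipelines earlier in the thread. It is untested on hardware and assumes a cv2 build with GStreamer support and an attached CSI sensor; the function names, defaults, and the 30 fps rate are my own choices:

```python
def camera_pipeline(width=1024, height=768, fps=30, flip=0):
    # Capture pipeline: nvarguscamerasrc (NVMM NV12) converted to BGR
    # so cv2.VideoCapture can hand back ordinary BGR frames.
    return (
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), format=NV12, "
        "width={w}, height={h}, framerate={f}/1 ! "
        "nvvidconv flip-method={m} ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    ).format(w=width, h=height, f=fps, m=flip)

def show_camera(width=1024, height=768):
    import cv2  # imported here so camera_pipeline() works without OpenCV
    cap = cv2.VideoCapture(camera_pipeline(width, height), cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError("camera failed to open; check GStreamer support in cv2")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            cv2.imshow("CSI camera", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

# On a Jetson with a camera attached, call: show_camera()
print(camera_pipeline())
```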