I am currently working on a project where a Jetson Nano with an IMX219 CSI camera streams live video to a web server using Flask. However, when I tried to stream, I first got a green screen rather than the camera image itself. After some research, it looked like using Python's NanoCamera library instead of a GStreamer pipeline would solve the streaming problem. However, far from solving it, the camera now gives me an "Unable to open" error. I then removed the NanoCamera library and reinstalled OpenCV, but nothing changed. What would you suggest, first to access the camera at all, and then to stream its live data?
Hello everyone,
I'm currently experiencing the same problem and haven't found a solution yet. I tried the NanoCamera library as well, but wasn't successful.
I'm trying to stream the video feed from my Raspberry Pi Camera V2 to a Flask web server. I'm using a Jetson Nano A02, and my code looks like this example.
When I connect to the web server, all I get is a green image. To initialize the camera I used camera = cv2.VideoCapture(cv2.CAP_V4L2) instead of camera = cv2.VideoCapture(0) or a GStreamer pipeline, because with those I often got the !image.empty() in function 'imencode' error.
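For context, the Flask streaming pattern I'm using boils down to the sketch below. The route, function names, and camera index are my own illustrative choices, not taken from the linked example:

```python
# Minimal MJPEG-over-HTTP sketch of the Flask streaming pattern described
# above. All names here (mjpeg_part, run_server, /video_feed) are
# illustrative assumptions, not from the original script.

BOUNDARY = b"frame"

def mjpeg_part(jpg_bytes: bytes) -> bytes:
    """Wrap one encoded JPEG as a multipart/x-mixed-replace chunk."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n\r\n" + jpg_bytes + b"\r\n")

def run_server():
    # Imports kept local so the helper above stays importable without
    # OpenCV/Flask installed.
    import cv2
    from flask import Flask, Response

    app = Flask(__name__)
    camera = cv2.VideoCapture(0)  # on a Jetson, a GStreamer pipeline string
                                  # should go here instead (see below)

    def gen_frames():
        while True:
            ok, frame = camera.read()
            if not ok:
                break  # camera gone or read failed
            ok, jpg = cv2.imencode(".jpg", frame)
            if not ok:
                continue  # skip frames that fail to encode
            yield mjpeg_part(jpg.tobytes())

    @app.route("/video_feed")
    def video_feed():
        return Response(gen_frames(),
                        mimetype="multipart/x-mixed-replace; boundary=frame")

    app.run(host="0.0.0.0", port=5000)
```

Calling run_server() and opening /video_feed in a browser then shows the stream; this is where I only see green.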
My device is listed as video0, and I can access the camera stream with a CSI-Camera example script.
The IMX219 is a Bayer sensor with a non-byte-aligned pixel depth (10-bit RAW), so OpenCV cannot interpret its frames correctly when reading it directly through V4L2; that is why you see a green image instead of the picture. Furthermore, debayering on the CPU would be very slow on a Nano.
You'd better use Argus as mentioned by @ShaneCCC, leveraging HW debayering, through OpenCV VideoCapture's GStreamer backend, which also lets the hardware do most of the conversion to BGR for OpenCV:
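A sketch of what that could look like. The resolution, framerate, flip-method, and the drop=1 appsink option are example values you should adapt, not a canonical pipeline:

```python
# Open the IMX219 through nvarguscamerasrc (Argus), so debayering and the
# NV12 -> BGRx conversion happen in hardware; only the final BGRx -> BGR
# step runs on the CPU via videoconvert. Parameter values are examples.

def gst_pipeline(width=1280, height=720, fps=30, flip=0):
    """Build an nvarguscamerasrc pipeline string for cv2.VideoCapture."""
    return (
        f"nvarguscamerasrc ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"format=NV12, framerate={fps}/1 ! "
        f"nvvidconv flip-method={flip} ! "
        f"video/x-raw, format=BGRx ! "
        f"videoconvert ! video/x-raw, format=BGR ! "
        f"appsink drop=1"
    )

def open_camera():
    # OpenCV must be built with GStreamer support for CAP_GSTREAMER to work.
    import cv2
    cap = cv2.VideoCapture(gst_pipeline(), cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError("failed to open camera: check that no other "
                           "process holds it and that OpenCV has GStreamer")
    return cap
```

In the Flask code above you would then call open_camera() instead of cv2.VideoCapture(cv2.CAP_V4L2). If open_camera() raises, check cv2.getBuildInformation() for "GStreamer: YES", and make sure nothing else (e.g. a leftover nvargus process) is still holding the sensor.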