I’m developing an RTSP streaming system on a Jetson Nano.
My design is as follows:
- Get video from a GStreamer pipeline into OpenCV
- Process the images with OpenCV functions
- Stream the processed images over RTSP
I have already completed steps 1 and 2, but I’m stuck on step 3.
I’m thinking of passing the processed images to cv::VideoWriter and streaming RTSP with GStreamer.
So I created a VideoWriter initialized with a second pipeline (separate from the input pipeline).
The pipeline is "appsrc ! videoconvert ! encode ! rtppay ! sink" for streaming.
I confirmed that the streaming works, but it is very slow (about 5 seconds of delay).
My analysis suggests the cause is the videoconvert element in the GStreamer pipeline.
I want to use nvvidconv or nvvideoconvert instead, but these elements cannot accept RGB or BGR input.
And OpenCV's VideoWriter can only emit 3-channel formats (RGB or BGR; it cannot output RGBA and the like).
How can I stream using nvvidconv or nvvideoconvert?
(Or is there any other good way to convert the format?)