I’m developing an RTSP streaming system on a Jetson Nano.
My design is as follows:
1. Get video from a GStreamer pipeline into OpenCV
2. Process the images with OpenCV functions
3. Stream the processed images
I have already completed steps 1 and 2, but I am stuck on step 3.
I’m thinking of passing each processed image to a cv::VideoWriter and streaming it over RTSP with GStreamer. So I created a cv::VideoWriter initialized with a second pipeline (separate from the input one), roughly "appsrc ! videoconvert ! (encoder) ! (RTP payloader) ! (sink)" for streaming.
I confirmed that the streaming works, but it is very slow (about 5 seconds of delay).
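Concretely, the writer pipeline I mean has roughly this shape (sketched in Python here; the element names x264enc/rtph264pay and the UDP target are placeholders for whatever encoder, payloader, and sink you use):

```python
# Sketch of the current (slow) writer pipeline. The element choices
# (x264enc, rtph264pay) and the UDP host/port are placeholders; the
# relevant part is the software videoconvert in the middle.
writer_pipeline = (
    "appsrc ! "
    "videoconvert ! "              # software (CPU) conversion -- suspected bottleneck
    "x264enc tune=zerolatency ! "  # software H.264 encoder as a stand-in
    "rtph264pay ! "
    "udpsink host=127.0.0.1 port=5000"
)
# In C++ this string is passed to the GStreamer backend of cv::VideoWriter:
#   cv::VideoWriter writer(writer_pipeline, cv::CAP_GSTREAMER, 0, fps, cv::Size(w, h));
print(writer_pipeline)
```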
My analysis is that the cause is the videoconvert element in the GStreamer pipeline. I want to use nvvidconv or nvvideoconvert instead, but those elements cannot accept RGB or BGR input, while OpenCV's VideoCapture/VideoWriter interfaces only handle 3-channel frames (RGB or BGR; they cannot handle RGBA and so on).
How can I stream using nvvidconv or nvvideoconvert?
(Or is there another good way to do the format conversion?)
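To illustrate the problem, this is the shape of pipeline I would like to use; as far as I can tell it cannot link, because nvvidconv has no 3-channel BGR sink caps (the encoder name nvv4l2h264enc is just an example):

```python
# Desired pipeline shape. nvvidconv rejects 3-channel BGR input, which is
# all that OpenCV hands to appsrc, so this fails to link as written.
desired_pipeline = (
    "appsrc caps=video/x-raw,format=BGR ! "
    "nvvidconv ! "        # HW converter: accepts RGBA/BGRx/NV12 etc., not BGR
    "nvv4l2h264enc ! "    # example HW encoder
    "rtph264pay ! "
    "udpsink host=127.0.0.1 port=5000"
)
print(desired_pipeline)
```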
I built my system with reference to DeepStream. In the DeepStream samples, the stream is first sent out over UDP, received again on the same machine, and re-streamed as RTSP.
My RTSP streaming reference is test-launch.c from this repository: [url]https://github.com/GStreamer/gst-rtsp-server/tree/master/examples[/url]
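The re-streaming step is essentially what test-launch does: you hand gst-rtsp-server a launch description that picks the UDP stream back up and re-payloads it under the name pay0. A sketch of the launch string I mean (port and caps must match the udpsink side; H264/payload 96 are assumptions from my setup):

```python
# Launch description for gst-rtsp-server's test-launch example: receive the
# local UDP stream and re-payload it so the server can serve it over RTSP.
udp_port = 5000  # must match the udpsink port in the writer pipeline
launch = (
    f"( udpsrc port={udp_port} "
    'caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! '
    "rtph264depay ! rtph264pay name=pay0 pt=96 )"
)
# Then, on the command line:
#   ./test-launch '<the launch string above, without the outer quotes>'
print(launch)
```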
My input is a USB camera, so I’m using the v4l2src element.
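For completeness, step 1 (the part that already works) uses a capture pipeline of roughly this shape; the device path and resolution are examples from my setup:

```python
# Capture side: v4l2src feeding an appsink that cv::VideoCapture reads from.
# Device path, resolution, and framerate are examples, not fixed values.
capture_pipeline = (
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,width=640,height=480,framerate=30/1 ! "
    "videoconvert ! video/x-raw,format=BGR ! "  # OpenCV expects BGR frames
    "appsink"
)
# In C++:  cv::VideoCapture cap(capture_pipeline, cv::CAP_GSTREAMER);
print(capture_pipeline)
```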