I gave up on passing a network RTSP stream to aruco directly. It just won't work with my level of skills at the moment.
However, since aruco works with a local camera, the commands below should in principle expose the network stream as a local camera (via a v4l2loopback device).
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, width=640, height=480, format=NV12, framerate=30/1 ! omxh265enc ! rtph265pay name=pay0 pt=96 config-interval=1"
stream ready at rtsp://127.0.0.1:8554/test
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! decodebin ! videoconvert ! v4l2sink device=/dev/video2
./aruco_test live:2
Opening camera index 2
VIDEOIO ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV
OpenCV Error: Unspecified error (GStreamer: unable to start pipeline
) in cvCaptureFromCAM_GStreamer, file /home/nvidia/build_opencv/opencv/modules/videoio/src/cap_gstreamer.cpp, line 887
VIDEOIO(cvCreateCapture_GStreamer(CV_CAP_GSTREAMER_V4L2, reinterpret_cast<char *>(index))): raised OpenCV exception:
/home/nvidia/build_opencv/opencv/modules/videoio/src/cap_gstreamer.cpp:887: error: (-2) GStreamer: unable to start pipeline
in function cvCaptureFromCAM_GStreamer
Exception :Could not open video
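One way to see which pixel format the loopback actually ended up with after the RTSP pipeline is started (assuming v4l-utils is installed; this is a diagnostic I would run, not something from the logs above):

```shell
# List the pixel formats /dev/video2 currently advertises.
# OpenCV's V4L2 backend only accepts a few packed formats
# (e.g. RGB3, YUYV), so NV12/I420 on the device would explain
# the "Pixel format ... is unsupported by OpenCV" error above.
v4l2-ctl --device=/dev/video2 --list-formats-ext
```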
And when I try to read from /dev/video2, gstreamer fails:
gst-launch-1.0 v4l2src device=/dev/video2 ! 'video/x-raw, format=RGB, width=640, height=480, framerate=30/1' ! queue ! videoconvert ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
However, when I create the loopback with the approach below, it works:
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1' ! nvvidconv ! 'video/x-raw, width=640, height=480, format=I420, framerate=30/1' ! videoconvert ! identity drop-allocation=1 ! 'video/x-raw, width=640, height=480, format=RGB, framerate=30/1' ! v4l2sink device=/dev/video2
./aruco_test live:2
Opening camera index 2
Gtk-Message: 15:11:53.522: Failed to load module "canberra-gtk-module"
(in:7950): Gtk-CRITICAL **: 15:11:53.555: IA__gtk_window_resize: assertion 'width > 0' failed
VIDEOIO ERROR: V4L2: getting property #1 is not supported
Frame:-1
Time detection=55.546 milliseconds nmarkers=0 images resolution=[640 x 480]
VIDEOIO ERROR: V4L2: getting property #1 is not supported
Frame:-1
What might be the issue with the former approach?
Is it because of the mix of formats (NV12 / RGB / I420)?
What would the correct line for loopbacking a network stream to /dev/video2 look like, and how can I check that it plays with gstreamer?
Thanks
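For reference on the format mix: NV12 and I420 are both planar 4:2:0 YUV at 12 bits per pixel (they differ only in chroma plane layout), while the RGB caps used above are packed 24 bits per pixel, so the buffer layouts are entirely different. A quick shell check of the frame sizes at 640x480:

```shell
# Bytes per frame at 640x480 for the formats mixed above.
# NV12 and I420: planar YUV 4:2:0, 12 bits per pixel.
# RGB: packed 8-bit-per-channel, 24 bits per pixel.
W=640; H=480
echo "NV12/I420: $(( W * H * 3 / 2 )) bytes per frame"
echo "RGB:       $(( W * H * 3 )) bytes per frame"
```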
Update: what I have managed to run is:
gst-launch-1.0 v4l2src device=/dev/video2 ! 'video/x-raw, format=I420, width=640, height=480, framerate=30/1' ! queue ! videoconvert ! xvimagesink
On the other hand, what works is
./aruco_test live:2
with a local v4l2loopback device generated with the sequence below:
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1' ! nvvidconv ! 'video/x-raw, width=640, height=480, format=I420, framerate=30/1' ! videoconvert ! identity drop-allocation=1 ! 'video/x-raw, width=640, height=480, format=RGB, framerate=30/1' ! v4l2sink device=/dev/video2
but it fails when /dev/video2 is fed by the sequence generated with the commands below:
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, width=640, height=480, format=NV12, framerate=30/1 ! omxh265enc ! rtph265pay name=pay0 pt=96 config-interval=1"
stream ready at rtsp://127.0.0.1:8554/test
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! decodebin ! videoconvert ! v4l2sink device=/dev/video2
./aruco_test live:2
Opening camera index 2
VIDEOIO ERROR: V4L2: Pixel format of incoming image is unsupported by OpenCV
It seems the format needs to be passed as RGB; it doesn't work with the RTSP path, where NV12 or I420 end up on the device. However, it works with the earlier example, where /dev/video2 is fed RGB. A wrong format conversion somewhere is probably the cause.
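If that is the cause, a possible fix (an untested sketch, not something I have verified) would be to mirror the caps of the working local pipeline on the RTSP path, converting to packed RGB before v4l2sink. The nvvidconv step is an assumption for the case where decodebin picks a hardware decoder that emits NVMM buffers:

```shell
# Untested sketch: force packed RGB onto the loopback, as in the
# pipeline that already works with aruco_test. nvvidconv is assumed
# to be needed only if decodebin selects a decoder that outputs
# (memory:NVMM) buffers, which videoconvert cannot consume directly.
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! \
  decodebin ! nvvidconv ! 'video/x-raw, format=I420' ! videoconvert ! \
  identity drop-allocation=1 ! \
  'video/x-raw, width=640, height=480, format=RGB, framerate=30/1' ! \
  v4l2sink device=/dev/video2
```

The readback check from the Update (v4l2src with I420 caps) would then need its caps changed to format=RGB to match.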