I have a CSI camera that can capture images at a high resolution (4032 x 3040). Currently, I capture the images through an OpenCV VideoCapture object and then resize them to a lower resolution (640 x 640).
Because the resize operation in OpenCV adds a slight delay, I want to use GStreamer to scale the input image and send the high-res and low-res streams to two different virtual video devices (/dev/video*) created with v4l2loopback.
The problem is that when I view the output of the two devices, both of them are at the lower resolution of 640 x 640. This is the command I use to view the stream:
gst-launch-1.0 v4l2src device=/dev/video1 ! xvimagesink (and similarly for /dev/video2).
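In case it helps diagnose the negotiation, here is how I can request an explicit resolution from each loopback device instead of letting v4l2src pick one (a sketch; the device nodes and modes are the ones from my setup above):

```shell
# Ask the high-res loopback device for the full sensor resolution explicitly:
gst-launch-1.0 v4l2src device=/dev/video1 \
    ! "video/x-raw,width=4032,height=3040" \
    ! xvimagesink

# And the low-res device at 640x640:
gst-launch-1.0 v4l2src device=/dev/video2 \
    ! "video/x-raw,width=640,height=640" \
    ! xvimagesink
```

If the first pipeline fails to negotiate, the loopback device itself is only being fed 640 x 640 frames.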
I can’t figure out why I am getting both virtual streams in lower resolution.
Also, if there is any other way to get images at different resolutions, please feel free to suggest. Thank you!
Thank you for your reply! With the pipeline you suggested, I can only get images at the lower resolution. The reason I used v4l2loopback was to create two virtual video streams, one at high resolution (4032 x 3040) and the other at low resolution (640 x 640).
I modified the pipeline I was using according to your suggestion to use nvvidconv:
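Roughly, the shape of what I tried is the following (a sketch of my attempt; the nvarguscamerasrc caps and device-node assignments are specific to my setup):

```shell
# Capture once, tee into two branches: full resolution to /dev/video1,
# nvvidconv-downscaled 640x640 to /dev/video2.
gst-launch-1.0 nvarguscamerasrc \
    ! "video/x-raw(memory:NVMM),width=4032,height=3040" \
    ! tee name=t \
  t. ! queue ! nvvidconv ! "video/x-raw" \
     ! v4l2sink device=/dev/video1 \
  t. ! queue ! nvvidconv ! "video/x-raw,width=640,height=640" \
     ! v4l2sink device=/dev/video2
```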
When writing to a v4l2loopback device, you would add identity drop-allocation=true before v4l2sink.
Note that v4l2loopback nodes may result in significant CPU usage.
Try (here using BGRx):
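A pipeline along those lines might look like this (a sketch; the sensor mode, framerate, and device nodes are assumptions to be adapted to your setup):

```shell
# One capture, tee'd into two branches; each branch converts with nvvidconv,
# outputs BGRx, and feeds its own v4l2loopback node via v4l2sink.
# identity drop-allocation=true avoids buffer-allocation issues with v4l2loopback.
gst-launch-1.0 nvarguscamerasrc \
    ! "video/x-raw(memory:NVMM),width=4032,height=3040,framerate=30/1" \
    ! tee name=t \
  t. ! queue ! nvvidconv ! "video/x-raw,format=BGRx" \
     ! identity drop-allocation=true ! v4l2sink device=/dev/video1 \
  t. ! queue ! nvvidconv ! "video/x-raw,width=640,height=640,format=BGRx" \
     ! identity drop-allocation=true ! v4l2sink device=/dev/video2
```

Note that each branch sets its own caps after nvvidconv, so the high-res branch keeps 4032 x 3040 while the low-res branch is scaled to 640 x 640 in hardware.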