I have a CSI camera that can capture images at a high resolution (4032 x 3040). Currently, I am capturing frames through an OpenCV VideoCapture object and then resizing them to a lower resolution (640 x 640).
Because this resizing step in OpenCV adds a slight delay, I want to use GStreamer to scale the input instead and send the high-res and low-res streams to two different virtual video devices (/dev/video*) created with v4l2loopback.
I used the command below to do this:
gst-launch-1.0 nvarguscamerasrc gainrange='3 3' ispdigitalgainrange='3 3' exposuretimerange='1000000 1000000' \
  ! 'video/x-raw(memory:NVMM), width=(int)4032, height=(int)3040, format=(string)NV12, framerate=(fraction)30/1' \
  ! nvvidconv flip-method=0 \
  ! tee name=mytee \
  ! queue ! v4l2sink device=/dev/video1 \
  mytee. ! videoconvert ! videoscale \
  ! 'video/x-raw, width=(int)640, height=(int)640' \
  ! v4l2sink device=/dev/video2
The problem is that when I view the output of the two devices, both streams are at the lower resolution of 640 x 640. This is the command I use to view a stream:
gst-launch-1.0 v4l2src device=/dev/video1 ! xvimagesink (and similarly for /dev/video2)
I can’t figure out why both virtual streams come out at the lower resolution.
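My suspicion is that, since all tee branches share the caps negotiated on the tee's sink pad and nothing upstream pins the resolution, the 640 x 640 caps end up being negotiated for both branches. A variant I was considering (untested sketch: the added capsfilter after nvvidconv and the queue on the second branch are my assumptions, not verified):

```shell
gst-launch-1.0 nvarguscamerasrc gainrange='3 3' ispdigitalgainrange='3 3' exposuretimerange='1000000 1000000' \
  ! 'video/x-raw(memory:NVMM), width=(int)4032, height=(int)3040, format=(string)NV12, framerate=(fraction)30/1' \
  ! nvvidconv flip-method=0 \
  ! 'video/x-raw, width=(int)4032, height=(int)3040' \
  ! tee name=mytee \
  mytee. ! queue ! v4l2sink device=/dev/video1 \
  mytee. ! queue ! videoconvert ! videoscale \
  ! 'video/x-raw, width=(int)640, height=(int)640' \
  ! v4l2sink device=/dev/video2
```

If that is wrong, I would appreciate a correction.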
Also, if there is any other way to get images at different resolutions, please feel free to suggest. Thank you!