Hello,
We are using an Orin and a camera as listed below:
- Jetson Orin Devkit
- JetPack 5.0.2
- V4L2 camera
The camera documentation says I can capture a JPEG image with the gst-launch command below:
gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=1 ! "video/x-raw, format=(string)UYVY,
width=(int)3840, height=(int)2160" ! jpegenc ! filesink
location=filename.jpg
My questions are:
- How can I capture RGB or BGR images with NVIDIA's GStreamer?
- With these RGB/BGR images we will build a cv::Mat and apply various OpenCV operations.
I know I need to use appsink, a callback, nvvidconv, and the OpenCV VideoCapture class.
I'd like to find some short, simple example code for this.
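To make the question concrete, here is a rough sketch of the kind of code I am hoping for (untested; it assumes OpenCV was built with GStreamer support, and that nvvidconv can convert UYVY to BGRx through NVMM memory as described in the Accelerated GStreamer guide — please correct the pipeline if I have it wrong):

```python
# Rough sketch (untested): v4l2src -> UYVY caps -> nvvidconv (hardware
# conversion on the VIC) -> BGRx -> videoconvert (drops the padding
# byte) -> BGR -> appsink, which cv2.VideoCapture reads from.
# Device path, resolution, and frame rate match my v4l2-ctl output.

def build_pipeline(device="/dev/video0", width=1920, height=1080, fps=30):
    """Return a GStreamer pipeline string ending in appsink, suitable
    for cv2.VideoCapture with the CAP_GSTREAMER backend."""
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw, format=(string)UYVY, width=(int){width}, "
        f"height=(int){height}, framerate=(fraction){fps}/1 ! "
        "nvvidconv ! video/x-raw(memory:NVMM), format=(string)NV12 ! "
        "nvvidconv ! video/x-raw, format=(string)BGRx ! "
        "videoconvert ! video/x-raw, format=(string)BGR ! "
        "appsink drop=true max-buffers=1"
    )

def capture_one_frame(path="frame.jpg"):
    """Grab a single BGR frame and write it to disk (runs on the Jetson)."""
    import cv2  # imported lazily so build_pipeline() has no dependencies
    cap = cv2.VideoCapture(build_pipeline(), cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError("failed to open GStreamer pipeline")
    ok, frame = cap.read()  # frame is a BGR numpy array (cv::Mat in C++)
    cap.release()
    if not ok:
        raise RuntimeError("failed to read a frame")
    cv2.imwrite(path, frame)
```

If this is roughly right, I assume the same pipeline string would also work from C++ via cv::VideoCapture(pipeline, cv::CAP_GSTREAMER), giving a cv::Mat directly.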
Thank you very much in advance.
F.Y.I.
ubuntu:~$ v4l2-ctl -d /dev/video0 --all
Driver Info:
    Driver name      : tegra-video
    Card type        : vi-output, ar0230 31-0044
    Bus info         : platform:tegra-capture-vi:3
    Driver version   : 5.10.104
    Capabilities     : 0x84200001
        Video Capture
        Streaming
        Extended Pix Format
        Device Capabilities
    Device Caps      : 0x04200001
        Video Capture
        Streaming
        Extended Pix Format
Media Driver Info:
    Driver name      : tegra-camrtc-ca
    Model            : NVIDIA Tegra Video Input Device
    Serial           :
    Bus info         :
    Media version    : 5.10.104
    Hardware revision: 0x00000003 (3)
    Driver version   : 5.10.104
Interface Info:
    ID               : 0x03000017
    Type             : V4L Video
Entity Info:
    ID               : 0x00000015 (21)
    Name             : vi-output, ar0230 31-0044
    Function         : V4L2 I/O
    Pad 0x01000016   : 0: Sink
        Link 0x0200001b: from remote pad 0x100000c of entity '13e40000.host1x:nvcsi@15a00000-': Data, Enabled
Priority: 2
Video input : 0 (Camera 3: no power)
Format Video Capture:
    Width/Height      : 1920/1080
    Pixel Format      : 'UYVY' (UYVY 4:2:2)
    Field             : None
    Bytes per Line    : 3840
    Size Image        : 4147200
    Colorspace        : sRGB
    Transfer Function : Default (maps to sRGB)
    YCbCr/HSV Encoding: Default (maps to ITU-R 601)
    Quantization      : Default (maps to Limited Range)
    Flags             :
Streaming Parameters Video Capture:
    Capabilities     : timeperframe
    Frames per second: 30.000 (30/1)
    Read buffers     : 0