Unable to get RGB buffer in gstreamer/python pipeline

Hello,

We have installed DeepStream 5.1 and are using it with Python 3.6.9 and OpenCV 4.3.0. We built this OpenCV locally with GStreamer support.

The GStreamer pipeline is as follows:
uri = f"rtspsrc location=rtsp://admin:admin12345@192.168.0.103/live ! decodebin ! nvvideoconvert ! appsink"
cap = cv2.VideoCapture(uri, cv2.CAP_GSTREAMER)
When we get a frame from this cap object as follows:
ret, frame = cap.read()
We observed that this frame is grayscale, i.e., its shape is (1080, 1920), so it is not an RGB image.
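A quick way to confirm what the capture actually delivered is to inspect the frame's shape. A minimal sketch in plain Python (the shape tuples below are stand-ins for what `cap.read()` returns):

```python
def describe_frame_shape(shape):
    """Classify a frame by its numpy-style shape tuple."""
    if len(shape) == 2:
        h, w = shape
        return f"grayscale {w}x{h}"
    h, w, c = shape
    return f"color {w}x{h}, {c} channels"

# What the pipeline above returns vs. what we actually want:
print(describe_frame_shape((1080, 1920)))     # grayscale 1920x1080
print(describe_frame_shape((1080, 1920, 3)))  # color 1920x1080, 3 channels
```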

We tested with the following pipeline:
uri = f"rtspsrc location=rtsp://admin:admin12345@192.168.0.103/live latency=0 ! queue ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! video/x-raw, width={frame_width}, height={frame_height}, format=RGB ! appsink"
However, this gives an error:
0:00:00.040978788 24568 0x1343190 ERROR GST_PIPELINE grammar.y:721:gst_parse_perform_link: could not link nvvideoconvert0 to appsink0, nvvideoconvert0 can't handle caps video/x-raw, width=(int)1280, height=(int)720, format=(string)RGB
[ WARN:0] global /home/vast/opencv-4.3/opencv/modules/videoio/src/cap_gstreamer.cpp (713) open OpenCV | GStreamer warning: Error opening bin: could not link nvvideoconvert0 to appsink0, nvvideoconvert0 can't handle caps video/x-raw, width=(int)1280, height=(int)720, format=(string)RGB
[ WARN:0] global /home/vast/opencv-4.3/opencv/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created

We also tested some variations:

uri = f"rtspsrc location=rtsp://admin:admin12345@192.168.0.103/live latency=0 ! queue ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! video/x-raw(memory:NVMM), width={frame_width}, height={frame_height}, format=(string)RGBA ! appsink"

These did not help either. Could you please provide any suggestions?

The following is from the OpenCV build information:
Video I/O:
DC1394: YES (2.2.5)
FFMPEG: YES
avcodec: YES (57.107.100)
avformat: YES (57.83.100)
avutil: YES (55.78.100)
swscale: YES (4.8.100)
avresample: YES (3.7.0)
GStreamer: YES (1.14.5)
v4l/v4l2: YES (linux/videodev2.h)

Please help.

Please use “gst-inspect-1.0 nvvideoconvert” to query which formats are supported by nvvideoconvert.

What is your platform?

Hi,

The setup is as follows

  1. Intel i5-8th Generation
  2. 8GB RAM
  3. GTX 1050 4GB
  4. Driver Version: 460.32.03
  5. CUDA Version 10.0.130
  6. Ubuntu 18.04
  7. GStreamer 1.14.5
  8. OpenCV 4.3.0, locally compiled on the platform with FFmpeg and GStreamer support
  9. Video Codec version: Video_Codec_SDK_11.0.10
  10. Python 3.6.9

The output of gst-inspect-1.0 nvvideoconvert is as follows:

Factory Details:
Rank primary (256)
Long-name NvVidConv Plugin
Klass Filter/Converter/Video/Scaler
Description Converts video from one colorspace to another & Resizes
Author NVIDIA Corporation. Post on Deepstream SDK forum for any queries @ DeepStream SDK - NVIDIA Developer Forums

Plugin Details:
Name nvvideoconvert
Description video Colorspace conversion & scaler
Filename /usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libgstnvvideoconvert.so
Version 1.2.3
License Proprietary
Source module nvvideoconvert
Binary package GStreamer nvvideoconvert Plugin
Origin URL http://nvidia.com/

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstBaseTransform
                         +----Gstnvvideoconvert

Pad Templates:
SINK template: ‘sink’
Availability: Always
Capabilities:
video/x-raw(memory:NVMM)
format: { (string)I420, (string)NV12, (string)P010_10LE, (string)BGRx, (string)RGBA, (string)GRAY8, (string)GBR }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: { (string)I420, (string)P010_10LE, (string)NV12, (string)BGRx, (string)RGBA, (string)GRAY8, (string)GBR }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]

SRC template: ‘src’
Availability: Always
Capabilities:
video/x-raw(memory:NVMM)
format: { (string)I420, (string)NV12, (string)P010_10LE, (string)BGRx, (string)RGBA, (string)GRAY8, (string)GBR }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw
format: { (string)I420, (string)NV12, (string)P010_10LE, (string)BGRx, (string)RGBA, (string)GRAY8, (string)GBR }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]

Element has no clocking capabilities.
Element has no URI handling capabilities.

Pads:
SINK: ‘sink’
Pad Template: ‘sink’
SRC: ‘src’
Pad Template: ‘src’

Element Properties:
name : The name of the object
flags: readable, writable
String. Default: “nvvideoconvert0”
parent : The parent of the object
flags: readable, writable
Object of type “GstObject”
qos : Handle Quality-of-Service events
flags: readable, writable
Boolean. Default: false
silent : Produce verbose output ?
flags: readable, writable
Boolean. Default: false
gpu-id : Set GPU Device ID for operation
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 4294967295 Default: 0
output-buffers : number of output buffers
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
Unsigned Integer. Range: 1 - 4294967295 Default: 4
interpolation-method: Set interpolation methods
flags: readable, writable, controllable
Enum “GstNvInterpolationMethod” Default: 6, “Default”
(0): Nearest - Nearest
(1): Bilinear - Bilinear
(2): Algo-1 - GPU - Cubic, VIC - 5 Tap
(3): Algo-2 - GPU - Super, VIC - 10 Tap
(4): Algo-3 - GPU - LanzoS, VIC - Smart
(5): Algo-4 - GPU - Ignored, VIC - Nicest
(6): Default - GPU - Nearest, VIC - Nearest
src-crop : Pixel location left:top:width:height
Use string with values of crop location to set the property.
e.g. 20:20:40:50
flags: readable, writable, changeable only in NULL or READY state
String. Default: “0:0:0:0”
dest-crop : Pixel location left:top:width:height
Use string with values of crop location to set the property.
e.g. 20:20:40:50
flags: readable, writable, changeable only in NULL or READY state
String. Default: “0:0:0:0”
compute-hw : Compute Scaling HW
flags: readable, writable, controllable
Enum “GstNvComputeHWType” Default: 0, “Default”
(0): Default - Default, GPU for Tesla, VIC for Jetson
(1): GPU - GPU
(2): VIC - VIC
nvbuf-memory-type : Type of NvBufSurface Memory to be allocated for output buffers
flags: readable, writable, changeable only in NULL or READY state
Enum “GstNvBufMemoryType” Default: 0, “nvbuf-mem-default”
(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory
(2): nvbuf-mem-cuda-device - Allocate Device cuda memory
(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory

You don’t need to show me the gst-inspect-1.0 result. Just so you know, the RGB format is not supported by nvvideoconvert.

The following pipeline can work on our dGPU platform:
gst-launch-1.0 rtspsrc location=rtsp://xxxx ! queue ! decodebin ! queue ! nvvideoconvert ! ‘video/x-raw(memory:NVMM), width=720, height=480, format=(string)RGBA’ ! queue ! appsink sync=0 async=0 enable-last-sample=0 drop=1 max-buffers=2

Hi,
This also failed with the same error. Taking a hint from the gst-inspect output, we tried all the format variations: GBR, BGRx, and RGBA, but we get errors similar to those posted in the first message, and the pipeline is not created.

There seem to be no errors with the GStreamer or DeepStream 5.1 installations, as far as we could tell.

We will also try on an AWS T4 machine and post the results.

If you want to use it with OpenCV, the caps should be ‘video/x-raw, width=720, height=480, format=(string)RGBA’. You cannot use NVMM memory with OpenCV.
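For example, the pipeline string for `cv2.VideoCapture` could be assembled as follows (a sketch; the RTSP URL, credentials, width, and height are placeholders):

```python
def build_pipeline(rtsp_url, width, height):
    """Assemble an OpenCV-compatible GStreamer pipeline: system memory, RGBA."""
    return (
        f"rtspsrc location={rtsp_url} latency=0 ! queue ! decodebin ! queue ! "
        f"nvvideoconvert ! "
        f"video/x-raw, width={width}, height={height}, format=(string)RGBA ! "
        f"queue ! appsink"
    )

pipe = build_pipeline("rtsp://user:pass@192.168.0.103/live", 720, 480)
print(pipe)
# cap = cv2.VideoCapture(pipe, cv2.CAP_GSTREAMER)  # then check cap.isOpened()
```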

Sure. Got it.

We are seeing that this pipeline works when run with gst-launch-1.0 (invoked via subprocess):
cmd = ["gst-launch-1.0",
"rtspsrc", "location=rtsp://admin:admin12345@192.168.0.103/Streaming/Channels/101", "!",
"queue", "!",
"decodebin", "!",
"queue", "!",
"nvvideoconvert", "!",
"video/x-raw,width=1920,height=1080,format=RGBA", "!",
"queue", "!",
"fakesink"
]

But we have no way to get this into a Python buffer. When we tried with appsink, gst-launch-1.0 kept dumping core.
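One way to reuse the working gst-launch command from Python is to join the argument list into the single pipeline string that `cv2.VideoCapture` expects, swapping `fakesink` for `appsink`. A sketch (the RTSP URL and credentials are placeholders):

```python
cmd = ["gst-launch-1.0",
       "rtspsrc", "location=rtsp://user:pass@192.168.0.103/Streaming/Channels/101", "!",
       "queue", "!",
       "decodebin", "!",
       "queue", "!",
       "nvvideoconvert", "!",
       "video/x-raw,width=1920,height=1080,format=RGBA", "!",
       "queue", "!",
       "fakesink"]

# Drop the gst-launch-1.0 executable and replace the dummy sink with appsink.
pipeline = " ".join(cmd[1:]).replace("fakesink", "appsink")
print(pipeline)
# cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```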

This is the command we used:
uri = f"rtspsrc location=rtsp://admin:admin12345@192.168.0.103/live ! queue ! decodebin ! queue ! nvvideoconvert ! video/x-raw, width=720, height=480, format=(string)RGBA ! queue ! appsink"
cap = cv2.VideoCapture(uri, cv2.CAP_GSTREAMER)

I took a look at Convert NV12 to RGB in DeepStream pipeline - #7 by Fiona.Chen, where you had advised another forum user about a pipeline. I used it to tweak our pipeline.
uri = f"rtspsrc location=rtsp://admin:admin12345@192.168.0.103/live latency=0 ! decodebin ! nvvideoconvert ! video/x-raw,format=(string)RGBA ! videoconvert ! video/x-raw,format=(string)BGR, width=1280, height=720 ! appsink"

This seems to be working, but can you advise whether we can improve it further so that it uses minimal CPU?
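One idea worth measuring (an assumption on our side, not a confirmed optimization): since nvvideoconvert can already deliver RGBA into system memory, the extra `videoconvert` stage could be dropped and the RGBA-to-BGR reorder done on the received frame with a NumPy slice instead. Whether that actually reduces CPU load would need profiling. A sketch on a synthetic frame:

```python
import numpy as np

# Synthetic 2x2 RGBA frame standing in for an appsink buffer; the top-left
# pixel is pure red so we can see where it lands after the reorder.
rgba = np.zeros((2, 2, 4), dtype=np.uint8)
rgba[0, 0] = (255, 0, 0, 255)

# Drop alpha and reverse the first three channels: RGBA -> BGR.
bgr = rgba[..., 2::-1]
print(bgr.shape)   # (2, 2, 3)
print(bgr[0, 0])   # [  0   0 255]  (red is last, as OpenCV's BGR order expects)
```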

Looking forward.

nvvideoconvert does not support the BGR format. The pipeline you have is already the best option.