DeepStream v4l2src pipeline error

Hardware Platform: dGPU (GeForce GTX 1050)
DeepStream Version: 6.2
TensorRT Version: 8.5.2.2
NVIDIA GPU Driver Version: 525.89.02

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Sep_21_10:33:58_PDT_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.89.02    Driver Version: 525.89.02    CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   36C    P0     5W /  50W |     10MiB /  4096MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       902      G   /usr/lib/xorg/Xorg                  4MiB |
|    0   N/A  N/A      1550      G   /usr/lib/xorg/Xorg                  4MiB |
+-----------------------------------------------------------------------------+

Issue description: I am trying to run a very simple pipeline with deepstream-app:

$ sudo deepstream-app -c source1_usb_dec_infer_resnet_int8_jcrq.txt

and I got this error:

ERROR from tiled_display_queue: Internal data stream error.
Debug info: gstqueue.c(988): gst_queue_handle_sink_event (): /GstPipeline:pipeline/GstBin:tiled_display_bin/GstQueue:tiled_display_queue:
streaming stopped, reason not-negotiated (-4)
** INFO: <bus_callback:204>: incorrect camera parameters provided, please provide supported resolution and frame rate

My camera configuration is:
$ v4l2-ctl --device=/dev/video0 --all

Driver Info:
Driver name : uvcvideo
Card type : Integrated_Webcam_HD: Integrate
Bus info : usb-0000:00:14.0-5
Driver version : 5.15.87
Capabilities : 0x84a00001
Video Capture
Metadata Capture
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x04200001
Video Capture
Streaming
Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
Width/Height : 1280/720
Pixel Format : 'YUYV' (YUYV 4:2:2)
Field : None
Bytes per Line : 2560
Size Image : 1843200
Colorspace : sRGB
Transfer Function : Rec. 709
YCbCr/HSV Encoding: ITU-R 601
Quantization : Default (maps to Limited Range)
Flags :
Crop Capability Video Capture:
Bounds : Left 0, Top 0, Width 1280, Height 720
Default : Left 0, Top 0, Width 1280, Height 720
Pixel Aspect: 1/1
Selection Video Capture: crop_default, Left 0, Top 0, Width 1280, Height 720, Flags:
Selection Video Capture: crop_bounds, Left 0, Top 0, Width 1280, Height 720, Flags:
Streaming Parameters Video Capture:
Capabilities : timeperframe
Frames per second: 10.000 (10/1)
Read buffers : 0
brightness 0x00980900 (int) : min=-64 max=64 step=1 default=0 value=0
contrast 0x00980901 (int) : min=0 max=95 step=1 default=0 value=0
saturation 0x00980902 (int) : min=0 max=100 step=1 default=64 value=64
hue 0x00980903 (int) : min=-2000 max=2000 step=1 default=0 value=0
white_balance_temperature_auto 0x0098090c (bool) : default=1 value=1
gamma 0x00980910 (int) : min=100 max=300 step=1 default=100 value=100
gain 0x00980913 (int) : min=1 max=8 step=1 default=1 value=1
power_line_frequency 0x00980918 (menu) : min=0 max=2 default=2 value=2
0: Disabled
1: 50 Hz
2: 60 Hz
white_balance_temperature 0x0098091a (int) : min=2800 max=6500 step=10 default=4600 value=4600 flags=inactive
sharpness 0x0098091b (int) : min=1 max=7 step=1 default=2 value=2
backlight_compensation 0x0098091c (int) : min=0 max=3 step=1 default=3 value=3
exposure_auto 0x009a0901 (menu) : min=0 max=3 default=3 value=3
1: Manual Mode
3: Aperture Priority Mode
exposure_absolute 0x009a0902 (int) : min=10 max=625 step=1 default=156 value=156 flags=inactive
exposure_auto_priority 0x009a0903 (bool) : default=0 value=1

I uploaded my config file

source1_usb_dec_infer_resnet_int8_jcrq.txt (3.8 KB)

In order to test my camera configuration, I ran these pipelines from the command line and they worked fine:

$ gst-launch-1.0 v4l2src name=cam_src ! decodebin ! videoconvert ! videoscale ! video/x-raw,format=RGB ! queue ! videoconvert ! nveglglessink name=img_origin

$ gst-launch-1.0 v4l2src name=cam_src ! decodebin ! videoconvert ! videoscale ! video/x-raw,format=RGB ! queue ! videoconvert ! ximagesink name=img_origin

I think my problem is the sink, since it is not possible to set nveglglessink or ximagesink; only these options are available in DeepStream:

1: Fakesink
2: EGL based windowed nveglglessink for dGPU and nv3dsink for Jetson
3: Encode + File Save (encoder + muxer + filesink)
4: Encode + RTSP streaming; Note: sync=1 for this type is not applicable;
5: nvdrmvideosink (Jetson only)
6: Message converter + Message broker

I tried type=2 and type=5, but neither works.

Could somebody help me solve the pipeline issue, or implement a pipeline like this (gst-launch-1.0 v4l2src name=cam_src ! decodebin ! videoconvert ! videoscale ! video/x-raw,format=RGB ! queue ! videoconvert ! ximagesink name=img_origin) in DeepStream?

From the error, the camera parameters are incorrect.
Can you check the device formats with this command: v4l2-ctl -d /dev/video0 --list-formats-ext, then fill in the correct source values in the configuration file?
Or, since the gst-launch-1.0 command works, you can dump the pipeline to check the negotiation results, then modify the source values in the configuration file. Here is the dumping method:

  1. set gstreamer GST_DEBUG_DUMP_DOT_DIR, for example, export GST_DEBUG_DUMP_DOT_DIR=/home/nvidia
  2. execute gst-launch command, for example, gst-launch-1.0 xxx -e
  3. use a dot viewer (e.g. Graphviz) to open the dot file in /home/nvidia.
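
The three steps above might look like this in a shell (a sketch; the device path and the simplified pipeline are placeholders, so adjust them to your setup):

```shell
# 1. Tell GStreamer where to write the pipeline graph dumps.
export GST_DEBUG_DUMP_DOT_DIR=/home/nvidia

# 2. Run the pipeline; -e sends EOS on Ctrl+C so the dumps are flushed.
gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! autovideosink -e

# 3. Convert each .dot dump to a PNG with Graphviz for inspection.
for f in "$GST_DEBUG_DUMP_DOT_DIR"/*.dot; do
  dot -Tpng "$f" -o "${f%.dot}.png"
done
```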

Hi, thanks for your answer.

I am already using the negotiated values. I forgot to mention that when I change [sink0] to file output with type=3 (3=File), the pipeline works fine and generates the output file; the same happens when I use fakesink (type=1). So I think the problem is the type of sink. I want a sink that shows the result video on screen, i.e. type=2 (2=EglSink). I would like to use sinks such as ximagesink or nveglglessink, but those sink types are not available in DeepStream. I don't know why!

Maybe a videoconvert needs to be included before the sink; I know how to do that in GStreamer, but not in DeepStream.

Could you share more logs? Please do "export GST_DEBUG=6" first to raise GStreamer's log level, then run again; you can redirect the logs to a file.
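
For example (a sketch; the config file name is the one from this thread, and note that level 6 produces very large logs):

```shell
# Raise the GStreamer log level, then capture stdout and stderr to one file.
export GST_DEBUG=6
deepstream-app -c source1_usb_dec_infer_resnet_int8_jcrq.txt > deepstream-app.log 2>&1
```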

These are the logs of the gst-launch-1.0 pipeline that works fine.

This is the debug level 5 output, because level 3 is too sparse.

debug.log (1.4 MB)

Additionally, I ran with the -v option to get verbose output; I attached it too.

verbose.log (4.3 KB)

AND

this is the log of the deepstream-app run that fails

deepstream-app.log (6.1 KB)

From the log, there is an error "cuGraphicsGLRegisterBuffer failed with error(219) gst_eglglessink_cuda_init texture = 1", which indicates an error with the OpenGL or DirectX context. Are you testing in Docker? Do you have a physical monitor?
Please refer to these topics: 121833, 125673

I have the integrated monitor and I'm not using Docker.

Error number 219 means:
CUDA_ERROR_INVALID_GRAPHICS_CONTEXT = 219
This indicates an error with OpenGL or DirectX context.
Please check whether you run into the same issue as "cuGraphicsGLRegisterBuffer failed with error(219) gst_eglglessink_cuda_init texture = 1".
If you still hit the same issue, did you install OpenGL? Could you share the output of
$ dpkg -l | grep -i opengl
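
Beyond dpkg, a couple of standard checks can confirm which GL stack is actually active (a suggestion, not DeepStream-specific; glxinfo comes from the mesa-utils package):

```shell
dpkg -l | grep -i opengl                          # installed OpenGL-related packages
glxinfo | grep -i "opengl renderer"               # should name the NVIDIA GPU, not llvmpipe
nvidia-smi --query-gpu=name,driver_version --format=csv,noheader
```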

opengl.txt (1.9 KB)

That is ximagesink's log; could you share nveglglessink's log? Thanks.

  1. Could you use this command to get the camera's formats?
    v4l2-ctl -d /dev/video0 --list-formats-ext

  2. Could you try this command line? I want to check whether nveglglessink can render HW memory in your environment.
    gst-launch-1.0 v4l2src name=cam_src ! decodebin ! videoconvert ! videoscale ! video/x-raw,format=RGB ! queue ! nvvideoconvert ! video/x-raw\(memory:NVMM\),format=RGBA ! nveglglessink name=img_origin

Hi,

I executed the pipeline and got this output.

Note: I removed the parameter name=cam_src from v4l2src in order to use the integrated webcam; I also executed it adding device=/dev/video0, with exactly the same result. The webcam's LED turned on, but the pipeline closed with the output shown in the pipeline.log file.

pipline.log (5.4 KB)

and with the v4l2-ctl command I got this output:

v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture

[0]: 'MJPG' (Motion-JPEG, compressed)
	Size: Discrete 1280x720
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 960x540
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 848x480
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 640x480
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 640x360
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 320x240
		Interval: Discrete 0.033s (30.000 fps)
[1]: 'YUYV' (YUYV 4:2:2)
	Size: Discrete 1280x720
		Interval: Discrete 0.100s (10.000 fps)
	Size: Discrete 640x480
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 640x360
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 424x240
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 320x240
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 320x180
		Interval: Discrete 0.033s (30.000 fps)
	Size: Discrete 160x120
		Interval: Discrete 0.033s (30.000 fps)
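
Given these modes, the DeepStream [source0] group must match one of the advertised resolution/framerate pairs exactly. A sketch for the YUYV 1280x720 @ 10 fps mode, using the standard deepstream-app source-group keys (merge with the rest of your existing [source0] settings):

```ini
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=1
camera-width=1280
camera-height=720
camera-fps-n=10
camera-fps-d=1
camera-v4l2-dev-node=0
```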

If no HW memory goes into nveglglessink, it will not call cuGraphicsGLRegisterBuffer.

In this log there is still the eglsink error "cuGraphicsGLRegisterBuffer failed with error(219) gst_eglglessink_cuda_init texture = 1"; 219 means "This indicates an error with OpenGL or DirectX context." It is an OpenGL environment issue; please refer to 125673, which lists some investigation methods.

  1. Try rebooting.
  2. Use the GPU which has the monitor connected.
  3. Could you install OpenGL from https://developer.nvidia.com/opengl-driver and try again?
  4. Use fakesink or filesink instead.
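
As a concrete fallback for option 4, switching [sink0] to file output avoids EGL entirely. A sketch using the standard deepstream-app sink-group keys (the output filename is an example):

```ini
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
output-file=out.mp4
sync=0
```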

With filesink, fakesink, autovideosink and ximagesink it works properly. The problem is that DeepStream does not offer ximagesink or autovideosink as available options in the config files.

I am going to try installing OpenGL from the NVIDIA URL you recommended and will report back. Thanks.

Sorry for the late reply. Is this still a DeepStream issue to support? Thanks.

I found the same problem on a T4 with DeepStream 6.1; it works on a Jetson NX.

On dGPU, DeepStream uses nveglglessink to render; on Jetson, DeepStream uses nv3dsink. About the "same problem": do you still see "cuGraphicsGLRegisterBuffer failed with error(219)"?

No, it failed to pull the USB stream in DeepStream. It is OK on the Jetson NX with the same camera and same camera configuration.

Could you share the command line and the error logs? Thanks.

error:
nvstreammux: Successfully handled EOS for source_id=0
ERROR from src_elem: Internal data stream error.
Debug info: gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstV4l2Src:src_elem:
streaming stopped, reason error (-5)

I find that GStreamer can get the stream with fakesink and tcpclientsink.

config file:source1_usb_dec_infer_resnet_int8.txt
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
uri=usb:
type=1
camera-width=640
camera-height=480
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=0