Error: gst-stream-error-quark

When I run the sample deepstream-test1-usbcam from deepstream_python_apps, I get the following error:
Error: gst-stream-error-quark: Internal data stream error. (1): gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:usb-cam-source:
streaming stopped, reason not-negotiated (-4)

The information about my USB camera is as follows:
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'MJPG' (compressed)
Name : Motion-JPEG
Size: Discrete 1920x1080
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 640x480
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 800x600
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 1024x768
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 1600x1200
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 320x240
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 2592x1944
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 3840x2160
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 3264x2448
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
Size: Discrete 4192x3104
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.100s (10.000 fps)
How can I solve this problem? Thanks a lot.

Hi,
By default, deepstream-test1-usbcam does not support MJPEG decoding, so you would need to customize it. Please refer to the discussion in
How the Deepstream can support the multiple Videos input which from Multiple USB cameras and to do the analysis

Please make sure your camera works with a gst-launch-1.0 command first, and then do the customization.

Your advice did not solve my problem. When I run a gst-launch-1.0 command, it displays the following:
tesco-yu@tescoyu-desktop:/$ gst-launch-1.0 videotestsrc ! nvvidconv ! nvegltransform ! nveglglessink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED ...

Using winsys: x11
ERROR: Pipeline doesn't want to pause.
Setting pipeline to NULL ...
Freeing pipeline ...

What should I do next?

Hi,
Please provide the information:
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• Issue Type (questions, new requirements, bugs)

This error print is not expected:

nvbuf_utils: Could not get EGL display connection

Suggest you run other samples and check if it is still printed. If yes, it is better to re-flash the system image.

Thanks for your response. My information is as follows:
• Hardware Platform (Jetson / GPU): Jetson Xavier NX
• DeepStream Version: DeepStream 5.1
• JetPack Version (valid for Jetson only): 4.5
• TensorRT Version: 7.1
• Issue Type (questions, new requirements, bugs):

I was able to run the pipeline with inference on an MJPEG stream using the following command:
gst-launch-1.0 v4l2src device=/dev/video0 io-mode=2 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! jpegparse ! jpegdec ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! mux.sink_0 nvstreammux live-source=1 name=mux batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.1/sources/deepstream_python_apps/apps/deepstream-test1-usbcam/dstest1_pgie_config.txt batch-size=1 ! nvmultistreamtiler rows=1 columns=1 width=1920 height=1080 ! nvvideoconvert ! nvegltransform ! nveglglessink

But how can I translate this into "deepstream-test1-usbcam.py"?

Hi,
You need to modify deepstream-test1 from

v4l2src ! video/x-raw,framerate=30/1 ! videoconvert ! nvvideoconvert ! video/x-raw(memory:NVMM) ! nvstreammux ! ...

to

v4l2src device=/dev/video0 io-mode=2 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! jpegparse ! jpegdec ! videoconvert ! nvvideoconvert ! nvstreammux ! ...

There is no existing Python code, but a user has shared C code. Please check:
How to change source pixel format from YUYV to MJPG - #7 by jhchris713
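As a rough sketch of that customization in Python, the source chain above can be expressed as a pipeline-description string and handed to `Gst.parse_launch()` instead of creating and linking each element by hand. The helper name `build_mjpeg_source_desc` below is hypothetical (not part of the DeepStream sample); the element names and properties come straight from the gst-launch command in this thread:

```python
# Hypothetical helper: builds the MJPEG source-chain description used in the
# gst-launch command above, so it can be fed to Gst.parse_launch() when
# adapting deepstream-test1-usbcam.py. Not part of the official sample.

def build_mjpeg_source_desc(device="/dev/video0", width=1920, height=1080, fps=30):
    """Return a gst-launch style description for an MJPEG USB camera source."""
    return (
        f"v4l2src device={device} io-mode=2 ! "
        f"image/jpeg,width={width},height={height},framerate={fps}/1 ! "
        "jpegparse ! jpegdec ! videoconvert ! nvvideoconvert ! "
        "video/x-raw(memory:NVMM),format=NV12"
    )

# In the modified sample, this string would replace the YUY2 source chain
# before linking into nvstreammux, roughly:
#   source_bin = Gst.parse_launch(build_mjpeg_source_desc())
print(build_mjpeg_source_desc())
```

The rest of the sample (nvstreammux, nvinfer, the probe on the OSD sink pad) should not need changes, since only the capture/decode front end differs between a YUYV and an MJPEG camera.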

I have solved the problem with your advice. Thanks a lot.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.