USB MJPG camera not working: streaming stopped, reason not-negotiated (-4)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
AGX Xavier

• DeepStream Version
Deepstream 6.2

• JetPack Version (valid for Jetson only)
5.2

• TensorRT Version
8.5.2-1+cuda11.4

• NVIDIA GPU Driver Version (valid for GPU only)
35.3.1

• Issue Type (questions, new requirements, bugs)

I bought two USB webcams (Arducam 12 MP IMX708) that have a built-in USB interface board, so they can be used universally even without a Pi CSI lane. They work fine on my Ubuntu laptop.

I am trying to use them as sources for image inference in DeepStream using deepstream-app. However, I cannot get past the following error message:

ERROR from src_elem: Internal data stream error.
Debug info: gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstV4l2Src:src_elem:
streaming stopped, reason not-negotiated (-4)
** INFO: <bus_callback:204>: incorrect camera parameters provided, please provide supported resolution and frame rate

The problem is, I have gone over the camera parameters again and again, and I can't figure out what I'm missing.

Here is the output of v4l2-ctl --list-formats-ext -d /dev/video0:

ioctl: VIDIOC_ENUM_FMT
Type: Video Capture

    [0]: 'MJPG' (Motion-JPEG, compressed)
            Size: Discrete 2304x1296
                    Interval: Discrete 0.033s (30.000 fps)
            Size: Discrete 4608x2592
                    Interval: Discrete 0.100s (10.000 fps)
            Size: Discrete 1920x1080
                    Interval: Discrete 0.033s (30.000 fps)
            Size: Discrete 1600x1200
                    Interval: Discrete 0.033s (30.000 fps)
            Size: Discrete 1280x720
                    Interval: Discrete 0.033s (30.000 fps)
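
A quick way to sanity-check that the camera actually delivers MJPG frames outside of GStreamer (assuming /dev/video0 is one of the IMX708s) is:

v4l2-ctl -d /dev/video0 \
    --set-fmt-video=width=1920,height=1080,pixelformat=MJPG \
    --stream-mmap --stream-count=30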

Here’s my source and sink config:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=3

[tiled-display]
enable=0
rows=1
columns=1
width=1280
height=720
nvbuf-memory-type=0
compute-hw=0

[source0]
# IMX708 1
enable=1
type=1
camera-width=4608
camera-height=2592
camera-fps-n=10
camera-fps-d=1
camera-v4l2-dev-node=0
cudadec-memtype=0

[source1]
# IMX708 2
enable=0
type=1
camera-width=4608
camera-height=2592
camera-fps-n=10
camera-fps-d=1
camera-v4l2-dev-node=2

[source2]
# Thermal Camera
enable=0
type=1
camera-width=640
camera-height=480
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=4

[sink0]
# IMX708 1
enable=1
type=4
source-id=0
sync=0
codec=1
bitrate=4000000
rtsp-port=8552
udp-port=5398
width=1920
height=1080
enc-type=0
profile=0
udp-buffer-size=100000
nvbuf-memory-type=0

[sink1]
# IMX708 2
enable=0
type=4
source-id=1
sync=0
codec=1
bitrate=4000000
rtsp-port=8553
udp-port=5399
width=1920
height=1080
enc-type=0
profile=0
udp-buffer-size=100000
nvbuf-memory-type=0

[sink2]
# Thermal Camera
enable=0
type=4
source-id=2
sync=0
codec=1
bitrate=4000000
rtsp-port=8554
udp-port=5400
width=640
height=480
enc-type=0
profile=0
udp-buffer-size=100000
nvbuf-memory-type=0

[osd]
enable=1
gpu-id=0
border-width=5
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=1
batch-size=1
batched-push-timeout=40000
width=1920
height=1080
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV8.txt

[tests]
file-loop=0

You can see I have a thermal camera in there as well; ignore that for now. I have tried every supported width/height/fps-n/fps-d combination, and I always get the same error, so I assume my problem is somewhere else. Help is appreciated.

• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing.)
See notes above.
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or sample application, and the function description.)

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

  1. Please refer to this topic, especially the 28.3.1 part. From your logs, the camera can only output MJPG-format data. First, you can use gst-launch to work out how to handle MJPEG data in a DeepStream pipeline (see the first sketch below).

  2. Please find video-format in the doc. Currently deepstream-app can consume raw data from the camera, but it cannot accept MJPEG directly, which is encoded data. deepstream-app is open source, and you can customize it to implement this. The key code is create_camera_source_bin in /opt/nvidia/deepstream/deepstream-6.2/sources/apps/apps-common/src/deepstream_source_bin.c; you need to add an nvv4l2decoder plugin (see the second sketch below).
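
For reference, here is a minimal gst-launch sketch of such a pipeline. The device node, resolution, and frame rate are assumptions; substitute a mode your camera actually lists:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
    "image/jpeg,width=1920,height=1080,framerate=30/1" ! \
    jpegparse ! nvv4l2decoder mjpeg=1 ! \
    nvvideoconvert ! "video/x-raw(memory:NVMM),format=NV12" ! \
    fakesink sync=false

If this runs without the not-negotiated error, the same element chain should work inside deepstream-app once the source bin is modified.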
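
And here is a minimal standalone C sketch of the element chain that create_camera_source_bin would need to build for an MJPEG camera. The device node and mode are assumptions, and the element/variable names are illustrative, not the exact ones used in deepstream_source_bin.c:

/* mjpeg_cam_test.c - sketch of the chain
 * v4l2src -> capsfilter(image/jpeg) -> jpegparse -> nvv4l2decoder -> nvvideoconvert -> sink
 * Build: gcc mjpeg_cam_test.c -o mjpeg_cam_test $(pkg-config --cflags --libs gstreamer-1.0)
 */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_pipeline_new ("mjpeg-cam");
  GstElement *src   = gst_element_factory_make ("v4l2src", "src_elem");
  GstElement *capsf = gst_element_factory_make ("capsfilter", "src_caps");
  GstElement *parse = gst_element_factory_make ("jpegparse", "parser");
  GstElement *dec   = gst_element_factory_make ("nvv4l2decoder", "decoder");
  GstElement *conv  = gst_element_factory_make ("nvvideoconvert", "conv");
  GstElement *sink  = gst_element_factory_make ("fakesink", "sink");

  if (!pipeline || !src || !capsf || !parse || !dec || !conv || !sink) {
    g_printerr ("Failed to create elements (is DeepStream/JetPack installed?)\n");
    return -1;
  }

  /* Ask the camera for MJPG, matching a mode from v4l2-ctl --list-formats-ext. */
  g_object_set (src, "device", "/dev/video0", NULL);
  GstCaps *caps = gst_caps_new_simple ("image/jpeg",
      "width", G_TYPE_INT, 1920, "height", G_TYPE_INT, 1080,
      "framerate", GST_TYPE_FRACTION, 30, 1, NULL);
  g_object_set (capsf, "caps", caps, NULL);
  gst_caps_unref (caps);

  /* Jetson-only property: tell the HW decoder the input is MJPEG. */
  g_object_set (dec, "mjpeg", TRUE, NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, capsf, parse, dec, conv, sink, NULL);
  if (!gst_element_link_many (src, capsf, parse, dec, conv, sink, NULL)) {
    g_printerr ("Failed to link elements\n");
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until error or EOS. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}

In deepstream-app itself, the equivalent change is to insert the jpegparse and nvv4l2decoder elements between the v4l2src element and the downstream converter inside create_camera_source_bin, and to switch the source caps filter from raw video to image/jpeg.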

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.