Unable to load DeepStream with MJPEG pixel format

Hi,

I am using a Jetson Nano with DeepStream 5.1. I want to test CustomVision AI with DeepStream using a USB camera whose output format is MJPEG. However, I am getting the following error:

Quitting
XIO: fatal IO error 11 (Resource temporarily unavailable) on X server ":0"
after 16 requests (16 known processed) with 0 events remaining.
Disconnecting Azure...
XIO: fatal IO error 11 (Resource temporarily unavailable) on X server ":0.0"
after 75 requests (75 known processed) with 0 events remaining.
** WARN: <parse_source:577>: Unknown key 'camera-v412-dev-node' for group [source0]
** ERROR: main:1451: Failed to set pipeline to PAUSED
Quitting
ERROR from src_elem: Cannot identify device '/dev/video0'.
Debug info: v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstV4l2Src:src_elem:
system error: No such file or directory
App run failed
(the same WARN / ERROR / "Cannot identify device '/dev/video0'" sequence repeats several more times)

My camera's supported formats are as follows:

$ v4l2-ctl --list-formats
SPCA2650 AV Camera (usb-70090000.xusb-2.2):
/dev/video0

ioctl: VIDIOC_ENUM_FMT
Index       : 0
Type        : Video Capture
Pixel Format: 'MJPG' (compressed)
Name        : Motion-JPEG

Index       : 1
Type        : Video Capture
Pixel Format: 'YUYV'
Name        : YUYV 4:2:2

My DSConfig-CustomVisionAI.txt file is set up as follows:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=960
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
#type - 1=camerav4l2 2=uri 3=multiuri 4=rtsp 5=csi
type=1
camera-width=1280
camera-height=720
camera-fps-n=25
camera-fps-d=1
camera-v4l2-dev-node=1

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming 5=Overlay
type=5
sync=0
display-id=0
offset-x=0
offset-y=0
width=0
height=0
overlay-id=1
source-id=0
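
For reference, camera-v4l2-dev-node is the N in /dev/videoN, so the value 1 above selects /dev/video1, while v4l2-ctl lists this camera on /dev/video0 (and the warnings in the log complain about an unknown key 'camera-v412-dev-node', spelled with the digit 1 in place of the lowercase l). Below is a minimal sketch of the source group under the assumption that the camera should be opened on /dev/video0; this by itself does not add MJPEG support to deepstream-app (see the reply below), it only illustrates the node mapping:

[source0]
enable=1
# 1=camerav4l2
type=1
camera-width=1280
camera-height=720
camera-fps-n=25
camera-fps-d=1
# 0 selects /dev/video0, as listed by v4l2-ctl
camera-v4l2-dev-node=0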

I have also tried the following solution, with no luck:

Please help asap.

Warm regards
Abhishek

Hi,
By default an MJPEG source is not supported in deepstream-app, so you would need to do some customization. Please check whether you can launch it with a gst-launch-1.0 command like:

$ gst-launch-1.0 v4l2src ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvoverlaysink
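
In this pipeline, v4l2src captures from the camera (it defaults to /dev/video0), jpegparse parses the individual JPEG frames, nvv4l2decoder with mjpeg=1 decodes them on the Jetson hardware decoder, and nvoverlaysink displays the result. If the camera enumerates on a different node, it can be selected explicitly with the device property; a variant assuming, purely for illustration, that the camera is /dev/video1:

$ gst-launch-1.0 v4l2src device=/dev/video1 ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvoverlaysink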

Hi,

I am getting the following response:

Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Opening in BLOCKING MODE
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
NvMMLiteOpen : Block : BlockType = 277
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 277
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.785360868
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Hi,
You need to configure the width, height, and framerate to match the source, like:

v4l2src ! image/jpeg,width=_WIDTH_,height=_HEIGHT_,framerate=_FRATE_ ! ...

Hi,

Here is the command I tried, along with its output:
gst-launch-1.0 v4l2src ! video/x-raw,format=MJPG,width=640,height=480,framerate=25/1 ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvoverlaysink
WARNING: erroneous pipeline: could not link v4l2src0 to jpegparse0, neither element can handle caps video/x-raw, format=(string)MJPG, width=(int)640, height=(int)480, framerate=(fraction)25/1

Kindly excuse me if the command has errors; I am quite new to shell scripting.
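
For what it's worth, the error above occurs because v4l2src exposes MJPEG as image/jpeg caps rather than as a video/x-raw format, which is why neither element can handle video/x-raw, format=MJPG. Following the earlier suggestion, a corrected command would presumably look like the one below, assuming the camera actually offers 640x480 at 25 fps in MJPEG (the supported modes can be verified with the v4l2-ctl command given at the end of this thread):

$ gst-launch-1.0 v4l2src ! image/jpeg,width=640,height=480,framerate=25/1 ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvoverlaysink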

I am currently getting the following error:
*** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***

(deepstream-test5-app:1): GLib-CRITICAL **: 09:31:29.239: g_strrstr: assertion 'haystack != NULL' failed
nvds_msgapi_connect : connect success
Opening in BLOCKING MODE
Opening in BLOCKING MODE

Using winsys: x11
Disconnecting Azure...
** ERROR: main:1451: Failed to set pipeline to PAUSED
Quitting
ERROR from src_elem: Cannot identify device '/dev/video0'.
Debug info: v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstV4l2Src:src_elem:
system error: No such file or directory
App run failed

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

Hi,
Please share information about the source:

$ v4l2-ctl -d /dev/videoX --list-formats-ext