Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) Jetson Orin NX
• DeepStream Version 6.3
• JetPack Version (valid for Jetson only) 5.1.2
• TensorRT Version 8.5.2
• Issue Type( questions, new requirements, bugs) Bug: deepstream-test1-usbcam.py fails to run with a USB camera that only supports MJPG
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing) The DeepStream USB Python example GStreamer pipeline fails; see the pipelines and log below
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
Hi,
I’m trying to stream from a camera that only supports the MJPG format. The output of v4l2-ctl -d /dev/video0 --list-formats-ext is:
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'MJPG' (Motion-JPEG, compressed)
                Size: Discrete 640x360
                        Interval: Discrete 0.033s (30.000 fps)
                        Interval: Discrete 0.040s (25.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.033s (30.000 fps)
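In case it is useful, this is roughly how I cross-check which caps GStreamer itself reports for the device (a minimal sketch using Gst.DeviceMonitor; the device.path property name is my assumption for what the v4l2 device provider exposes):

```python
#!/usr/bin/env python3
# Sketch: list the caps GStreamer reports for /dev/video0, to compare
# against the v4l2-ctl output above. Error handling is omitted.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

monitor = Gst.DeviceMonitor.new()
monitor.add_filter("Video/Source", None)  # only video capture devices
monitor.start()

for device in monitor.get_devices():
    props = device.get_properties()
    # "device.path" is assumed to be the property set by the v4l2 provider;
    # if it is not present, print every device instead of filtering.
    if props and props.get_string("device.path") == "/dev/video0":
        print(device.get_display_name())
        print(device.get_caps().to_string())

monitor.stop()
```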
These pipelines (without nvinfer) can access the camera:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'image/jpeg,width=640,height=360,framerate=30/1,format=MJPG' ! jpegdec ! nvvideoconvert ! 'video/x-raw,format=NV12' ! nveglglessink
gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'image/jpeg,width=640,height=360,framerate=30/1,format=MJPG' ! nvv4l2decoder mjpeg=1 ! nvvideoconvert ! 'video/x-raw,format=NV12' ! nveglglessink
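For context, the source part of the Python pipeline I am aiming for mirrors the second working gst-launch line above; below is a minimal sketch of that chain in PyGObject (the element chain comes from the gst-launch pipeline, not from the stock deepstream-test1-usbcam.py, and the variable names are my own):

```python
#!/usr/bin/env python3
# Sketch: v4l2src (MJPG) -> nvv4l2decoder -> nvvideoconvert -> NVMM NV12,
# mirroring the working gst-launch pipeline above. Error handling omitted.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("mjpg-usb-source")

source = Gst.ElementFactory.make("v4l2src", "usb-cam-source")
source.set_property("device", "/dev/video0")

# Force the MJPG mode reported by v4l2-ctl (640x360 @ 30 fps).
caps_mjpg = Gst.ElementFactory.make("capsfilter", "mjpg-caps")
caps_mjpg.set_property(
    "caps", Gst.Caps.from_string("image/jpeg, width=640, height=360, framerate=30/1")
)

# Hardware JPEG decode; mjpeg=1 as in the gst-launch pipeline.
decoder = Gst.ElementFactory.make("nvv4l2decoder", "jpeg-decoder")
decoder.set_property("mjpeg", 1)

# Convert to NV12 in NVMM memory, ready to feed nvstreammux.
conv = Gst.ElementFactory.make("nvvideoconvert", "nvmm-convert")
caps_nvmm = Gst.ElementFactory.make("capsfilter", "nvmm-caps")
caps_nvmm.set_property(
    "caps", Gst.Caps.from_string("video/x-raw(memory:NVMM), format=NV12")
)

for element in (source, caps_mjpg, decoder, conv, caps_nvmm):
    pipeline.add(element)
source.link(caps_mjpg)
caps_mjpg.link(decoder)
decoder.link(conv)
conv.link(caps_nvmm)
# caps_nvmm would then be linked to nvstreammux's sink_0 request pad,
# followed by nvinfer and the rest of the sample's pipeline.
```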
The pipeline below works with a camera that supports the YUYV format, but with this MJPG-only camera:
gst-launch-1.0 v4l2src device=/dev/video0 io-mode=2 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! jpegparse ! jpegdec ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! mux.sink_0 nvstreammux live-source=1 name=mux batch-size=1 width=1280 height=720 ! nvinfer config-file-path=/workspace/ds-test/config_pgie_yolo_v4.txt batch-size=1 ! nvmultistreamtiler rows=1 columns=1 width=1920 height=1080 ! nvvideoconvert ! nvegltransform ! nveglglessink
it fails with the following error:
Setting pipeline to PAUSED …
Using winsys: x11
Opening in BLOCKING MODE
WARNING: [TRT]: Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
0:00:04.065469626 364 0xaaaae6520ca0 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend()
<nvdsinfer_context_impl.cpp:1988> [UID = 1]: deserialized trt engine from :/workspace/ds-test/models/yolov4_binary_e30_fp16.engine
WARNING: [TRT]: The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
INFO: [Implicit Engine Info]: layers num: 5
0 INPUT kFLOAT Input 3x544x960
1 OUTPUT kINT32 BatchedNMS 1
2 OUTPUT kFLOAT BatchedNMS_1 200x4
3 OUTPUT kFLOAT BatchedNMS_2 200
4 OUTPUT kFLOAT BatchedNMS_3 200
0:00:04.255938402 364 0xaaaae6520ca0 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() [UID = 1]: Use deserialized engine model: /workspace/ds-test/models/yolov4_binary_e30_fp16.engine
0:00:04.289599010 364 0xaaaae6520ca0 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus: [UID 1]: Load new model:/workspace/ds-test/config_pgie_yolo_v4.txt sucessfully
Pipeline is live and does not need PREROLL …
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)640, height=(int)360, framerate=(fraction)30/1, format=(string)MJPG, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)640, height=(int)360, framerate=(fraction)30/1, format=(string)MJPG, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
NvMMLiteOpen : Block : BlockType = 277
NvMMLiteBlockCreate : Block : BlockType = 277
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:sink: caps = image/jpeg, width=(int)640, height=(int)360, framerate=(fraction)30/1, format=(string)MJPG, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)640, height=(int)360, framerate=(fraction)30/1, format=(string)MJPG, parsed=(boolean)true, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:src: caps = video/x-raw(memory:NVMM), format=(string)I420, width=(int)640, height=(int)360, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)2:4:5:1, framerate=(fraction)30/1, nvbuf-memory-type=(string)nvbuf-mem-surface-array, gpu-id=(int)0
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.933210934
Setting pipeline to NULL …
nvstreammux: Successfully handled EOS for source_id=0
Freeing pipeline …
Could you please advise how to get this pipeline, and deepstream-test1-usbcam.py, working with an MJPG-only camera?
Thank you