How to use deepstream-app with MJPEG format stream? 2nd try

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) x86_64 platform with RTX4070 GPU
• DeepStream Version 6.4
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.6
• NVIDIA GPU Driver Version (valid for GPU only) 12.4 (Docker Container runtime /usr/local/cuda-12.2)
• Issue Type (questions, new requirements, bugs): get deepstream-app running with a USB camera in MJPG format, with inferencing using source1_usb_dec_infer_resnet_int8.txt
• How to reproduce the issue? (This is for bugs. Including which sample app is used, the configuration files content, the command line used and other details for reproducing) See NVIDIA-based Video Analytics on Photon OS on WSL2 · dcasota/photonos-scripts Wiki · GitHub
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi,
I’m rather new to the forum. Thank you for the help.

I’m trying to get the deepstream-app running with the source1_usb_dec_infer_resnet_int8.txt example.

The USB camera used supports the ‘MJPG’ video format (Motion JPEG picture streaming). The test-launch works flawlessly:

./test-launch --gst-debug=3 '( v4l2src device=/dev/video0 ! image/jpeg,format=MJPG,width=640,height=480,framerate=30/1 ! jpegparse ! rtpjpegpay name=pay0 pt=96 )'

Thanks to the forum entry How to use deepstream-app with MJPEG format stream? - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums, the main finding was that the pre-compiled deepstream-app is hardcoded to video/x-raw with NV12 and that the source has to be modified to support MJPG.

Modifying and compiling the deepstream-app works thanks to the hints in that forum entry.

The compiled deepstream-app still fails to link ‘src_cap_filter1’ and ‘nvvidconv1’. The modifications of changing NV12 to MJPG and video/x-raw to image/jpeg aren’t enough. I’m trying to figure out how to implement a fix in /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_source_bin.c to get a working deepstream-app. Who can help with example deepstream-app modifications for MJPG?

Is it running normally with the configuration from the previous topic, just with a slightly low frame rate? Could you attach a diff of your code changes?

Thank you for your help!

Starting situation

  • The v4l2-ctl format output lists MJPG. v4l2-ctl formats output.txt (6.5 KB)
    The detected USB camera (i.e., not a CSI camera) supports, e.g., 640x480 at 30 fps, which matters for RTSP performance.
  • The configuration file source1_usb_dec_infer_resnet_int8.txt has been modified to make use of the USB camera on /dev/video0 and RTSP. modified-source1_usb_dec_infer_resnet_int8.txt (4.1 KB)
  • Without any modification, the deepstream-app starts the pipeline (deepstream-app output.txt (52.1 KB)), but the console shows the following issues:
    1. “kPLAN_MAGIC_TAG failed”
    2. “deserialize engine from file /opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet18_trafficcamnet.etlt failed.”
    3. warning “The implicit batch dimension mode has been deprecated.”

Modified function create_camera_source_bin() in /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_source_bin.c

As suggested in How to use deepstream-app with MJPEG format stream? - #10 by yuweiw, I’ve modified the source with:

caps1 = gst_caps_new_simple ("image/jpeg",
          "width", G_TYPE_INT, config->source_width, "height", G_TYPE_INT,
          config->source_height, "framerate", GST_TYPE_FRACTION,
          config->source_fps_n, config->source_fps_d, NULL);

The freshly compiled deepstream-app seems to have the same issue as described in How to use deepstream-app with MJPEG format stream? - #12 by 936694123.

** ERROR: <create_camera_source_bin:173>: Failed to link 'src_cap_filter1' (image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)30/1) and 'nvvidconv1' (video/x-raw, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }; video/x-raw(ANY), format=(string){ ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ])
** ERROR: <create_camera_source_bin:225>: create_camera_source_bin failed
** ERROR: <create_pipeline:1863>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline

A modification with

caps1 = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING, "MJPEG",
          "width", G_TYPE_INT, config->source_width, "height", G_TYPE_INT,
          config->source_height, "framerate", GST_TYPE_FRACTION,
          config->source_fps_n, config->source_fps_d, NULL);

leads to the following result. As the caps in the error below show, v4l2src offers MJPG only as image/jpeg; there is no video/x-raw format called ‘MJPEG’, so this caps filter cannot match anything the source offers.

** ERROR: <create_camera_source_bin:170>: Failed to link 'src_elem' (image/jpeg; video/mpeg, mpegversion=(int)4, systemstream=(boolean)false; video/mpeg, mpegversion=(int){ 1, 2 }; video/mpegts, systemstream=(boolean)true; video/x-bayer, format=(string){ bggr, gbrg, grbg, rggb }, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-dv, systemstream=(boolean)true; video/x-fwht; video/x-h263, variant=(string)itu; video/x-h264, stream-format=(string){ byte-stream, avc }, alignment=(string)au; video/x-h265, stream-format=(string)byte-stream, alignment=(string)au; video/x-pwc1, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-pwc2, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string){ RGB16, BGR, RGB, ABGR, xBGR, RGBA, RGBx, GRAY8, GRAY16_LE, GRAY16_BE, YVU9, YV12, YUY2, YVYU, UYVY, Y42B, Y41B, YUV9, NV12_64Z32, NV24, NV61, NV16, NV21, NV12, I420, ARGB, xRGB, BGRA, BGRx, BGR15, RGB15 }, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-sonix, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-vp8; video/x-vp9; video/x-wmv, wmvversion=(int)3, format=(string)WVC1; video/x-raw(format:Interlaced), format=(string){ RGB16, BGR, RGB, ABGR, xBGR, RGBA, RGBx, GRAY8, GRAY16_LE, GRAY16_BE, YVU9, YV12, YUY2, YVYU, UYVY, Y42B, Y41B, YUV9, NV12_64Z32, NV24, NV61, NV16, NV21, NV12, I420, ARGB, xRGB, BGRA, BGRx, BGR15, RGB15 }, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], interlace-mode=(string)alternate) and 'src_cap_filter1' (video/x-raw, format=(string)MJPEG, width=(int)640, height=(int)480, framerate=(fraction)30/1)
** ERROR: <create_camera_source_bin:225>: create_camera_source_bin failed
** ERROR: <create_pipeline:1863>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline

Findings

DeepStream supports MJPG for dGPU according to Gst-nvjpegdec — DeepStream documentation 6.4 documentation (nvidia.com). The following programming elements aren’t used in the DeepStream 6.4 deepstream-app:

  • gst_element_factory_make ("jpegdec", "jpeg-decoder")
  • gst_bin_add_many (GST_BIN (bin->bin), bin->src_elem, jpg_dec, nvvidconv1, nvvidconv2, bin->cap_filter, NULL)
  • gst_element_link_filtered (bin->src_elem, jpg_dec, caps)
  • NVGSTDS_LINK_ELEMENT (jpg_dec, nvvidconv1) and NVGSTDS_LINK_ELEMENT (nvvidconv1, nvvidconv2)
How must the current implementation of create_camera_source_bin() in the NV_DS_SOURCE_CAMERA_V4L2 case be extended to support MJPEG from source to RTSP?
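
Based on the elements listed above, a minimal, untested sketch of such an extension could look like the following. The jpg_parse/jpg_dec names are illustrative, the error handling follows the goto-done pattern used elsewhere in deepstream_source_bin.c, and the bin->cap_filter1/nvvidconv1 variable names are inferred from the error messages above, so they may need adapting to the actual names in the file:

GstElement *jpg_parse = NULL, *jpg_dec = NULL;

/* Keep the camera caps as image/jpeg so v4l2src negotiates MJPG. */
caps1 = gst_caps_new_simple ("image/jpeg",
          "width", G_TYPE_INT, config->source_width, "height", G_TYPE_INT,
          config->source_height, "framerate", GST_TYPE_FRACTION,
          config->source_fps_n, config->source_fps_d, NULL);

/* Decode the JPEG frames back to raw video before nvvidconv1;
   on dGPU, "nvjpegdec" could be tried instead of "jpegdec". */
jpg_parse = gst_element_factory_make ("jpegparse", "jpeg-parser");
jpg_dec = gst_element_factory_make ("jpegdec", "jpeg-decoder");
if (!jpg_parse || !jpg_dec) {
  NVGSTDS_ERR_MSG_V ("Could not create jpegparse/jpegdec");
  goto done;
}
gst_bin_add_many (GST_BIN (bin->bin), jpg_parse, jpg_dec, NULL);

/* Replace the existing cap_filter1 -> nvvidconv1 link, so that the chain becomes:
   v4l2src -> src_cap_filter1 (image/jpeg) -> jpegparse -> jpegdec -> nvvidconv1 */
NVGSTDS_LINK_ELEMENT (bin->cap_filter1, jpg_parse);
NVGSTDS_LINK_ELEMENT (jpg_parse, jpg_dec);
NVGSTDS_LINK_ELEMENT (jpg_dec, nvvidconv1);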

Just from the log you attached, it’s not an issue with the camera caps negotiation; it’s a model issue.

ERROR: [TRT]: 1: [runtime.cpp::parsePlan::314] Error Code 1: Serialization (Serialization assertion plan->header.magicTag == rt::kPLAN_MAGIC_TAG failed.)

When you use DeepStream version 6.4, are you sure your setup is fine? See dGPU model Platform and OS Compatibility.

Thank you for asking, yes. I switched back to a relative path instead of the absolute path in source1_usb_dec_infer_resnet_int8.txt, which had caused issue 1.

Here is the customized source1_usb_dec_infer_resnet_int8.txt (3.8 KB).

On the custom dGPU setup, the USB camera video as an RTSP stream has been successfully tested from scratch with a side installation of the freedesktop.org RTSP server inside the deepstream:6.4-triton-multiarch container. This was already the case with the deepstream:6.4-triton-devel container.

But using the unmodified deepstream-app with the customized source1_usb_dec_infer_resnet_int8.txt, the RTSP stream functionality doesn’t work yet. On an initial start, there is still issue 2, see the new deepstream-app output (51.9 KB), which is not the case with a follow-up start (1.7 KB).

According to the research so far, I further have to exclude
a) model configuration issues,
b) missing container configuration steps to make use of MJPG in this RTSP scenario,
c) necessary deepstream-app modifications, and
d) VLC-related limitations, e.g. as mentioned in Deepstream-test1-rtsp-out doesn’t output RTSP stream on VLC - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums.
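
To rule out d) independently of VLC, the RTSP output can also be checked with a plain GStreamer client. An untested sketch, assuming the default rtsp://localhost:8554/ds-test URL that deepstream-app prints at startup:

    gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/ds-test latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink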

So after the code modification, it can run normally, but the fps printout is always 0, is that right? And are you working on a WSL2 host?
Does your device have a monitor? You can try to display on the screen and check that.

No, c) the deepstream-app modifications haven’t been accomplished yet. I’m trying to extend the C source (I found a few ideas here, here and here) and will report back on this thread. You helped me a lot to fix the configuration issues, many thanks!

To get an understanding of where to start with the modifications, I’m describing the findings so far.

Custom dGPU: create a pipeline with a USB camera in MJPG format, RTSP streaming with software-encoded H.264

Without inferencing

  • Ia) this works: compiled source test-launch.c from gst-rtsp-server/examples at master · GStreamer/gst-rtsp-server · GitHub in the DeepStream 6.4 docker container
    ./test-launch --gst-debug=3 '( v4l2src device=/dev/video0 ! image/jpeg,format=MJPG,width=640,height=480,framerate=30/1 ! jpegparse ! rtpjpegpay name=pay0 pt=96 )'
  • Ib1) this does not work, even with only a fakesink at the end.
    gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1,format=MJPG ! fakesink
    and
    gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1,format=MJPG ! jpegparse ! jpegdec ! x264enc tune=zerolatency ! mpegtsmux ! fakesink
  • Ib2) these do not work either (?? = element chain still to be researched; a candidate sketch follows after this list):
    • start with image/jpeg and RTSP streaming output as H.264:
      gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1,format=MJPG ! ?? ! rtph264pay name=pay0 pt=96
      or
    • start with video/x-h264 and RTSP streaming output as H.264:
      gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=640,height=480,framerate=30/1 ! ?? ! rtph264pay name=pay0 pt=96
      or
    • start with video/x-raw with format=MJPG and RTSP streaming output as H.264:
      gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1,format=MJPG ! ?? ! rtph264pay name=pay0 pt=96
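
For the image/jpeg variant, a candidate (untested) filling of the ?? could be to decode and re-encode to H.264 with standard GStreamer elements. Note that rtph264pay as pay0 only makes sense inside test-launch, not in a standalone gst-launch-1.0 run:

    ./test-launch '( v4l2src device=/dev/video0 ! image/jpeg,format=MJPG,width=640,height=480,framerate=30/1 ! jpegparse ! jpegdec ! videoconvert ! x264enc tune=zerolatency ! h264parse ! rtph264pay name=pay0 pt=96 )'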

With inferencing

  • IIa) this works: (no example yet)

  • IIb) this does not work on the custom dGPU, but luckily it works on Jetson. The example is from here and here:

    gst-launch-1.0 v4l2src device=/dev/video0 io-mode=2 ! 'image/jpeg,width=640,height=480,framerate=30/1,format=MJPG' ! jpegparse ! jpegdec ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! mux.sink_0 nvstreammux live-source=1 name=mux batch-size=1 width=640 height=480 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test1/dstest1_pgie_config.txt batch-size=1 ! nvmultistreamtiler rows=1 columns=1 width=640 height=480 ! nvvideoconvert ! nvdrmvideosink ! nveglglessink
    

    The example above (IIb) starts with image/jpeg, but does not yet end with RTSP streaming output as H.264. The following pipeline adaptation works with fakesink after nvvideoconvert. I didn’t find a valid pipeline ending so far using h264parse, rtph264pay, udpsink, or any other element.

    gst-launch-1.0 v4l2src device=/dev/video0 ! 'image/jpeg,width=640,height=480,framerate=30/1,format=MJPG' ! jpegparse ! jpegdec ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! mux.sink_0 nvstreammux live-source=1 name=mux batch-size=1 width=640 height=480 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test1/dstest1_pgie_config.txt batch-size=1 ! nvmultistreamtiler rows=1 columns=1 width=640 height=480 ! nvvideoconvert ! fakesink
    
    • Ending the pipeline with ! nvdsosd ! nvvideoconvert ! x264 ! rtph264pay ! udpsink results in udpsink0: client ::1:5400, reason: Error sending message: Cannot assign requested address […], i.e. an error sending UDP packets. This might be a Docker-related issue with localhost; see the workaround sketch after the attachments below.
      launch pipeline and screen output (3.1 KB)
      gst-inspect (579 Bytes)
      gst-inspect-output (9.1 KB)
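
      A possible workaround, assuming the ‘Cannot assign requested address’ stems from localhost resolving to IPv6 ::1 inside the container, is to force IPv4 on udpsink. An untested sketch of the pipeline ending (note the encoder element is x264enc; a plain ‘x264’ element does not exist):

      ... ! nvdsosd ! nvvideoconvert ! x264enc tune=zerolatency ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5400 sync=false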

Helpful weblinks


OK, if there are any new questions, please open a new topic. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.