Thank you for your help!
Starting situation
- The v4l2-ctl format output lists MJPG (attachment: v4l2-ctl formats output.txt, 6.5 KB).
- The detected USB camera (not a CSI camera) supports, with regard to RTSP performance, e.g. 640x480 at 30 fps.
- The configuration file source1_usb_dec_infer_resnet_int8.txt has been modified to use the USB camera on /dev/video0 and RTSP output; see the sketch after this list (attachment: modified-source1_usb_dec_infer_resnet_int8.txt, 4.1 KB).
- Without any modification, deepstream-app starts the pipeline (attachment: deepstream-app output.txt, 52.1 KB), but the console shows several issues:
- “kPLAN_MAGIC_TAG failed”
- “deserialize engine from file /opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet18_trafficcamnet.etlt failed.”
- warning: “The implicit batch dimension mode has been deprecated.”
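Since the attached configuration is not reproduced inline, the following is only a rough sketch of the groups that typically change for a 640x480@30 USB camera on /dev/video0 with RTSP output. The key names follow the deepstream-app configuration groups; the concrete values are assumptions and may differ from the attached file.

[source0]
enable=1
# 1 = CameraV4L2 (USB camera)
type=1
camera-width=640
camera-height=480
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=0

[sink1]
enable=1
# 4 = RTSPStreaming
type=4
# 1 = H.264
codec=1
bitrate=4000000
rtsp-port=8554
udp-port=5400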
Modified function create_camera_source_bin() in /opt/nvidia/deepstream/deepstream/sources/apps/apps-common/src/deepstream_source_bin.c
As suggested in How to use deepstream-app with MJPEG format stream? - #10 by yuweiw, I modified the source as follows:
caps1 = gst_caps_new_simple ("image/jpeg",
    "width", G_TYPE_INT, config->source_width,
    "height", G_TYPE_INT, config->source_height,
    "framerate", GST_TYPE_FRACTION, config->source_fps_n, config->source_fps_d,
    NULL);
The freshly compiled deepstream-app seems to run into the same issue as described in How to use deepstream-app with MJPEG format stream? - #12 by 936694123:
** ERROR: <create_camera_source_bin:173>: Failed to link 'src_cap_filter1' (image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)30/1) and 'nvvidconv1' (video/x-raw, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }; video/x-raw(ANY), format=(string){ ABGR64_LE, BGRA64_LE, AYUV64, ARGB64_LE, ARGB64, RGBA64_LE, ABGR64_BE, BGRA64_BE, ARGB64_BE, RGBA64_BE, GBRA_12LE, GBRA_12BE, Y412_LE, Y412_BE, A444_10LE, GBRA_10LE, A444_10BE, GBRA_10BE, A422_10LE, A422_10BE, A420_10LE, A420_10BE, RGB10A2_LE, BGR10A2_LE, Y410, GBRA, ABGR, VUYA, BGRA, AYUV, ARGB, RGBA, A420, AV12, Y444_16LE, Y444_16BE, v216, P016_LE, P016_BE, Y444_12LE, GBR_12LE, Y444_12BE, GBR_12BE, I422_12LE, I422_12BE, Y212_LE, Y212_BE, I420_12LE, I420_12BE, P012_LE, P012_BE, Y444_10LE, GBR_10LE, Y444_10BE, GBR_10BE, r210, I422_10LE, I422_10BE, NV16_10LE32, Y210, v210, UYVP, I420_10LE, I420_10BE, P010_10LE, NV12_10LE32, NV12_10LE40, P010_10BE, Y444, RGBP, GBR, BGRP, NV24, xBGR, BGRx, xRGB, RGBx, BGR, IYU2, v308, RGB, Y42B, NV61, NV16, VYUY, UYVY, YVYU, YUY2, I420, YV12, NV21, NV12, NV12_64Z32, NV12_4L4, NV12_32L32, Y41B, IYU1, YVU9, YUV9, RGB16, BGR16, RGB15, BGR15, RGB8P, GRAY16_LE, GRAY16_BE, GRAY10_LE32, GRAY8 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ])
** ERROR: <create_camera_source_bin:225>: create_camera_source_bin failed
** ERROR: <create_pipeline:1863>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline
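The caps listed for nvvidconv1 in this error contain only raw video (video/x-raw) formats, so an image/jpeg capsfilter cannot be linked to it directly; a JPEG decoder has to sit between the capsfilter and nvvidconv1. As a sanity check outside deepstream-app, the following minimal, standalone program (a sketch; device path, resolution and frame rate taken from above, everything else assumed) verifies that the camera's MJPEG stream can be captured and decoded before it reaches the converter:

/* mjpeg_check.c - standalone sketch, not part of deepstream-app.
 * Build: gcc mjpeg_check.c -o mjpeg_check $(pkg-config --cflags --libs gstreamer-1.0) */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GError *err = NULL;

  gst_init (&argc, &argv);

  /* v4l2src -> image/jpeg caps -> jpegdec -> nvvideoconvert -> NVMM/NV12.
   * "nvvideoconvert" and the NVMM caps can be replaced by "videoconvert" and
   * plain video/x-raw if only the capture/decode path is of interest. */
  GstElement *pipeline = gst_parse_launch (
      "v4l2src device=/dev/video0 "
      "! image/jpeg,width=640,height=480,framerate=30/1 "
      "! jpegdec ! nvvideoconvert ! video/x-raw(memory:NVMM),format=NV12 ! fakesink",
      &err);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", err ? err->message : "unknown");
    g_clear_error (&err);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run for 5 seconds; any error or EOS within that time is reported. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, 5 * GST_SECOND,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL) {
    g_printerr ("Pipeline stopped early (error or EOS); check caps and device.\n");
    gst_message_unref (msg);
  } else {
    g_print ("MJPEG capture and decode ran for 5 s without errors.\n");
  }

  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}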
A modification with

caps1 = gst_caps_new_simple ("video/x-raw",
    "format", G_TYPE_STRING, "MJPEG",
    "width", G_TYPE_INT, config->source_width,
    "height", G_TYPE_INT, config->source_height,
    "framerate", GST_TYPE_FRACTION, config->source_fps_n, config->source_fps_d,
    NULL);

leads to the following result:
** ERROR: <create_camera_source_bin:170>: Failed to link 'src_elem' (image/jpeg; video/mpeg, mpegversion=(int)4, systemstream=(boolean)false; video/mpeg, mpegversion=(int){ 1, 2 }; video/mpegts, systemstream=(boolean)true; video/x-bayer, format=(string){ bggr, gbrg, grbg, rggb }, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-dv, systemstream=(boolean)true; video/x-fwht; video/x-h263, variant=(string)itu; video/x-h264, stream-format=(string){ byte-stream, avc }, alignment=(string)au; video/x-h265, stream-format=(string)byte-stream, alignment=(string)au; video/x-pwc1, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-pwc2, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, format=(string){ RGB16, BGR, RGB, ABGR, xBGR, RGBA, RGBx, GRAY8, GRAY16_LE, GRAY16_BE, YVU9, YV12, YUY2, YVYU, UYVY, Y42B, Y41B, YUV9, NV12_64Z32, NV24, NV61, NV16, NV21, NV12, I420, ARGB, xRGB, BGRA, BGRx, BGR15, RGB15 }, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-sonix, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-vp8; video/x-vp9; video/x-wmv, wmvversion=(int)3, format=(string)WVC1; video/x-raw(format:Interlaced), format=(string){ RGB16, BGR, RGB, ABGR, xBGR, RGBA, RGBx, GRAY8, GRAY16_LE, GRAY16_BE, YVU9, YV12, YUY2, YVYU, UYVY, Y42B, Y41B, YUV9, NV12_64Z32, NV24, NV61, NV16, NV21, NV12, I420, ARGB, xRGB, BGRA, BGRx, BGR15, RGB15 }, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], interlace-mode=(string)alternate) and 'src_cap_filter1' (video/x-raw, format=(string)MJPEG, width=(int)640, height=(int)480, framerate=(fraction)30/1)
** ERROR: <create_camera_source_bin:225>: create_camera_source_bin failed
** ERROR: <create_pipeline:1863>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline
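This second attempt fails one link earlier, between src_elem and src_cap_filter1: “MJPEG” does not appear in the format list that v4l2src offers for video/x-raw (see the caps in the error above); the camera exposes MJPEG under the image/jpeg media type instead. A small check written for illustration only (not from deepstream-app) makes this visible:

/* caps_check.c - illustration only.
 * Build: gcc caps_check.c -o caps_check $(pkg-config --cflags --libs gstreamer-1.0) */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *src = gst_element_factory_make ("v4l2src", NULL);
  if (src == NULL) {
    g_printerr ("v4l2src not available\n");
    return -1;
  }
  GstPad *srcpad = gst_element_get_static_pad (src, "src");
  GstCaps *templ = gst_pad_query_caps (srcpad, NULL);

  GstCaps *raw_mjpeg = gst_caps_from_string ("video/x-raw, format=(string)MJPEG");
  GstCaps *jpeg = gst_caps_from_string ("image/jpeg");

  /* Expected: 0 for video/x-raw with format MJPEG, 1 for image/jpeg. */
  g_print ("video/x-raw format=MJPEG accepted: %d\n",
      gst_caps_can_intersect (templ, raw_mjpeg));
  g_print ("image/jpeg accepted: %d\n",
      gst_caps_can_intersect (templ, jpeg));

  gst_caps_unref (raw_mjpeg);
  gst_caps_unref (jpeg);
  gst_caps_unref (templ);
  gst_object_unref (srcpad);
  gst_object_unref (src);
  return 0;
}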
Findings
DeepStream supports MJPEG on dGPU according to Gst-nvjpegdec — DeepStream documentation 6.4 documentation (nvidia.com). However, programming elements like

gst_element_factory_make ("jpegdec", "jpeg-decoder");
gst_bin_add_many (GST_BIN (bin->bin), bin->src_elem, jpg_dec, nvvidconv1, nvvidconv2, bin->cap_filter, NULL);
gst_element_link_filtered (bin->src_elem, jpg_dec, caps);
NVGSTDS_LINK_ELEMENT (jpg_dec, nvvidconv1);
NVGSTDS_LINK_ELEMENT (nvvidconv1, nvvidconv2);

are not used in the DeepStream 6.4 deepstream-app.
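For reference, a possible extension of the NV_DS_SOURCE_CAMERA_V4L2 branch of create_camera_source_bin() that combines exactly these elements could look like the sketch below. It is untested; names such as bin->cap_filter1, nvvidconv1 and the goto-done error handling are assumptions taken from the error messages above and the usual deepstream-app source layout, and the actual DeepStream 6.4 code may differ.

/* Sketch only: add a JPEG decoder between the image/jpeg capsfilter and
 * nvvidconv1. caps1 stays on "image/jpeg" (as in the first modification above)
 * so that v4l2src selects the camera's MJPG format. */
GstElement *jpg_dec = NULL;

/* Software JPEG decoder; "nvjpegdec" (or "nvv4l2decoder" with mjpeg=1) could be
 * tried instead where hardware JPEG decode is available. */
jpg_dec = gst_element_factory_make ("jpegdec", "jpeg-decoder");
if (!jpg_dec) {
  NVGSTDS_ERR_MSG_V ("Could not create 'jpeg-decoder'");
  goto done;
}
gst_bin_add (GST_BIN (bin->bin), jpg_dec);

/* New element order: v4l2src -> capsfilter (image/jpeg) -> jpegdec -> nvvidconv1
 * -> ...  The decoder turns image/jpeg into video/x-raw, which nvvidconv1
 * accepts; linking the image/jpeg capsfilter to nvvidconv1 directly fails, as
 * shown above. */
NVGSTDS_LINK_ELEMENT (bin->src_elem, bin->cap_filter1);
NVGSTDS_LINK_ELEMENT (bin->cap_filter1, jpg_dec);
NVGSTDS_LINK_ELEMENT (jpg_dec, nvvidconv1);

Whether nvjpegdec can take the place of jpegdec here, as the Gst-nvjpegdec documentation suggests for dGPU, would still have to be verified on the target platform.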
How must the current implementation of create_camera_source_bin() for the NV_DS_SOURCE_CAMERA_V4L2 case be extended to support MJPEG from the source through to the RTSP output?