Input buffer number of surfaces (0) must be equal to mux->num_surfaces_per_frame (1)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) AGX
• DeepStream Version
• JetPack Version (valid for Jetson only) 4.3
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Sample application and the configuration file content
• Reproduce steps
• Reproducing rate and duration

Hi!
I’m trying to run an RTSP stream from an Axis camera.
If I set uri="rtsp://log:pass@192.168.90.90:554/axis-media/media.amp", everything works fine and the console logs:

Now playing:Creating LL OSD context new
Decodebin child added: source
Running…
Decodebin child added: decodebin0
Decodebin child added: rtph264depay0
Decodebin child added: h264parse0
Decodebin child added: capsfilter0
Decodebin child added: nvv4l2decoder0
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
In cb_newpad
Creating LL OSD context new

If I set uri="rtsp://log:pass@192.168.90.90:554/axis-media/media.amp?videocodec=jpeg", the logs are:

Now playing:Creating LL OSD context new
Decodebin child added: source
Running…
Decodebin child added: decodebin0
Decodebin child added: rtpjpegdepay0
Decodebin child added: nvjpegdec0
In cb_newpad
Creating LL OSD context new
ERROR from element stream-muxer: Input buffer number of surfaces (0) must be equal to mux->num_surfaces_per_frame (1)
Set nvstreammux property num-surfaces-per-frame appropriately

Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmultistream/gstnvstreammux.c(309): gst_nvstreammux_chain (): /GstPipeline:mondoose-c-pipeline/GstNvStreamMux:stream-muxer
Returned, stopping playback
Deleting pipeline

What’s wrong?
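(For reference, the property the error message points at can be set on the muxer like this; a minimal sketch, where streammux stands for the nvstreammux instance named stream-muxer in the log above:)

/* Sketch only: "streammux" is assumed to be the nvstreammux element
 * ("stream-muxer" in the log); 1 matches mux->num_surfaces_per_frame
 * from the error text. */
g_object_set (G_OBJECT (streammux), "num-surfaces-per-frame", 1, NULL);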

Can you fill in the template with the platform and software information? It will save us a lot of time in understanding what is happening.

• Hardware Platform (Jetson / GPU) AGX
• DeepStream Version
• JetPack Version (valid for Jetson only) 4.3
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Sample application and the configuration file content
• Reproduce steps
• Reproducing rate and duration

May I assume you are using deepstream-app? If so, the current deepstream-app can only support RTSP streams with h264 or h265 video. If you want to support mjpeg video, please refer to: Deepstream test 5 not able to process mpeg rtsp stream

Hi, @Fiona.Chen! Thank you for the reply!

I’ve used code based on deepstream_test3_app.c.
I tried copying /opt/nvidia/deepstream/deepstream-5.0/sources/apps/apps-common/src/deepstream_source_bin.c into my includes directory and added printing of encoding_name to cb_rtspsrc_select_stream(), but nothing is printed to the console.
I used uridecodebin and the following functions:

static GstElement *
create_source_bin (guint index, const gchar * uri)
{
  GstElement *bin = NULL, *uri_decode_bin = NULL;
  gchar bin_name[16] = { };

  g_snprintf (bin_name, 15, "source-bin-%02d", index);
  /* Create a source GstBin to abstract this bin's content from the rest of the
   * pipeline */
  bin = gst_bin_new (bin_name);

  /* Source element for reading from the uri.
   * We will use decodebin and let it figure out the container format of the
   * stream and the codec and plug the appropriate demux and decode plugins. */
  uri_decode_bin = gst_element_factory_make ("uridecodebin", "uri-decode-bin");

  if (!bin || !uri_decode_bin) {
    g_printerr ("One element in source bin could not be created.\n");
    return NULL;
  }

  /* We set the input uri to the source element */
  g_object_set (G_OBJECT (uri_decode_bin), "uri", uri, NULL);

  /* Connect to the "pad-added" signal of the decodebin which generates a
   * callback once a new pad for raw data has been created by the decodebin */
  g_signal_connect (G_OBJECT (uri_decode_bin), "pad-added",
      G_CALLBACK (cb_newpad), bin);
  g_signal_connect (G_OBJECT (uri_decode_bin), "child-added",
      G_CALLBACK (decodebin_child_added), bin);

  gst_bin_add (GST_BIN (bin), uri_decode_bin);

  /* We need to create a ghost pad for the source bin which will act as a proxy
   * for the video decoder src pad. The ghost pad will not have a target right
   * now. Once the decode bin creates the video decoder and generates the
   * cb_newpad callback, we will set the ghost pad target to the video decoder
   * src pad. */
  if (!gst_element_add_pad (bin, gst_ghost_pad_new_no_target ("src",
              GST_PAD_SRC))) {
    g_printerr ("Failed to add ghost pad in source bin\n");
    return NULL;
  }
  return bin;
}
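For context, the bin returned above is wired to the muxer the same way as in deepstream_test3_app.c; a minimal usage sketch, assuming pipeline, streammux, and uri already exist (the pad name "sink_0" is illustrative):

GstElement *source_bin = create_source_bin (0, uri);
gst_bin_add (GST_BIN (pipeline), source_bin);

/* Request a sink pad on nvstreammux and link the bin's ghost "src" pad. */
GstPad *sinkpad = gst_element_get_request_pad (streammux, "sink_0");
GstPad *srcpad = gst_element_get_static_pad (source_bin, "src");
if (gst_pad_link (srcpad, sinkpad) != GST_PAD_LINK_OK)
  g_printerr ("Failed to link source bin to stream muxer.\n");
gst_object_unref (srcpad);
gst_object_unref (sinkpad);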

static void
cb_newpad (GstElement * decodebin, GstPad * decoder_src_pad, gpointer data)
{
  g_print ("In cb_newpad\n");
  GstCaps *caps = gst_pad_get_current_caps (decoder_src_pad);
  const GstStructure *str = gst_caps_get_structure (caps, 0);
  const gchar *name = gst_structure_get_name (str);
  GstElement *source_bin = (GstElement *) data;
  GstCapsFeatures *features = gst_caps_get_features (caps, 0);

  /* Need to check if the pad created by the decodebin is for video and not
   * audio. */
  if (!strncmp (name, "video", 5)) {
    /* Link the decodebin pad only if decodebin has picked nvidia
     * decoder plugin nvdec_*. We do this by checking if the pad caps contain
     * NVMM memory features. */
    if (gst_caps_features_contains (features, GST_CAPS_FEATURES_NVMM)) {
      /* Get the source bin ghost pad */
      GstPad *bin_ghost_pad = gst_element_get_static_pad (source_bin, "src");
      if (!gst_ghost_pad_set_target (GST_GHOST_PAD (bin_ghost_pad),
              decoder_src_pad)) {
        g_printerr ("Failed to link decoder src pad to source bin ghost pad\n");
      }
      gst_object_unref (bin_ghost_pad);
    } else {
      g_printerr ("Error: Decodebin did not pick nvidia decoder plugin.\n");
    }
  }
  /* gst_pad_get_current_caps() returned a new reference; release it. */
  gst_caps_unref (caps);
}

static void
decodebin_child_added (GstChildProxy * child_proxy, GObject * object,
    gchar * name, gpointer user_data)
{
  g_print ("Decodebin child added: %s\n", name);
  if (g_strrstr (name, "decodebin") == name) {
    g_signal_connect (G_OBJECT (object), "child-added",
        G_CALLBACK (decodebin_child_added), user_data);
  }
}

Can you try the following pipeline with command? Are you using Jetson board?

gst-launch-1.0 uridecodebin uri=rtsp://xxxxx ! nvvideoconvert ! nvegltransform ! nveglglessink

And I have one more problem:
In gst-example I print frame_meta->buf_pts every second, but the delta between two neighboring buf_pts values is ~0.6 s.
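(For reference, a sketch of how these deltas could be measured, assuming a buffer probe installed on the nvstreammux src pad; gstnvdsmeta.h comes from the DeepStream SDK:)

#include <gst/gst.h>
#include "gstnvdsmeta.h"

/* Sketch: print buf_pts for every frame in the batch; the difference
 * between consecutive prints is the delta discussed above. */
static GstPadProbeReturn
pts_probe (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;
  for (NvDsMetaList * l = batch_meta->frame_meta_list; l; l = l->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l->data;
    g_print ("buf_pts = %" G_GUINT64_FORMAT " ns\n", frame_meta->buf_pts);
  }
  return GST_PAD_PROBE_OK;
}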

One topic for one problem, please create new topic for new problem. Please use template to provide information when creating new topic.

Can you try the following pipeline with command? Are you using Jetson board?

Yes, I am using a Jetson Xavier AGX.

gst-launch-1.0 uridecodebin uri=rtsp://xxxxx ! nvvideoconvert ! nvegltransform ! nveglglessink

This command runs with h264 and fails with jpeg, but nothing is printed to the console.

static gboolean
cb_rtspsrc_select_stream (GstElement *rtspsrc, guint num, GstCaps *caps,
        gpointer user_data)
{
  GstStructure *str = gst_caps_get_structure (caps, 0);
  const gchar *media = gst_structure_get_string (str, "media");
  const gchar *encoding_name = gst_structure_get_string (str, "encoding-name");
  gchar elem_name[50];
  NvDsSrcBin *bin = (NvDsSrcBin *) user_data;
  gboolean ret = FALSE;

  gboolean is_video = (!g_strcmp0 (media, "video"));

  if (!is_video)
    return FALSE;

  /* Create and add depay element only if it is not created yet. */
  if (!bin->depay) {
    g_snprintf (elem_name, sizeof (elem_name), "depay_elem%d", bin->bin_id);

    /* Add the proper depay element based on codec. */
    g_print("Encoding name is %s\n", encoding_name);
    if (!g_strcmp0 (encoding_name, "H264")) {
      bin->depay = gst_element_factory_make ("rtph264depay", elem_name);
    } else if (!g_strcmp0 (encoding_name, "H265")) {
      bin->depay = gst_element_factory_make ("rtph265depay", elem_name);
    } else {
      NVGSTDS_WARN_MSG_V ("%s not supported", encoding_name);
      return FALSE;
    }
    /* ... function continues as in deepstream_source_bin.c ... */
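As an aside, the dispatch above could be extended with an MJPEG branch in the same pattern; a hypothetical sketch, not part of the stock deepstream_source_bin.c:

/* Hypothetical helper: the same codec dispatch with an extra branch for
 * MJPEG (RTP encoding-name "JPEG" per RFC 2435). Illustrative only. */
static GstElement *
make_depay_for_encoding (const gchar * encoding_name, const gchar * elem_name)
{
  if (!g_strcmp0 (encoding_name, "H264"))
    return gst_element_factory_make ("rtph264depay", elem_name);
  if (!g_strcmp0 (encoding_name, "H265"))
    return gst_element_factory_make ("rtph265depay", elem_name);
  if (!g_strcmp0 (encoding_name, "JPEG"))
    return gst_element_factory_make ("rtpjpegdepay", elem_name);
  return NULL;                  /* unsupported codec */
}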

In console with h264:

Setting pipeline to PAUSED …
Using winsys: x11
Pipeline is live and does not need PREROLL …
Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://xxxx/axis-media/media.amp
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261

In console with jpeg:

Setting pipeline to PAUSED …

Using winsys: x11
Pipeline is live and does not need PREROLL …
Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://xxxx/axis-media/media.amp?videocodec=jpeg
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
ERROR: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source/GstUDPSrc:udpsrc1:
streaming stopped, reason error (-5)
Execution ended after 0:00:01.497170210
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

I get video with

gst-launch-1.0 -e uridecodebin uri="rtsp://xxxx/axis-media/media.amp?videocodec=jpeg" ! nvvidconv ! jpegenc ! matroskamux ! autovideosink

Are you using DeepStream 5.0 GA?
Can you try the following command and give me the output?
gst-inspect-1.0 nvv4l2decoder

I am using DS4.

gst-inspect-1.0 nvv4l2decoder

Factory Details:
Rank primary + 11 (267)
Long-name NVIDIA v4l2 video decoder
Klass Codec/Decoder/Video
Description Decode video streams via V4L2 API
Author Nicolas Dufresne nicolas.dufresne@collabora.com, Viranjan Pagar vpagar@nvidia.com

Plugin Details:
Name nvvideo4linux2
Description Nvidia elements for Video 4 Linux
Filename /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideo4linux2.so
Version 1.14.0
License LGPL
Source module nvvideo4linux2
Binary package nvvideo4linux2
Origin URL http://nvidia.com/

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstVideoDecoder
                         +----GstNvV4l2VideoDec
                               +----nvv4l2decoder

Pad Templates:
SRC template: ‘src’
Availability: Always
Capabilities:
video/x-raw(memory:NVMM)
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]

SINK template: ‘sink’
Availability: Always
Capabilities:
image/jpeg
video/x-h264
stream-format: { (string)byte-stream }
alignment: { (string)au }
video/x-h265
stream-format: { (string)byte-stream }
alignment: { (string)au }
video/mpeg
mpegversion: 4
systemstream: false
parsed: true
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
video/mpeg
mpegversion: [ 1, 2 ]
systemstream: false
parsed: true
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
video/x-divx
divxversion: [ 4, 5 ]
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
video/x-vp8
video/x-vp9
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]

Element has no clocking capabilities.
Element has no URI handling capabilities.

Pads:
SINK: ‘sink’
Pad Template: ‘sink’
SRC: ‘src’
Pad Template: ‘src’

Element Properties:
name : The name of the object
flags: readable, writable
String. Default: “nvv4l2decoder0”
parent : The parent of the object
flags: readable, writable
Object of type “GstObject”
device : Device location
flags: readable
String. Default: “/dev/nvhost-nvdec”
Opening in BLOCKING MODE
device-name : Name of the device
flags: readable
String. Default: ""
device-fd : File descriptor of the device
flags: readable
Integer. Range: -1 - 2147483647 Default: -1
output-io-mode : Output side I/O mode (matches sink pad)
flags: readable, writable
Enum “GstNvV4l2DecOutputIOMode” Default: 0, “auto”
(0): auto - GST_V4L2_IO_AUTO
(2): mmap - GST_V4L2_IO_MMAP
(3): userptr - GST_V4L2_IO_USERPTR
capture-io-mode : Capture I/O mode (matches src pad)
flags: readable, writable
Enum “GstNvV4l2DecCaptureIOMode” Default: 0, “auto”
(0): auto - GST_V4L2_IO_AUTO
(2): mmap - GST_V4L2_IO_MMAP
extra-controls : Extra v4l2 controls (CIDs) for the device
flags: readable, writable
Boxed pointer of type “GstStructure”
skip-frames : Type of frames to skip during decoding
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
Enum “SkipFrame” Default: 0, “decode_all”
(0): decode_all - Decode all frames
(1): decode_non_ref - Decode non-ref frames
(2): decode_key - decode key frames
drop-frame-interval : Interval to drop the frames,ex: value of 5 means every 5th frame will be given by decoder, rest all dropped
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 30 Default: 0
num-extra-surfaces : Additional number of surfaces in addition to min decode surfaces given by the v4l2 driver
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 24 Default: 1
disable-dpb : Set to disable DPB buffer for low latency
flags: readable, writable
Boolean. Default: false
enable-full-frame : Whether or not the data is full framed
flags: readable, writable
Boolean. Default: false
enable-frame-type-reporting: Set to enable frame type reporting
flags: readable, writable
Boolean. Default: false
enable-error-check : Set to enable error check
flags: readable, writable
Boolean. Default: false
enable-max-performance: Set to enable max performance
flags: readable, writable
Boolean. Default: false
mjpeg : Set to open MJPEG block
flags: readable, writable
Boolean. Default: false
bufapi-version : Set to use new buf API
flags: readable, writable
Boolean. Default: false

Can you try the following command? Please do not use nvvidconv, it is not part of DeepStream.

gst-launch-1.0 rtspsrc location=rtsp://xxxx ! rtpjpegdepay ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvvideoconvert ! nvegltransform ! nveglglessink
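For anyone assembling this chain in code instead of gst-launch, a minimal sketch of an equivalent bin (element names taken from the command above; mjpeg=1 corresponds to the mjpeg property in the gst-inspect output earlier in the thread):

#include <gst/gst.h>

/* Sketch only: builds rtpjpegdepay ! jpegparse ! nvv4l2decoder mjpeg=1
 * as a reusable bin, assuming the elements exist as shown by gst-inspect. */
static GstElement *
create_mjpeg_decode_bin (void)
{
  GstElement *bin = gst_bin_new ("mjpeg-decode-bin");
  GstElement *depay = gst_element_factory_make ("rtpjpegdepay", NULL);
  GstElement *parse = gst_element_factory_make ("jpegparse", NULL);
  GstElement *dec = gst_element_factory_make ("nvv4l2decoder", NULL);
  GstPad *sinkpad, *srcpad;

  if (!bin || !depay || !parse || !dec)
    return NULL;

  /* "mjpeg : Set to open MJPEG block" per the gst-inspect output above. */
  g_object_set (G_OBJECT (dec), "mjpeg", TRUE, NULL);

  gst_bin_add_many (GST_BIN (bin), depay, parse, dec, NULL);
  if (!gst_element_link_many (depay, parse, dec, NULL))
    return NULL;

  /* Ghost pads let the bin be linked like a single element, e.g. between
   * rtspsrc and nvvideoconvert. */
  sinkpad = gst_element_get_static_pad (depay, "sink");
  srcpad = gst_element_get_static_pad (dec, "src");
  gst_element_add_pad (bin, gst_ghost_pad_new ("sink", sinkpad));
  gst_element_add_pad (bin, gst_ghost_pad_new ("src", srcpad));
  gst_object_unref (sinkpad);
  gst_object_unref (srcpad);
  return bin;
}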

gst-launch-1.0 rtspsrc location="rtsp://xxxx/axis-media/media.amp?videocodec=jpeg" ! rtpjpegdepay ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvvideoconvert ! nvegltransform ! nveglglessink

Setting pipeline to PAUSED …

Using winsys: x11
Opening in BLOCKING MODE
Pipeline is live and does not need PREROLL …
Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://xxxx/axis-media/media.amp?videocodec=jpeg
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
NvMMLiteOpen : Block : BlockType = 277
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 277
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0:
streaming stopped, reason error (-5)
Execution ended after 0:00:02.205456767
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Sorry, I replaced nveglglessink with fakesink:

gst-launch-1.0 rtspsrc location="rtsp://root:nvidia@192.168.90.90:554/axis-media/media.amp?videocodec=jpeg" ! rtpjpegdepay ! jpegparse ! nvv4l2decoder mjpeg=1 ! nvvideoconvert ! nvegltransform ! fakesink

Setting pipeline to PAUSED …
Opening in BLOCKING MODE
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://root:nvidia@192.168.90.90:554/axis-media/media.amp?videocodec=jpeg
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
NvMMLiteOpen : Block : BlockType = 277
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 277
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc0:
streaming stopped, reason error (-5)
Execution ended after 0:00:02.164146046
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Please upgrade to DeepStream 5.0 GA. We will not fix any problem with DS 4.0 now.

I’m not sure what the format is. Can you get the pipeline graph for the pipeline that does run?
gst-launch-1.0 -e uridecodebin uri="rtsp://xxxx/axis-media/media.amp?videocodec=jpeg" ! nvvidconv ! jpegenc ! matroskamux ! autovideosink
The method is described in the "Getting pipeline graphs" section of this link: Basic tutorial 11: Debugging tools
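For a quick start: export GST_DEBUG_DUMP_DOT_DIR=/tmp before running gst-launch-1.0, or dump the graph from code (a sketch; the file name "pipeline" is arbitrary):

/* Writes $GST_DEBUG_DUMP_DOT_DIR/pipeline.dot for the current pipeline
 * state; render it with: dot -Tpng pipeline.dot -o pipeline.png */
GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline), GST_DEBUG_GRAPH_SHOW_ALL,
    "pipeline");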

Or you can refer to this link:

If the pipeline in Deepstream test 5 not able to process mpeg rtsp stream cannot work on your board, there is no way to run your case with DeepStream.