Uridecodebin and uridecodebin3 for video/audio deepstream pipeline

Please provide complete information as applicable to your setup.

• Hardware Platform (GPU)
• DeepStream Version 6.1 using docker image

Hi, I need your suggestion

This is the DeepStream pipeline I am using for both of my questions below.

  1. Why does my pipeline not work with the uridecodebin element, and how can I debug it?
    I can link uridecodebin to video_queue, but linking to audio_queue fails.

This is the code sample:
GstElement *src_bin = gst_element_factory_make ("uridecodebin", bin_name);

cb_newpad:

static void cb_newpad(GstElement *decodebin, GstPad *pad, gpointer data) {
  GstCaps *caps = gst_pad_query_caps (pad, NULL);
  const GstStructure *str = gst_caps_get_structure (caps, 0);
  const gchar *name = gst_structure_get_name (str);

  gchar *pad_name = gst_pad_get_name(pad);
  g_print("New pad added: %s\n", pad_name);
  g_free(pad_name);   /* gst_pad_get_name() returns an allocated string */

  NvDsSrcBin* this_bin = (NvDsSrcBin*) data;
  GstElement *next_elem = NULL;

  if (!strncmp (name, "video", 5)) {
    next_elem = (GstElement *) this_bin->video_dec_queue;
  }
  else if (!strncmp (name, "audio", 5)) {
    next_elem = (GstElement *) this_bin->audio_dec_queue;
  }

  if (next_elem == NULL) {   /* neither audio nor video: leave the pad unlinked */
    gst_caps_unref (caps);
    return;
  }

  GstPad *sinkpad = gst_element_get_static_pad (next_elem, "sink");
  if (gst_pad_link (pad, sinkpad) != GST_PAD_LINK_OK) {
    app_logger->error("[pipeline][%s][%s] Failed to link decodebin to pipeline", this_bin->source_id.c_str(), name);
  } else {
    app_logger->debug("[pipeline][%s][%s] Linked element %s successfully ...", this_bin->source_id.c_str(), name, GST_ELEMENT_NAME(next_elem));
  }

  gst_object_unref (sinkpad);   /* unref on both success and failure */
  gst_caps_unref (caps);
}
  2. If I change uridecodebin to uridecodebin3 and run the pipeline with seek_decode (video looping) implemented, the pipeline works for a short time and then the FPS drops to zero for some video streams with no apparent reason. Do you have any ideas?

It depends on your stream. Maybe your stream does not contain an audio track.

It acts like a demuxer, so it offers as many source pads as streams are found in the media.

https://gstreamer.freedesktop.org/documentation/tutorials/basic/handy-elements.html?gi-language=c#uridecodebin

This should be related to the GST_MESSAGE_EOS message; you can try monitoring it in bus_call.
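A minimal sketch of such a bus watch, assuming the pipeline handle is passed as user data when the watch is installed; seeking back to the start on EOS is shown as one possible way to implement the loop (your seek_decode logic may differ):

```c
#include <gst/gst.h>

/* Minimal bus watch: log EOS so you can see whether the FPS drop
 * coincides with an end-of-stream message from uridecodebin3. */
static gboolean
bus_call (GstBus *bus, GstMessage *msg, gpointer data)
{
  GstElement *pipeline = GST_ELEMENT (data);

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_EOS:
      g_print ("EOS from %s\n", GST_OBJECT_NAME (msg->src));
      /* One way to loop the file: flush-seek back to the start on EOS. */
      if (!gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME,
              GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
              GST_SEEK_TYPE_SET, 0,
              GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE))
        g_printerr ("Seek to start failed\n");
      break;
    case GST_MESSAGE_ERROR: {
      GError *err = NULL;
      gchar *dbg = NULL;
      gst_message_parse_error (msg, &err, &dbg);
      g_printerr ("Error: %s (%s)\n", err->message, dbg ? dbg : "no debug info");
      g_clear_error (&err);
      g_free (dbg);
      break;
    }
    default:
      break;
  }
  return TRUE; /* keep the watch installed */
}

/* Install with:
 * gst_bus_add_watch (gst_pipeline_get_bus (GST_PIPELINE (pipeline)),
 *                    bus_call, pipeline);
 */
```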

In addition, if you want the file-loop functionality, we recommend using nvurisrcbin and setting its file-loop property to 1.

Use gst-inspect-1.0 nvurisrcbin to view the relevant properties.
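A hedged sketch of swapping the source for nvurisrcbin with file-loop enabled — the helper name is made up, and you should confirm the property names for your DeepStream version with gst-inspect-1.0 nvurisrcbin:

```c
#include <gst/gst.h>

/* Create an nvurisrcbin that loops a local file.  Like uridecodebin it is a
 * bin with dynamic pads, so the existing pad-added callback can be reused. */
static GstElement *
make_looping_source (const gchar *uri, const gchar *bin_name)
{
  GstElement *src_bin = gst_element_factory_make ("nvurisrcbin", bin_name);
  if (!src_bin)
    return NULL;

  g_object_set (G_OBJECT (src_bin),
      "uri", uri,
      "file-loop", TRUE,   /* loop file sources instead of emitting EOS */
      NULL);

  return src_bin;
}
```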


It depends on your stream. Maybe your stream does not contain an audio track.

My stream has an audio track, but it still cannot link.

Do you have an AAC codec installed? Check with:

 gst-inspect-1.0 | grep aac

If the AAC decoder is missing, run:

/opt/nvidia/deepstream/deepstream/user_additional_install.sh
root@0fcc94d610f0:/workspace/AsillaSDK_Client#  gst-inspect-1.0 |grep aac
audioparsers:  aacparse: AAC audio stream parser
fdkaac:  fdkaacdec: FDK AAC audio decoder
fdkaac:  fdkaacenc: FDK AAC audio encoder
libav:  avdec_aac: libav AAC (Advanced Audio Coding) decoder
libav:  avdec_aac_fixed: libav AAC (Advanced Audio Coding) decoder
libav:  avdec_aac_latm: libav AAC LATM (Advanced Audio Coding LATM syntax) decoder
libav:  avenc_aac: libav AAC (Advanced Audio Coding) encoder
libav:  avmux_adts: libav ADTS AAC (Advanced Audio Coding) muxer (not recommended, use aacparse instead)
typefindfunctions: audio/aac: aac, adts, adif, loas
voaacenc:  voaacenc: AAC audio encoder

Because I can link the elements when using uridecodebin3, I think the environment is OK.

This is a patch for deepstream_test3_app.c, which works fine. I have tested it on DS-7.0

diff --git a/sources/apps/sample_apps/deepstream-test3/deepstream_test3_app.c b/sources/apps/sample_apps/deepstream-test3/deepstream_test3_app.c
index 4c2cf00..f614a46 100644
--- a/sources/apps/sample_apps/deepstream-test3/deepstream_test3_app.c
+++ b/sources/apps/sample_apps/deepstream-test3/deepstream_test3_app.c
@@ -66,7 +66,6 @@ gchar pgie_classes_str[4][32] = { "Vehicle", "TwoWheeler", "Person",
 
 static gboolean PERF_MODE = FALSE;
 
-
 /* tiler_sink_pad_buffer_probe  will extract metadata received on OSD sink pad
  * and update params for drawing rectangle, object information etc. */
 
@@ -208,7 +207,7 @@ cb_newpad (GstElement * decodebin, GstPad * decoder_src_pad, gpointer data)
      * NVMM memory features. */
     if (gst_caps_features_contains (features, GST_CAPS_FEATURES_NVMM)) {
       /* Get the source bin ghost pad */
-      GstPad *bin_ghost_pad = gst_element_get_static_pad (source_bin, "src");
+      GstPad *bin_ghost_pad = gst_element_get_static_pad (source_bin, "vsrc");
       if (!gst_ghost_pad_set_target (GST_GHOST_PAD (bin_ghost_pad),
               decoder_src_pad)) {
         g_printerr ("Failed to link decoder src pad to source bin ghost pad\n");
@@ -217,6 +216,14 @@ cb_newpad (GstElement * decodebin, GstPad * decoder_src_pad, gpointer data)
     } else {
       g_printerr ("Error: Decodebin did not pick nvidia decoder plugin.\n");
     }
+  } else if (!strncmp (name, "audio", 5)) {
+    GstPad *bin_ghost_pad = gst_element_get_static_pad (source_bin, "asrc");
+    if (!gst_ghost_pad_set_target (GST_GHOST_PAD (bin_ghost_pad),
+            decoder_src_pad)) {
+      g_printerr ("Failed to link decoder src pad to source bin ghost pad\n");
+    }
+    gst_object_unref (bin_ghost_pad);
+    g_print("Audio pad linked\n");
   }
 }
 
@@ -279,7 +286,13 @@ create_source_bin (guint index, gchar * uri)
    * now. Once the decode bin creates the video decoder and generates the
    * cb_newpad callback, we will set the ghost pad target to the video decoder
    * src pad. */
-  if (!gst_element_add_pad (bin, gst_ghost_pad_new_no_target ("src",
+  if (!gst_element_add_pad (bin, gst_ghost_pad_new_no_target ("vsrc",
+              GST_PAD_SRC))) {
+    g_printerr ("Failed to add ghost pad in source bin\n");
+    return NULL;
+  }
+
+  if (!gst_element_add_pad (bin, gst_ghost_pad_new_no_target ("asrc",
               GST_PAD_SRC))) {
     g_printerr ("Failed to add ghost pad in source bin\n");
     return NULL;
@@ -360,6 +373,9 @@ main (int argc, char *argv[])
       num_sources = argc - 1;
   }
 
+  GstElement *audio_queue = gst_element_factory_make ("queue", "audio-queue1");
+  gst_bin_add (GST_BIN (pipeline), audio_queue);
+
   for (i = 0; i < num_sources; i++) {
     GstPad *sinkpad, *srcpad;
     gchar pad_name[16] = { };
@@ -385,7 +401,7 @@ main (int argc, char *argv[])
       return -1;
     }
 
-    srcpad = gst_element_get_static_pad (source_bin, "src");
+    srcpad = gst_element_get_static_pad (source_bin, "vsrc");
     if (!srcpad) {
       g_printerr ("Failed to get src pad of source bin. Exiting.\n");
       return -1;
@@ -399,6 +415,21 @@ main (int argc, char *argv[])
     gst_object_unref (srcpad);
     gst_object_unref (sinkpad);
 
+    GstPad *asrcpad = gst_element_get_static_pad (source_bin, "asrc");
+    if (!asrcpad) {
+      g_printerr ("Failed to get src pad of source bin. Exiting.\n");
+      return -1;
+    }
+
+    GstPad *aqsinkpad = gst_element_get_static_pad (audio_queue, "sink");
+    if (gst_pad_link (asrcpad, aqsinkpad) != GST_PAD_LINK_OK) {
+      g_printerr ("Failed to link source bin to stream muxer. Exiting.\n");
+      return -1;
+    }
+
+    gst_object_unref (asrcpad);
+    gst_object_unref (aqsinkpad);
+
     if (yaml_config) {
       src_list = src_list->next;
     }
@@ -415,6 +446,8 @@ main (int argc, char *argv[])
     pgie = gst_element_factory_make ("nvinfer", "primary-nvinference-engine");
   }
 
+  GstElement *audio_fakesink = gst_element_factory_make ("fakesink", "audio-fakesink");
+
   /* Add queue elements between every two elements */
   queue1 = gst_element_factory_make ("queue", "queue1");
   queue2 = gst_element_factory_make ("queue", "queue2");
@@ -445,7 +478,8 @@ main (int argc, char *argv[])
 #ifdef __aarch64__
       sink = gst_element_factory_make ("nv3dsink", "nvvideo-renderer");
 #else
-      sink = gst_element_factory_make ("nveglglessink", "nvvideo-renderer");
+      //sink = gst_element_factory_make ("nveglglessink", "nvvideo-renderer");
+      sink = gst_element_factory_make ("fakesink", "nvvideo-renderer");
 #endif
     }
   }
@@ -540,7 +574,7 @@ main (int argc, char *argv[])
   /* Set up the pipeline */
   /* we add all elements into the pipeline */
   gst_bin_add_many (GST_BIN (pipeline), queue1, pgie, queue2, nvdslogger, tiler,
-      queue3, nvvidconv, queue4, nvosd, queue5, sink, NULL);
+      queue3, nvvidconv, queue4, nvosd, queue5, sink, audio_fakesink, NULL);
   /* we link the elements together
   * nvstreammux -> nvinfer -> nvdslogger -> nvtiler -> nvvidconv -> nvosd
   * -> video-renderer */
@@ -550,6 +584,11 @@ main (int argc, char *argv[])
     return -1;
   }
 
+  if (!gst_element_link_many (audio_queue, audio_fakesink, NULL)) {
+    g_printerr ("Elements could not be linked 2. Exiting.\n");
+    return -1;
+  }
+
   /* Lets add probe to get informed of the meta data generated, we add probe to
    * the sink pad of the osd element, since by that time, the buffer would have
    * had got all the metadata. */


Thank you for this. I can run it successfully.

But if I change to uri_decode_bin = gst_element_factory_make ("uridecodebin3", "uri-decode-bin");, I have to remove the if (gst_caps_features_contains (features, GST_CAPS_FEATURES_NVMM)) check in cb_newpad.
It also seems that uridecodebin3 consumes more GPU (GPU usage) than uridecodebin.

This is an issue on the GStreamer side. Whether you use uridecodebin or uridecodebin3, decoding is still performed by nvv4l2decoder.