Creating custom plugin in Deepstream

Hello @CJR, sorry for causing trouble, but I have one question. I simply have to connect the extra elements to the other elements, so I don't have to use a bin; I just have to link all the elements like:

pgie->nvtracker->nvdsanalytics->tiler->nvvidconv->nvosd->nvvideoconvert->caps filter(x/raw)->encoder->codecparse->mux->filesink

Please correct me if I am wrong.

You're right.

Thank you for your confirmation. I will try it right away. One more thing: do we have to use nvvidconv both before and after nvosd?

Yes, we do. OSD needs the input to be in RGBA format while the encoders accept I420/NV12 formats.
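So the relevant portion of the chain looks roughly like this (a sketch, with the format each converter negotiates shown in parentheses; exact caps depend on the platform and encoder):

```
... -> nvvidconv (NV12 -> RGBA) -> nvosd -> nvvidconv1 (RGBA -> I420/NV12) -> encoder -> ...
```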

Hello @CJR,
I tried creating the pipeline as you suggested. I was able to resolve some errors, but the remaining ones are difficult to understand. I am posting the error and the code.

My error is -

(deepstream-nvdsanalytics-test:203): GStreamer-WARNING **: 16:08:13.115: Name ‘nvvideo-converter’ is not unique in bin ‘nvdsanalytics-test-pipeline’, not adding

(deepstream-nvdsanalytics-test:203): GStreamer-CRITICAL **: 16:08:13.116: gst_element_link_pads_full: assertion ‘GST_IS_ELEMENT (dest)’ failed
Elements could not be linked. Exiting.

My code is-

    #include <gst/gst.h>
    #include <glib.h>
    #include <stdio.h>
    #include <math.h>
    #include <string.h>
    #include <sys/time.h>
    #include <iostream>
    #include <vector>
    #include <unordered_map>
    #include "gstnvdsmeta.h"
    #include "nvds_analytics_meta.h"
    #include "deepstream_config.h"
    #ifndef PLATFORM_TEGRA
    #include "gst-nvmessage.h"
    #endif

    [....]

    int
    main (int argc, char *argv[])
    {
      GMainLoop *loop = NULL;
      GstElement *pipeline = NULL, *streammux = NULL, *sink = NULL, *pgie = NULL,
                 *nvtracker = NULL, *nvdsanalytics = NULL,
          *nvvidconv = NULL, *nvosd = NULL, *nvvidconv1 = NULL, *transform1 = NULL, *cap_filter = NULL, *encoder = NULL, *codecparse = NULL, *mux = NULL, *tiler = NULL;
      GstCaps *caps = NULL;

    #ifdef PLATFORM_TEGRA
      GstElement *transform = NULL;
    #endif
      GstBus *bus = NULL;
      guint bus_watch_id;
      GstPad *nvdsanalytics_src_pad = NULL;
      guint i, num_sources;
      guint tiler_rows, tiler_columns;
      guint pgie_batch_size;
      gulong bitrate = 2000000;
      guint profile = 0;

      /* Check input arguments */
      if (argc < 2) {
        g_printerr ("Usage: %s <uri1> [uri2] ... [uriN] \n", argv[0]);
        return -1;
      }
      num_sources = argc - 1;

      /* Standard GStreamer initialization */
      gst_init (&argc, &argv);
      loop = g_main_loop_new (NULL, FALSE);

      /* Create gstreamer elements */
      /* Create Pipeline element that will form a connection of other elements */
      pipeline = gst_pipeline_new ("nvdsanalytics-test-pipeline");

      /* Create nvstreammux instance to form batches from one or more sources. */
      streammux = gst_element_factory_make ("nvstreammux", "stream-muxer");

      if (!pipeline || !streammux) {
        g_printerr ("One element could not be created. Exiting.\n");
        return -1;
      }
      gst_bin_add (GST_BIN (pipeline), streammux);

      for (i = 0; i < num_sources; i++) {
        GstPad *sinkpad, *srcpad;
        gchar pad_name[16] = { };
        GstElement *source_bin = create_source_bin (i, argv[i + 1]);

        if (!source_bin) {
          g_printerr ("Failed to create source bin. Exiting.\n");
          return -1;
        }

        gst_bin_add (GST_BIN (pipeline), source_bin);

        g_snprintf (pad_name, 15, "sink_%u", i);
        sinkpad = gst_element_get_request_pad (streammux, pad_name);
        if (!sinkpad) {
          g_printerr ("Streammux request sink pad failed. Exiting.\n");
          return -1;
        }

        srcpad = gst_element_get_static_pad (source_bin, "src");
        if (!srcpad) {
          g_printerr ("Failed to get src pad of source bin. Exiting.\n");
          return -1;
        }

        if (gst_pad_link (srcpad, sinkpad) != GST_PAD_LINK_OK) {
          g_printerr ("Failed to link source bin to stream muxer. Exiting.\n");
          return -1;
        }

        gst_object_unref (srcpad);
        gst_object_unref (sinkpad);
      }

      /* Use nvinfer to infer on batched frame. */
      pgie = gst_element_factory_make ("nvinfer", "primary-nvinference-engine");

      /* Use nvtracker to track detections on batched frame. */
      nvtracker = gst_element_factory_make ("nvtracker", "nvtracker");

      /* Use nvdsanalytics to perform analytics on object */
      nvdsanalytics = gst_element_factory_make ("nvdsanalytics", "nvdsanalytics");

      /* Use nvtiler to composite the batched frames into a 2D tiled array based
       * on the source of the frames. */
      tiler = gst_element_factory_make ("nvmultistreamtiler", "nvtiler");

      /* Use convertor to convert from NV12 to RGBA as required by nvosd */
      nvvidconv = gst_element_factory_make ("nvvideoconvert", "nvvideo-converter");
      if (!nvvidconv) {
        g_printerr ("nvvidconv element could not be created. Exiting.\n");
      }

      /* Create OSD to draw on the converted RGBA buffer */
      nvosd = gst_element_factory_make ("nvdsosd", "nv-onscreendisplay");
      if (!nvosd) {
        g_printerr ("nvosd element could not be created. Exiting.\n");
      }

      /* converter to convert RGBA to NV12 */
      nvvidconv1 = gst_element_factory_make ("nvvideoconvert", "nvvideo-converter1");
      if (!nvvidconv1) {
        g_printerr ("nvvidconv1 element could not be created. Exiting.\n");
      }
      /* create cap_filter */
      cap_filter = gst_element_factory_make (NVDS_ELEM_CAPS_FILTER, "cap_filter");
      if (!cap_filter) {
        g_printerr ("cap_filter element could not be created. Exiting.\n");
      }

      /* create cap for filter */
      caps = gst_caps_from_string ("video/x-raw, format=I420");
      g_object_set (G_OBJECT (cap_filter), "caps", caps, NULL);

      /* create encoder */
      encoder = gst_element_factory_make (NVDS_ELEM_ENC_H264_HW, "encoder");
      if (!encoder) {
        g_printerr ("encoder element could not be created. Exiting.\n");
      }

      /* create transform1 */
      transform1 = gst_element_factory_make (NVDS_ELEM_VIDEO_CONV, "transform1");
      g_object_set (G_OBJECT (transform1), "gpu-id", 0, NULL);
      if (!transform1) {
        g_printerr ("transform1 element could not be created. Exiting.\n");
      }

      #ifdef IS_TEGRA
        g_object_set (G_OBJECT (encoder), "bufapi-version", 1, NULL);
      #endif

      g_object_set (G_OBJECT (encoder), "profile", profile, NULL);
      g_object_set (G_OBJECT (encoder), "bitrate", bitrate, NULL);

      /* create codecparse */
      codecparse = gst_element_factory_make ("h264parse", "h264-parser");
      if (!codecparse) {
        g_printerr ("codecparse element could not be created. Exiting.\n");
      }
      /* create mux */
      mux = gst_element_factory_make (NVDS_ELEM_MUX_MP4, "mux");
      if (!mux) {
        g_printerr ("mux element could not be created. Exiting.\n");
      }

      /* create sink */
      sink = gst_element_factory_make (NVDS_ELEM_SINK_FILE, "filesink");
      if (!sink) {
        g_printerr ("sink element could not be created. Exiting.\n");
      }
      g_object_set (G_OBJECT (sink), "location", "capture.mp4", "sync", 0, "async" , FALSE, NULL);

    //   /* Finally render the osd output */
    #ifdef PLATFORM_TEGRA
      transform = gst_element_factory_make ("nvegltransform", "nvegl-transform");
    #endif
    //   sink = gst_element_factory_make (NVDS_ELEM_SINK_FILE, "filesink");
    //   g_object_set (G_OBJECT (sink), "location", "capture.mp4", "sync", 0, "async" , FALSE, NULL);

      if (!pgie || !nvtracker || !nvdsanalytics || !nvvidconv ||
          !nvosd || !nvvidconv1 || !cap_filter || !encoder || !codecparse || !mux || !sink) {
        g_printerr ("One element could not be created. Exiting.\n");
        return -1;
      }

    #ifdef PLATFORM_TEGRA
      if(!transform) {
        g_printerr ("One tegra element could not be created. Exiting.\n");
        return -1;
      }
    #endif

      g_object_set (G_OBJECT (streammux), "width", MUXER_OUTPUT_WIDTH, "height",
          MUXER_OUTPUT_HEIGHT, "batch-size", num_sources,
          "batched-push-timeout", MUXER_BATCH_TIMEOUT_USEC, NULL);

      /* Configure the nvinfer element using the nvinfer config file. */
      g_object_set (G_OBJECT (pgie),
          "config-file-path", "nvdsanalytics_pgie_config.txt", NULL);

      /* Configure the nvtracker element for using the particular tracker algorithm. */
      g_object_set (G_OBJECT (nvtracker),
          "ll-lib-file", "/opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_nvdcf.so",
          "ll-config-file", "tracker_config.yml", "tracker-width", 640, "tracker-height", 480,
           NULL);

      /* Configure the nvdsanalytics element for using the particular analytics config file*/
      g_object_set (G_OBJECT (nvdsanalytics),
          "config-file", "config_nvdsanalytics.txt",
           NULL);

      /* Override the batch-size set in the config file with the number of sources. */
      g_object_get (G_OBJECT (pgie), "batch-size", &pgie_batch_size, NULL);
      if (pgie_batch_size != num_sources) {
        g_printerr
            ("WARNING: Overriding infer-config batch-size (%d) with number of sources (%d)\n",
            pgie_batch_size, num_sources);
        g_object_set (G_OBJECT (pgie), "batch-size", num_sources, NULL);
      }

      tiler_rows = (guint) sqrt (num_sources);
      tiler_columns = (guint) ceil (1.0 * num_sources / tiler_rows);
      /* we set the tiler properties here */
      g_object_set (G_OBJECT (tiler), "rows", tiler_rows, "columns", tiler_columns,
          "width", TILED_OUTPUT_WIDTH, "height", TILED_OUTPUT_HEIGHT, NULL);

      /* we add a message handler */
      bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
      bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
      gst_object_unref (bus);

      /* Set up the pipeline */
      /* we add all elements into the pipeline */
    #ifdef PLATFORM_TEGRA
      gst_bin_add_many (GST_BIN (pipeline), pgie, nvtracker, nvdsanalytics ,
              nvvidconv, nvosd, nvvidconv1, cap_filter, encoder, codecparse, mux, sink,
          NULL);

      /* we link the elements together:
       * nvstreammux -> nvinfer -> nvtracker -> nvdsanalytics ->
       * nvvideoconvert -> nvosd -> nvvideoconvert -> caps filter ->
       * encoder -> parser -> muxer -> filesink
       */
      if (!gst_element_link_many (streammux, pgie, nvtracker, nvdsanalytics,
                                  nvvidconv, nvosd, nvvidconv1, cap_filter, encoder, codecparse, mux, sink, NULL)) {
        g_printerr ("Elements could not be linked. Exiting.\n");
        return -1;
      }
    #else
      gst_bin_add_many (GST_BIN (pipeline), pgie, nvtracker, nvdsanalytics,
                        nvvidconv, nvosd, nvvidconv1, cap_filter, encoder, codecparse, mux, sink, NULL);
      /* we link the elements together:
       * nvstreammux -> nvinfer -> nvtracker -> nvdsanalytics ->
       * nvvideoconvert -> nvosd -> nvvideoconvert -> caps filter ->
       * encoder -> parser -> muxer -> filesink
       */
      if (!gst_element_link_many (streammux, pgie, nvtracker, nvdsanalytics,
          nvvidconv, nvosd, nvvidconv1, cap_filter, encoder, codecparse, mux, sink, NULL)) {
        g_printerr ("Elements could not be linked. Exiting.\n");
        return -1;
      }
    #endif

      /* Let's add a probe to get informed of the generated metadata. We add the
       * probe to the src pad of the nvdsanalytics element, since by that time
       * the buffer would have all the analytics metadata attached.
       */
      nvdsanalytics_src_pad = gst_element_get_static_pad (nvdsanalytics, "src");
      if (!nvdsanalytics_src_pad)
        g_print ("Unable to get src pad\n");
      else
        gst_pad_add_probe (nvdsanalytics_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
            nvdsanalytics_src_pad_buffer_probe, NULL, NULL);

      /* Set the pipeline to "playing" state */
      g_print ("Now playing:");
      for (i = 0; i < num_sources; i++) {
        g_print (" %s,", argv[i + 1]);
      }
      g_print ("\n");
      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Wait till pipeline encounters an error or EOS */
      g_print ("Running...\n");
      g_main_loop_run (loop);

      /* Out of the main loop, clean up nicely */
      g_print ("Returned, stopping playback\n");
      gst_element_set_state (pipeline, GST_STATE_NULL);
      g_print ("Deleting pipeline\n");
      gst_object_unref (GST_OBJECT (pipeline));
      g_source_remove (bus_watch_id);
      g_main_loop_unref (loop);
      return 0;
    }

Please have a look.
Thanks in advance

Hello @CJR,
I have seen one post on the DeepStream forum about adding a filesink in the deepstream-test1 app. The link is:

Encoding and saving to a file with deepstream_test1_app.c

The solution pipeline suggested is -

source->h264parser->decoder->pgie->filter1->nvvidconv->filter2->nvosd->nvvidconv1->filter3->videoconvert->filter4->x264enc->qtmux->filesink

The above post does not mention which platform they are using. I don't know whether this solution will work for me, because their pipeline does not have the nvstreammux plugin that is present in my deepstream-test1 app.

Do you think I can use the above pipeline, possibly with some changes, or is it platform dependent? Please let me know.

Hello @CJR,
I tried the above method but it is not working. I think the given solution will not work for my use case.
Please can you help me solve the error I have posted along with the code.

Hi,

There is a small correction to the pipeline: the element after the OSD should be videoconvert, not nvvideoconvert.

(deepstream-nvdsanalytics-test:203): GStreamer-WARNING **: 16:08:13.115: Name ‘nvvideo-converter’ is not unique in bin ‘nvdsanalytics-test-pipeline’, not adding

There are two elements with the same name “nvvideo-converter” but looking at your code it seems like you have rectified this already.

Hello @CJR,
One more thing I want to add: when I tried with debug level 4, I found that the error is that the src pad of the caps filter is not able to link with the sink pad of the encoder. It showed that the caps are incompatible.
Will changing to videoconvert solve this problem? I will try and see if the error persists.

Hello @CJR,
I tried and still got one error.

0:00:00.175920502   952 0x560bb402b400 INFO        GST_ELEMENT_PADS gstutils.c:1774:gst_element_link_pads_full: trying to link element video-converter1:(any) to element cap_filter:(any)
0:00:00.175930479   952 0x560bb402b400 INFO                GST_PADS gstutils.c:1035:gst_pad_check_link: trying to link video-converter1:src and cap_filter:sink
0:00:00.176020543   952 0x560bb402b400 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<cap_filter:src> pad has no peer
0:00:00.176117780   952 0x560bb402b400 INFO        GST_ELEMENT_PADS gstelement.c:920:gst_element_get_static_pad: found pad cap_filter:sink
0:00:00.176131378   952 0x560bb402b400 INFO                GST_PADS gstutils.c:1588:prepare_link_maybe_ghosting: video-converter1 and cap_filter in same bin, no need for ghost pads
0:00:00.176141603   952 0x560bb402b400 INFO                GST_PADS gstpad.c:2378:gst_pad_link_prepare: trying to link video-converter1:src and cap_filter:sink
0:00:00.176225993   952 0x560bb402b400 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<cap_filter:src> pad has no peer
0:00:00.176235446   952 0x560bb402b400 INFO                GST_PADS gstpad.c:2434:gst_pad_link_prepare: caps are incompatible
0:00:00.176244790   952 0x560bb402b400 INFO                GST_PADS gstpad.c:2529:gst_pad_link_full: link between video-converter1:src and cap_filter:sink failed: no common format
0:00:00.176256796   952 0x560bb402b400 INFO                GST_PADS gstutils.c:1035:gst_pad_check_link: trying to link video-converter1:src and cap_filter:sink
0:00:00.176265166   952 0x560bb402b400 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<cap_filter:src> pad has no peer
0:00:00.176349735   952 0x560bb402b400 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<cap_filter:src> pad has no peer
0:00:00.176366136   952 0x560bb402b400 INFO        GST_ELEMENT_PADS gstelement.c:920:gst_element_get_static_pad: found pad video-converter1:src
0:00:00.176375808   952 0x560bb402b400 INFO                GST_PADS gstutils.c:1588:prepare_link_maybe_ghosting: video-converter1 and cap_filter in same bin, no need for ghost pads
0:00:00.176385473   952 0x560bb402b400 INFO                GST_PADS gstpad.c:2378:gst_pad_link_prepare: trying to link video-converter1:src and cap_filter:sink
0:00:00.176468577   952 0x560bb402b400 INFO                GST_PADS gstpad.c:4232:gst_pad_peer_query:<cap_filter:src> pad has no peer
0:00:00.176477949   952 0x560bb402b400 INFO                GST_PADS gstpad.c:2434:gst_pad_link_prepare: caps are incompatible
0:00:00.176485958   952 0x560bb402b400 INFO                GST_PADS gstpad.c:2529:gst_pad_link_full: link between video-converter1:src and cap_filter:sink failed: no common format
Elements could not be linked. Exiting.

First the caps filter and encoder were not linking, and now videoconvert and the caps filter are not linking. Am I missing something? I changed "nvvideoconvert" to "videoconvert" after the OSD.

What's the type of encoder you are using? If it's either nvv4l2h265enc or nvv4l2h264enc, then you will need to use nvvideoconvert, since these encoders work on NVMM buffers. If you are using a different encoder, then you will need to use the videoconvert plugin and set the cap_filter to match the required format type.
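To make the two cases concrete, the tail of the pipeline would differ roughly as follows (a sketch; exact caps depend on the platform):

```
HW encoder: nvosd -> nvvideoconvert -> caps filter (video/x-raw(memory:NVMM), format=I420) -> nvv4l2h264enc -> ...
SW encoder: nvosd -> nvvideoconvert -> videoconvert -> caps filter (video/x-raw, format=I420) -> x264enc -> ...
```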

You can check the capabilities of each plugin by running `gst-inspect-1.0 <plugin>` and looking at the format types supported on the sink and src pads.

I think I am using the H.264 hardware encoder, so do I have to change "video/x-raw" to "video/x-raw(memory:NVMM)" in

caps = gst_caps_from_string ("video/x-raw, format=I420");

and change videoconvert back to nvvideoconvert?

Sorry for such inconvenience.

Hello @CJR,
Now I am able to save the video, and all the data can be seen in it.
Thank you very much for solving this problem of mine. Before posting this issue I did not have much idea about DeepStream, but now I have a good understanding.
And sorry for causing trouble.

Yes, that's right. In the code snippet you have shared above, I don't see NVDS_ELEM_ENC_H264_HW being defined anywhere. So if it points to either nvv4l2h265enc or nvv4l2h264enc, then use "video/x-raw(memory:NVMM), format=I420" for your caps filter.
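Concretely, assuming NVDS_ELEM_ENC_H264_HW does map to one of those hardware encoders, the one-line change in the code above would be:

```
caps = gst_caps_from_string ("video/x-raw(memory:NVMM), format=I420");
```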

Basically, you should make sure the capabilities of the src pad of the first plugin match the sink pad capabilities of the second plugin, where the data flows from the first plugin to the second one. You can read more about GStreamer plugin capabilities over here.

Hello @CJR, I came across some questions -

  1. If I try to run this pipeline with multiple streams, which elements have to be defined per stream and which can handle multiple streams? I think the sink and the encoding part have to be per stream, while the others can handle multiple streams.
    Please can you tell me which code I can refer to for running multiple streams.

  2. As far as I know about the nvdsanalytics plugin, it can count +1 when someone crosses a line, but it can't subtract when someone moves in the opposite direction. For this I thought of creating two lines with the same coordinates but opposite directions, and then creating a simple custom plugin that can parse the buffer from the nvdsanalytics src pad and do the subtraction. Please can you suggest an alternative solution, or something to help create such a plugin.

Thanks in advance.

Hi,

  1. Please refer to the deepstream-app common sources, where these features have been implemented. You can refer to deepstream_sink_bin.c and deepstream_source_bin.c specifically.

  2. Yes, you're right. But you don't need a plugin to aggregate the information from the analytics plugin. You can do that in a probe attached downstream of the analytics plugin.

Hello @CJR,
I am thinking of adding the nvdsanalytics plugin to the deepstream-app common sources. I want to ask whether I can add a probe for printing and modifying the nvdsanalytics buffer data, similar to what is implemented in deepstream-nvdsanalytics-test.

If yes, where can I add the code for attaching the probe?

Thank you in advance.

You can add it to the src pad of the analytics plugin so that the analytics metadata is available in the GstBuffer for you to aggregate it.

Thank you @CJR. I think there is a bit of a misunderstanding: I want to know in which file I have to add the probe, i.e., in "deepstream_dsanalytics.c" (a file I will add by referencing dsexample in /apps-common/src) or in "deepstream_osd_bin.c".

I ask because I didn't see any probe attached in the source files in /apps-common/src.

The files in /apps-common/src are the shared code between multiple apps that use the same components for building the pipeline. You can take a look at the deepstream_app.c file, which already has a few implementations of probes.