gstnvdsmeta Not Working - Cannot attach metadata at appsrc level

I am trying to attach metadata at the appsrc so that I can retrieve it at the appsink, based on deepstream-gst-metadata-test. The pipeline:
appsrc->jpegparse->queue->nvv4l2decoder->nvstreammux->nvinfer->appsink

How the metadata is added:

Napi::Buffer<char> buffer = info[0].As<Napi::Buffer<char>>();
char *frame = buffer.Data();
size_t frameLength = buffer.Length();

GstBuffer *gst_buffer = gst_buffer_new_allocate(nullptr, frameLength, nullptr);
gst_buffer_fill(gst_buffer, 0, frame, frameLength);

NvDecoderMeta *decoder_meta = (NvDecoderMeta *) g_malloc0(sizeof(NvDecoderMeta));
decoder_meta->camname = "test";
decoder_meta->frameIndex = 0;

NvDsMeta *meta = gst_buffer_add_nvds_meta(gst_buffer, decoder_meta, NULL,
    decoder_meta_copy_func, decoder_meta_release_func);

meta->meta_type = (GstNvDsMetaType) NVDS_DECODER_GST_META_EXAMPLE;
meta->gst_to_nvds_meta_transform_func = decoder_gst_to_nvds_meta_transform_func;
meta->gst_to_nvds_meta_release_func = decoder_gst_nvds_meta_release_func;

GstFlowReturn ret;
g_signal_emit_by_name(myPipe->rtspPipeline.source, "push-buffer", gst_buffer, &ret);

How I try to get the data:

RtspPipeline *_rtspPipeline = (RtspPipeline *) u_data;
GstBuffer *buffer = (GstBuffer *) info->data;
NvDsObjectMeta *obj_meta = NULL;
NvDsMetaList *l_frame = NULL;
NvDsMetaList *l_obj = NULL;
NvDsDisplayMeta *display_meta = NULL;
std::vector<DetectionResults> detResults;
NvDsUserMeta *user_meta = NULL;
NvDecoderMeta *decoder_meta = NULL;

NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta(buffer);
for (l_frame = batch_meta->frame_meta_list; l_frame != NULL; l_frame = l_frame->next) {
  NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
  int offset = 0;
  for (l_obj = frame_meta->obj_meta_list; l_obj != NULL; l_obj = l_obj->next) {
    user_meta = (NvDsUserMeta *) l_obj->data;
    if (user_meta->base_meta.meta_type == NVDS_DECODER_GST_META_EXAMPLE) { // we never get in here!!!
      std::cout << " **We found our metadata!!!" << std::endl;
      //decoder_meta = (NvDecoderMeta *)user_meta->user_meta_data;
    }
    DetectionResults res;
    obj_meta = (NvDsObjectMeta *) l_obj->data;
    res.classid = obj_meta->class_id;
    res.confidence = obj_meta->confidence; // modify gst plugin to get probability
    res.left = obj_meta->rect_params.left;
    res.top = obj_meta->rect_params.top;
    res.width = obj_meta->rect_params.width;
    res.height = obj_meta->rect_params.height;
    res.border_width = obj_meta->rect_params.border_width;
    detResults.push_back(res);
  }
}
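For comparison, the nvinfer probe in deepstream-gst-metadata-test reads the transformed metadata from the frame-level user-meta list (frame_meta->frame_user_meta_list) rather than from obj_meta_list. A sketch of that retrieval pattern, assuming my own NvDecoderMeta fields (camname, frameIndex), would look like this:

```cpp
// Sketch only: the transformed gst meta is expected at frame level,
// in frame_user_meta_list, not mixed into the object-meta list.
for (NvDsMetaList *l_user = frame_meta->frame_user_meta_list; l_user != NULL;
     l_user = l_user->next) {
  NvDsUserMeta *um = (NvDsUserMeta *) l_user->data;
  if (um->base_meta.meta_type == NVDS_DECODER_GST_META_EXAMPLE) {
    NvDecoderMeta *dm = (NvDecoderMeta *) um->user_meta_data;
    // dm->camname and dm->frameIndex would be readable here,
    // if the meta survived the pipeline.
  }
}
```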

Expected behavior

I should get the metadata when this line of code evaluates to true:

if (user_meta->base_meta.meta_type == NVDS_DECODER_GST_META_EXAMPLE) {
  // ...
}

I also tried to use

gst_buffer_get_nvds_meta

but it seems like it is declared in the header file while missing from the libs.

Any help or advice will be appreciated.


Hi,
It is not clear what the issue is. Is it possible to share test code or a patch on an existing sample to reproduce it?

Hi,

In deepstream-gst-metadata-test, if you change the decoder probe from the src pad to the sink pad you will see what I mean: the metadata is lost for some reason by the time it reaches nvinfer_src_pad_buffer_probe.

i.e. from

decoder_src_pad = gst_element_get_static_pad (decoder, "src");
  if (!decoder_src_pad)
    g_print ("Unable to get source pad\n");
  else
    gst_pad_add_probe (decoder_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
        nvdecoder_src_pad_buffer_probe, NULL, NULL);

to

decoder_src_pad = gst_element_get_static_pad (decoder, "sink");
  if (!decoder_src_pad)
    g_print ("Unable to get source pad\n");
  else
    gst_pad_add_probe (decoder_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
        nvdecoder_src_pad_buffer_probe, NULL, NULL);


Full code:

/*
 * Copyright (c) 2019, NVIDIA CORPORATION. All rights reserved.
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
 * THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
 * DEALINGS IN THE SOFTWARE.
 */

#include <gst/gst.h>
#include <glib.h>
#include <stdio.h>
#include <string.h>
#include "stdlib.h"
#include "gstnvdsmeta.h"

#define PGIE_CLASS_ID_VEHICLE 0
#define PGIE_CLASS_ID_PERSON 2

/** set the user metadata type */
#define NVDS_DECODER_GST_META_EXAMPLE (nvds_get_user_meta_type("NVIDIA.DECODER.GST_USER_META"))

/* The muxer output resolution must be set if the input streams will be of
 * different resolution. The muxer will scale all the input frames to this
 * resolution. */
#define MUXER_OUTPUT_WIDTH 1920
#define MUXER_OUTPUT_HEIGHT 1080

/* Muxer batch formation timeout, for e.g. 40 millisec. Should ideally be set
 * based on the fastest source's framerate. */
#define MUXER_BATCH_TIMEOUT_USEC 4000000

gint frame_number = 0;
gchar pgie_classes_str[4][32] = { "Vehicle", "TwoWheeler", "Person",
  "Roadsign"
};

typedef struct _NvDecoderMeta
{
  guint frame_type;
  guint frame_num;
  gboolean dec_err;
} NvDecoderMeta;

/* gst meta copy function set by user */
static gpointer decoder_meta_copy_func(gpointer data, gpointer user_data)
{
  NvDecoderMeta *src_decoder_meta = (NvDecoderMeta *)data;
  NvDecoderMeta *dst_decoder_meta = (NvDecoderMeta*)g_malloc0(
      sizeof(NvDecoderMeta));
  memcpy(dst_decoder_meta, src_decoder_meta, sizeof(NvDecoderMeta));
  return (gpointer)dst_decoder_meta;
}

/* gst meta release function set by user */
static void decoder_meta_release_func(gpointer data, gpointer user_data)
{
  NvDecoderMeta *decoder_meta = (NvDecoderMeta *)data;
  if(decoder_meta) {
    g_free(decoder_meta);
    decoder_meta = NULL;
  }
}

/* gst to nvds transform function set by user. "data" holds a pointer to NvDsUserMeta */
static gpointer decoder_gst_to_nvds_meta_transform_func(gpointer data, gpointer user_data)
{
  NvDsUserMeta *user_meta = (NvDsUserMeta *)data;
  NvDecoderMeta *src_decoder_meta =
    (NvDecoderMeta*)user_meta->user_meta_data;
  NvDecoderMeta *dst_decoder_meta =
    (NvDecoderMeta *)decoder_meta_copy_func(src_decoder_meta, NULL);
  return (gpointer)dst_decoder_meta;
}

/* release function set by user to release gst to nvds transformed metadata.
 * "data" holds a pointer to NvDsUserMeta */
static void decoder_gst_nvds_meta_release_func(gpointer data, gpointer user_data)
{
  NvDsUserMeta *user_meta = (NvDsUserMeta *) data;
  NvDecoderMeta *decoder_meta = (NvDecoderMeta *)user_meta->user_meta_data;
  decoder_meta_release_func(decoder_meta, NULL);
}

/* nvinfer_src_pad_buffer_probe() will extract the metadata received on nvinfer
 * src pad.
 * It explains the mechanism to extract the decoder metadata (which is attached
 * using gstnvdsmeta API's in nvdecoder_src_pad_buffer_probe()),
 * now transformed into nvdsmeta. Decoder meta, attached to gst buffer
 * is set as user data at NvDsFrameMeta level
 */

static GstPadProbeReturn
nvinfer_src_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info,
    gpointer u_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsMetaList * l_frame = NULL;
  NvDsUserMeta *user_meta = NULL;
  NvDecoderMeta * decoder_meta = NULL;
  NvDsMetaList * l_user_meta = NULL;

  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);

    for (l_frame = batch_meta->frame_meta_list; l_frame != NULL;
      l_frame = l_frame->next) {
        NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) (l_frame->data);

        for (l_user_meta = frame_meta->frame_user_meta_list; l_user_meta != NULL;
            l_user_meta = l_user_meta->next) {
          user_meta = (NvDsUserMeta *) (l_user_meta->data);
          if(user_meta->base_meta.meta_type == NVDS_DECODER_GST_META_EXAMPLE)
          {
            decoder_meta = (NvDecoderMeta *)user_meta->user_meta_data;
            g_print("Dec Meta retrieved as NVDS USER METADTA For Frame_Num = %d  \n",
                decoder_meta->frame_num);
            g_print("frame type = %d, frame_num = %d decode_error_status = %d\n\n",
                decoder_meta->frame_type, decoder_meta->frame_num,
                decoder_meta->dec_err);
          }
        }
    }
    return GST_PAD_PROBE_OK;
}

/* nvdecoder_src_pad_buffer_probe() will attach decoder metadata to the gstreamer
 * buffer on the src pad. The decoder cannot attach NvDsBatchMeta metadata because
 * batch-level metadata is created by the nvstreammux component, and the decoder
 * sits upstream of nvstreammux. So it attaches the metadata using the
 * gstnvdsmeta APIs.
 */
static GstPadProbeReturn
nvdecoder_src_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info,
    gpointer u_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsMeta *meta = NULL;

  NvDecoderMeta *decoder_meta = (NvDecoderMeta *)g_malloc0(sizeof(NvDecoderMeta));
  if(decoder_meta == NULL)
  {
    /* a GstPadProbeReturn is expected here, not a GstFlowReturn */
    return GST_PAD_PROBE_OK;
  }
  /* Add dummy metadata */
  decoder_meta->frame_type = 9;
  decoder_meta->frame_num = 11;
  decoder_meta->dec_err = ((frame_number % 4) / 3);

  /* Attach decoder metadata to gst buffer using gst_buffer_add_nvds_meta() */
  meta = gst_buffer_add_nvds_meta (buf, decoder_meta, NULL,
      decoder_meta_copy_func, decoder_meta_release_func);

  /* Set metadata type */
  meta->meta_type = (GstNvDsMetaType)NVDS_DECODER_GST_META_EXAMPLE;

  /* Set transform function to transform decoder metadata from Gst meta to
   * nvds meta */
  meta->gst_to_nvds_meta_transform_func = decoder_gst_to_nvds_meta_transform_func;

  /* Set release function to release the transformed nvds metadata */
  meta->gst_to_nvds_meta_release_func = decoder_gst_nvds_meta_release_func;

  g_print("GST Dec Meta attached with gst decoder output buffer for Frame_Num = %d\n",
      decoder_meta->frame_num);
  g_print("frame type = %d, frame_num = %d decode_error_status = %d\n\n",
      decoder_meta->frame_type, decoder_meta->frame_num,
      decoder_meta->dec_err);

  return GST_PAD_PROBE_OK;
}

static gboolean
bus_call (GstBus * bus, GstMessage * msg, gpointer data)
{
  GMainLoop *loop = (GMainLoop *) data;
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_EOS:
      g_print ("End of stream\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_ERROR:{
      gchar *debug;
      GError *error;
      gst_message_parse_error (msg, &error, &debug);
      g_printerr ("ERROR from element %s: %s\n",
          GST_OBJECT_NAME (msg->src), error->message);
      if (debug)
        g_printerr ("Error details: %s\n", debug);
      g_free (debug);
      g_error_free (error);
      g_main_loop_quit (loop);
      break;
    }
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char *argv[])
{
  GMainLoop *loop = NULL;
  GstElement *pipeline = NULL, *source = NULL, *h264parser = NULL,
      *decoder = NULL, *streammux = NULL, *sink = NULL, *pgie = NULL, *nvvidconv = NULL,
      *nvosd = NULL;
#ifdef PLATFORM_TEGRA
  GstElement *transform = NULL;
#endif
  GstBus *bus = NULL;
  guint bus_watch_id;
  GstPad *infer_src_pad = NULL;
  GstPad *decoder_src_pad = NULL;

  /* Check input arguments */
  if (argc != 2) {
    g_printerr ("Usage: %s <H264 filename>\n", argv[0]);
    return -1;
  }

  /* Standard GStreamer initialization */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* Create gstreamer elements */
  /* Create Pipeline element that will form a connection of other elements */
  pipeline = gst_pipeline_new ("dstest1-pipeline");

  /* Source element for reading from the file */
  source = gst_element_factory_make ("filesrc", "file-source");

  /* Since the data format in the input file is elementary h264 stream,
   * we need a h264parser */
  h264parser = gst_element_factory_make ("h264parse", "h264-parser");

  /* Use nvdec_h264 for hardware accelerated decode on GPU */
  decoder = gst_element_factory_make ("nvv4l2decoder", "nvv4l2-decoder");

  /* Create nvstreammux instance to form batches from one or more sources. */
  streammux = gst_element_factory_make ("nvstreammux", "stream-muxer");

  if (!pipeline || !streammux) {
    g_printerr ("One element could not be created. Exiting.\n");
    return -1;
  }

  /* Use nvinfer to run inferencing on decoder's output,
   * behaviour of inferencing is set through config file */
  pgie = gst_element_factory_make ("nvinfer", "primary-nvinference-engine");

  /* Use convertor to convert from NV12 to RGBA as required by nvosd */
  nvvidconv = gst_element_factory_make ("nvvideoconvert", "nvvideo-converter");

  /* Create OSD to draw on the converted RGBA buffer */
  nvosd = gst_element_factory_make ("nvdsosd", "nv-onscreendisplay");

  /* Finally render the osd output */
#ifdef PLATFORM_TEGRA
  transform = gst_element_factory_make ("nvegltransform", "nvegl-transform");
#endif
  sink = gst_element_factory_make ("nveglglessink", "nvvideo-renderer");

  if (!source || !h264parser || !decoder || !pgie
      || !nvvidconv || !nvosd || !sink) {
    g_printerr ("One element could not be created. Exiting.\n");
    return -1;
  }

#ifdef PLATFORM_TEGRA
  if(!transform) {
    g_printerr ("One tegra element could not be created. Exiting.\n");
    return -1;
  }
#endif

  /* we set the input filename to the source element */
  g_object_set (G_OBJECT (source), "location", argv[1], NULL);

  g_object_set (G_OBJECT (streammux), "width", MUXER_OUTPUT_WIDTH, "height",
      MUXER_OUTPUT_HEIGHT, "batch-size", 1,
      "batched-push-timeout", MUXER_BATCH_TIMEOUT_USEC, NULL);

  /* Set all the necessary properties of the nvinfer element,
   * the necessary ones are : */
  g_object_set (G_OBJECT (pgie),
      "config-file-path", "dsmeta_pgie_config.txt", NULL);

  /* we add a message handler */
  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
  gst_object_unref (bus);

  /* Set up the pipeline */
  /* we add all elements into the pipeline */
#ifdef PLATFORM_TEGRA
  gst_bin_add_many (GST_BIN (pipeline),
      source, h264parser, decoder, streammux, pgie,
      nvvidconv, nvosd, transform, sink, NULL);
#else
  gst_bin_add_many (GST_BIN (pipeline),
      source, h264parser, decoder, streammux, pgie,
      nvvidconv, nvosd, sink, NULL);
#endif

  GstPad *sinkpad, *srcpad;
  gchar pad_name_sink[16] = "sink_0";
  gchar pad_name_src[16] = "src";

  sinkpad = gst_element_get_request_pad (streammux, pad_name_sink);
  if (!sinkpad) {
    g_printerr ("Streammux request sink pad failed. Exiting.\n");
    return -1;
  }

  srcpad = gst_element_get_static_pad (decoder, pad_name_src);
  if (!srcpad) {
    g_printerr ("Decoder request src pad failed. Exiting.\n");
    return -1;
  }

  if (gst_pad_link (srcpad, sinkpad) != GST_PAD_LINK_OK) {
      g_printerr ("Failed to link decoder to stream muxer. Exiting.\n");
      return -1;
  }

  gst_object_unref (sinkpad);
  gst_object_unref (srcpad);

  /* we link the elements together */
  /* file-source -> h264-parser -> nvh264-decoder ->
   * nvinfer -> nvvidconv -> nvosd -> video-renderer */

  if (!gst_element_link_many (source, h264parser, decoder, NULL)) {
    g_printerr ("Elements could not be linked: 1. Exiting.\n");
    return -1;
  }

#ifdef PLATFORM_TEGRA
  if (!gst_element_link_many (streammux, pgie,
      nvvidconv, nvosd, transform, sink, NULL)) {
    g_printerr ("Elements could not be linked: 2. Exiting.\n");
    return -1;
  }
#else
  if (!gst_element_link_many (streammux, pgie,
      nvvidconv, nvosd, sink, NULL)) {
    g_printerr ("Elements could not be linked: 2. Exiting.\n");
    return -1;
  }
#endif

  /* Lets add probe to attach dummy decoder metadata using gstnvdsmeta APIs.
   * This metadata is transformed into nvdsmeta and set as user metadata at
   * frame level.
   * Here the probe is added to the sink pad of the decoder element (the
   * modification under discussion) */
  decoder_src_pad = gst_element_get_static_pad (decoder, "sink");
  if (!decoder_src_pad)
    g_print ("Unable to get decoder sink pad\n");
  else
    gst_pad_add_probe (decoder_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
        nvdecoder_src_pad_buffer_probe, NULL, NULL);

  /* Lets add probe at the nvinfer src pad to access the decoder metadata
   * attached with NvDsMeta, now transformed into nvdsmeta and set as user
   * metadata at frame level */
  infer_src_pad = gst_element_get_static_pad (pgie, "src");
  if (!infer_src_pad)
    g_print ("Unable to get source pad\n");
  else
    gst_pad_add_probe (infer_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
        nvinfer_src_pad_buffer_probe, NULL, NULL);

  /* Set the pipeline to "playing" state */
  g_print ("Now playing: %s\n", argv[1]);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait till pipeline encounters an error or EOS */
  g_print ("Running...\n");
  g_main_loop_run (loop);

  /* Out of the main loop, clean up nicely */
  g_print ("Returned, stopping playback\n");
  gst_element_set_state (pipeline, GST_STATE_NULL);
  g_print ("Deleting pipeline\n");
  gst_object_unref (GST_OBJECT (pipeline));
  g_source_remove (bus_watch_id);
  g_main_loop_unref (loop);
  return 0;
}

This is what I want to accomplish:

I want to add metadata (i.e. camera name, frame index, etc.) so that I know which camera and which frame the nvinfer results are based on. I would like to attach this data at the appsrc, when I push the buffer.

Do you have any suggestions as to how that can be accomplished?

Hi,
You are correct, this case is not supported in DS 4.0.1. We will evaluate supporting it in future releases.
For more information: could you replace ‘nvegltransform ! nveglglessink’ with appsink in deepstream-gst-metadata-test, and encode each frame into JPG in the appsink?

No, this was just an example to show the use case where it does not work.

appsrc->jpegparse->queue->nvv4l2decoder->nvstreammux->nvinfer->appsink. I would like to add the metadata at appsrc level or nvv4l2decoder sink.

Do you have any suggestions as to how I can accomplish that in the meantime?

Hi,
Are you able to run like [primary-gie] and [secondary-gie] as demonstrated in deepstream-app? Or deepstream-test2?

It does not seem necessary to break this into two pipelines and bring the metadata from one pipeline to the other. We would suggest you run as the samples demonstrate.

Or could you share more detail about your exact use case? We need more information to evaluate the feature request of passing gstnvdsmeta through nvv4l2decoder. Thanks.

Hi, it's not about bringing metadata from one pipeline to another. I should be able to add metadata at any element in my pipeline, and it should persist.

Expected behavior:
As stated above, even if I add the metadata at the sink of nvv4l2decoder, it should persist for the entire pipeline. But if I add it at the sink pad it is removed, whereas when added at the src pad it is not.

Take the below pipeline.
appsrc->queue->nvv4l2decoder->appsink

Irrespective of where I add the metadata, no element should remove it. Elements should only add more metadata or leave it alone.

Currently, if I add it at the sink/src of appsrc, queue, or nvv4l2decoder, nvv4l2decoder removes the metadata.

Bottom line:
I should be able to add metadata anywhere in the pipeline. It should not be a case of "you can add it here but not there".

Hi,
It seems difficult to implement for H.264/H.265 decoding through nvv4l2decoder, since we can feed the H.264/H.265 stream in frame-based or size-based buffers. In the size-based case, one input GstBuffer does not map to one output GstBuffer, and the metadata gets mismatched.

Your appsrc looks to be feeding JPEG images. Are you able to try jpegdec in your use case?

Hi,

If I add metadata like:

NvDecoderMeta *decoder_meta = (NvDecoderMeta *)g_malloc0(sizeof(NvDecoderMeta));
meta = gst_buffer_add_nvds_meta (buf, decoder_meta, NULL,
      decoder_meta_copy_func, decoder_meta_release_func);

Why can't nvv4l2decoder read the metadata, temporarily store it, do its decoding, and then attach the metadata back before it sends the buffer downstream?

If it was not difficult, life would not be fun:).

For now I am storing the metadata in my own data structure (a queue), which is risky. I hope this will be implemented in the next release.
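Roughly what I mean by that workaround, as a minimal sketch (all names here are made up, and it assumes the buffer PTS survives the decoder unchanged, which is exactly the risky part):

```cpp
#include <cstdint>
#include <map>
#include <mutex>
#include <optional>
#include <string>

// Hypothetical per-frame metadata that would otherwise travel as gstnvdsmeta.
struct FrameInfo {
  std::string camname;
  uint64_t frameIndex;
};

// Side-channel keyed by buffer PTS: fill at push-buffer time, drain at appsink.
class MetaSideChannel {
public:
  void put(uint64_t pts, FrameInfo info) {
    std::lock_guard<std::mutex> lock(mutex_);
    pending_[pts] = std::move(info);
  }

  // Retrieve and erase the entry for a given PTS, if present.
  std::optional<FrameInfo> take(uint64_t pts) {
    std::lock_guard<std::mutex> lock(mutex_);
    auto it = pending_.find(pts);
    if (it == pending_.end()) return std::nullopt;
    FrameInfo info = std::move(it->second);
    pending_.erase(it);
    return info;
  }

private:
  std::mutex mutex_;
  std::map<uint64_t, FrameInfo> pending_;
};
```

At push-buffer time this would be put(GST_BUFFER_PTS(gst_buffer), {...}), and in the appsink callback take(GST_BUFFER_PTS(buf)) pairs the nvinfer results back with the camera name and index. If the decoder drops or reorders buffers, entries leak or mismatch, which is why this is risky.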

"Are you able to try jpegdec in your use case?"
The problem with that is that it will not use the HW.
Do you have any suggestions as to how I can accomplish this while using the HW?

Here is the Pipeline.
appsrc[jpeg images] → jpegparse → queue-> nvv4l2decoder → nvstreammux-> nvinfer ->appsink

At the appsrc I want to add metadata so that at the appsink I know which image, etc., the results belong to.

Note that at the appsink I am only interested in the nvinfer metadata and the metadata I put into the pipeline at the appsrc via:

NvDecoderMeta *decoder_meta = (NvDecoderMeta *)g_malloc0(sizeof(NvDecoderMeta));
meta = gst_buffer_add_nvds_meta (buf, decoder_meta, NULL,
      decoder_meta_copy_func, decoder_meta_release_func);

Hi,
We are evaluating supporting it in a future release.