Plot bounding box on saved image

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU RTX3080
• DeepStream Version 6.3
• JetPack Version (valid for Jetson only) 3.5.2
My deepstream-app saves images in the gst-dsexample plugin, inside the following code.

if (dsexample->process_full_frame) {
    if ((result = facequeue.pop (request_, (int) frame_meta->source_id)) != FaceBoxSafeQ<facebox_>::NOTOK) {

        //attach_metadata_full_frame_own (dsexample, frame_meta, 1, output, i);
        /* Assign bounding box coordinates */
        rect_params.left = request_.left_;
        rect_params.top = request_.top_;
        rect_params.width = request_.width_;
        rect_params.height = request_.height_;

        /* No background fill for the box */
        rect_params.has_bg_color = 0;
        rect_params.bg_color = (NvOSD_ColorParams) {1, 1, 0, 0.4};
        /* Green border of width 3 */
        rect_params.border_width = 3;
        rect_params.border_color = (NvOSD_ColorParams) {0, 1, 0, 1};

        /* Scale the bounding boxes proportionally based on how the
         * object/frame was scaled during input */
        rect_params.left /= scale_ratio;
        rect_params.top /= scale_ratio;
        rect_params.width /= scale_ratio;
        rect_params.height /= scale_ratio;

        object_meta->object_id = UNTRACKED_OBJECT_ID;
        /* display_text requires heap-allocated memory */
        g_strlcpy (object_meta->obj_label, request_.name_.c_str (), MAX_LABEL_SIZE);
        text_params.display_text = g_strdup (request_.name_.c_str ());
        /* Display text above the top-left corner of the object */
        text_params.x_offset = rect_params.left;
        text_params.y_offset = rect_params.top - 10;
        /* Set black background for the text */
        text_params.set_bg_clr = 1;
        text_params.text_bg_clr = (NvOSD_ColorParams) {0, 0, 0, 1};
        /* Font face, size and color */
        text_params.font_params.font_name = font_name;
        text_params.font_params.font_size = 12;
        text_params.font_params.font_color = (NvOSD_ColorParams) {1, 1, 1, 1};

        nvds_add_obj_meta_to_frame (frame_meta, object_meta, NULL);
        frame_meta->bInferDone = TRUE;
    }
}

nvds_add_obj_meta_to_frame saves images.

But the saved image doesn’t have a bounding box.
May I know how to add bounding boxes to the saved images?

  1. Which sample are you referring to? How do you save the image? nvds_add_obj_meta_to_frame only adds object information to the frame meta. The downstream nvdsosd plugin is responsible for drawing all bounding boxes.

If you want to save the images with the bounding boxes drawn by nvdsosd, either create a pad probe function on the src pad of nvdsosd, or create a pad probe function on the sink pad of the next element, e.g. a fakesink.
Then save the images.
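As an illustration only (not code from the sample apps), a minimal sketch of such a probe on the nvdsosd src pad could look like the following. It assumes a dGPU setup where the buffer leaving nvdsosd is RGBA in NVMM (CUDA device) memory, and uses OpenCV only for the file write; the function name, file naming and error handling are placeholders.

#include <gst/gst.h>
#include <cuda_runtime_api.h>
#include <opencv2/opencv.hpp>
#include <vector>
#include "gstnvdsmeta.h"
#include "nvbufsurface.h"

/* Hypothetical probe: copies each RGBA frame that leaves nvdsosd to host
 * memory and writes it as a JPEG. Assumes RGBA in CUDA device memory. */
static GstPadProbeReturn
osd_src_pad_save_probe (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  GstMapInfo map;

  if (!gst_buffer_map (buf, &map, GST_MAP_READ))
    return GST_PAD_PROBE_OK;

  NvBufSurface *surface = (NvBufSurface *) map.data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);

  for (NvDsMetaList * l = batch_meta->frame_meta_list; l; l = l->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l->data;
    NvBufSurfaceParams *params = &surface->surfaceList[frame_meta->batch_id];

    /* Copy the pitch-linear RGBA plane from device to host memory. */
    std::vector<unsigned char> host (params->width * params->height * 4);
    cudaMemcpy2D (host.data (), params->width * 4,
        params->dataPtr, params->pitch,
        params->width * 4, params->height, cudaMemcpyDeviceToHost);

    cv::Mat rgba ((int) params->height, (int) params->width, CV_8UC4, host.data ());
    cv::Mat bgr;
    cv::cvtColor (rgba, bgr, cv::COLOR_RGBA2BGR);

    gchar name[64];
    g_snprintf (name, sizeof (name), "frame_%u_%d.jpg",
        frame_meta->source_id, frame_meta->frame_num);
    /* The bounding boxes are already burned into the pixels at this point. */
    cv::imwrite (name, bgr);
  }

  gst_buffer_unmap (buf, &map);
  return GST_PAD_PROBE_OK;
}

It would be attached with gst_pad_add_probe () on the pad returned by gst_element_get_static_pad (nvosd, "src"), using GST_PAD_PROBE_TYPE_BUFFER, where nvosd is the nvdsosd element handle in your app.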

The important thing is what fanzh mentioned: nvdsosd is the element which draws the bounding boxes, so you need to save the frame after nvdsosd has done its job.

Sure. So then I can’t save images in the gst-dsexample plugin, which runs before nvdsosd. I need to shift all my code to deepstream-app’s C file.

Is it possible to insert the gstdsexample plugin after nvdsosd? gstdsexample is currently inserted after nvinfer and before the OSD, so the images are saved with no bounding boxes.
If the gstdsexample plugin is inserted after nvdsosd, I can save images with bounding boxes.

Yes. nvdsosd will draw the bounding boxes.

Yes, it is a custom element after all. You can use it after nvdsosd too.
You can always run gst-inspect-1.0 dsexample to see its sink and src pads.

In my case, I used it before nvinfer, so it depends on your use case.

According to the graph, my gstdsexample is before the tiled_display_tiler, and the gstnvdsosd plugin is after the tiler.
I need to shift my gstdsexample to after gstnvdsosd.
So what I did was put gstdsexample after gstnvdsosd:

if (config->osd_config.enable) {
    if (!create_osd_bin (&config->osd_config, &instance_bin->osd_bin)) {
      goto done;
    }

    gst_bin_add (GST_BIN (instance_bin->bin), instance_bin->osd_bin.bin);

    NVGSTDS_LINK_ELEMENT (instance_bin->osd_bin.bin, last_elem);

    last_elem = instance_bin->osd_bin.bin;
  }
  
  if (config->dsexample_config.enable) {
    // Create dsexample element bin and set properties
    if (!create_dsexample_bin (&config->dsexample_config,
            &pipeline->dsexample_bin)) {
      goto done;
    }
    // Add dsexample bin to instance bin
    gst_bin_add (GST_BIN (pipeline->pipeline), pipeline->dsexample_bin.bin);

    // Link this bin to the last element in the bin
    NVGSTDS_LINK_ELEMENT (pipeline->dsexample_bin.bin, last_elem);

    // Set this bin as the last element
    last_elem = pipeline->dsexample_bin.bin;
  }

I get the following errors:

atic@ubuntu:/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/rectitude$ ./deepstream-app -c ../../../../samples/configs/deepstream-app/rectitude_config_main.txt

(deepstream-app:14612): GStreamer-WARNING **: 04:17:05.425: Trying to link elements dsexample_bin and osd_bin that don't share a common ancestor: dsexample_bin is in pipeline, and osd_bin is in processing_bin_0

(deepstream-app:14612): GStreamer-WARNING **: 04:17:05.425: Trying to link elements dsexample_bin and osd_bin that don't share a common ancestor: dsexample_bin is in pipeline, and osd_bin is in processing_bin_0
** ERROR: <create_processing_instance:953>: Failed to link 'dsexample_bin' (video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ I420, NV12, RGBA }) and 'osd_bin' (video/x-raw(memory:NVMM), format=(string)RGBA, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ I420, NV12, P010_10LE, I420_12LE, BGRx, RGBA, GRAY8, YUY2, UYVY, YVYU, Y42B, RGB, BGR, BGR10A2_LE, UYVP }; video/x-raw, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ I420, NV12, P010_10LE, BGRx, RGBA, GRAY8, YUY2, UYVY, YVYU, Y42B, RGB, BGR, BGR10A2_LE, UYVP })
** ERROR: <create_processing_instance:974>: create_processing_instance failed
** ERROR: <create_pipeline:1583>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline
Quitting
nvstreammux: Successfully handled EOS for source_id=0
App run failed

Is it because I need to put a gstqueue in between?
How can I solve this issue?

The whole function is as follows.

static gboolean
create_processing_instance (AppCtx * appCtx, guint index)
{
  gboolean ret = FALSE;
  NvDsConfig *config = &appCtx->config;
  NvDsPipeline *pipeline = &appCtx->pipeline;
  NvDsInstanceBin *instance_bin = &appCtx->pipeline.instance_bins[index];
  GstElement *last_elem;
  gchar elem_name[32];

  instance_bin->index = index;
  instance_bin->appCtx = appCtx;

  g_snprintf (elem_name, 32, "processing_bin_%d", index);
  instance_bin->bin = gst_bin_new (elem_name);

  if (!create_sink_bin (config->num_sink_sub_bins,
          config->sink_bin_sub_bin_config, &instance_bin->sink_bin, index)) {
    goto done;
  }

  gst_bin_add (GST_BIN (instance_bin->bin), instance_bin->sink_bin.bin);
  last_elem = instance_bin->sink_bin.bin;

  if (config->osd_config.enable) {
    if (!create_osd_bin (&config->osd_config, &instance_bin->osd_bin)) {
      goto done;
    }

    gst_bin_add (GST_BIN (instance_bin->bin), instance_bin->osd_bin.bin);

    NVGSTDS_LINK_ELEMENT (instance_bin->osd_bin.bin, last_elem);

    last_elem = instance_bin->osd_bin.bin;
  }
  
  if (config->dsexample_config.enable) {
    // Create dsexample element bin and set properties
    if (!create_dsexample_bin (&config->dsexample_config,
            &pipeline->dsexample_bin)) {
      goto done;
    }
    // Add dsexample bin to instance bin
    gst_bin_add (GST_BIN (pipeline->pipeline), pipeline->dsexample_bin.bin);

    // Link this bin to the last element in the bin
    NVGSTDS_LINK_ELEMENT (pipeline->dsexample_bin.bin, last_elem);

    // Set this bin as the last element
    last_elem = pipeline->dsexample_bin.bin;
  }
    

  NVGSTDS_BIN_ADD_GHOST_PAD (instance_bin->bin, last_elem, "sink");
  if (config->osd_config.enable) {
    NVGSTDS_ELEM_ADD_PROBE (instance_bin->all_bbox_buffer_probe_id,
        instance_bin->osd_bin.nvosd, "sink",
        gie_processing_done_buf_prob, GST_PAD_PROBE_TYPE_BUFFER, instance_bin);
  } else {
    NVGSTDS_ELEM_ADD_PROBE (instance_bin->all_bbox_buffer_probe_id,
        instance_bin->sink_bin.bin, "sink",
        gie_processing_done_buf_prob, GST_PAD_PROBE_TYPE_BUFFER, instance_bin);
  }

  ret = TRUE;
done:
  if (!ret) {
    NVGSTDS_ERR_MSG_V ("%s failed", __func__);
  }
  return ret;
}

I am not sure about gstqueue, but you can try it.
From what I see, it won’t link, so you can also try putting a nvvidconv/nvvideoconvert between nvdsosd and dsexample.

At this point you just need to correct your pipeline, so I think you can make it work.

You need to remember that, in the end, dsexample is just an example custom plugin, so there is always a way to customize it according to your needs.

I hope you can make this work!
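For what it’s worth, the first GStreamer warning above says dsexample_bin was added to the top-level pipeline while osd_bin lives inside processing_bin_0, so the two bins have no common ancestor when NVGSTDS_LINK_ELEMENT tries to link them. Also, because create_processing_instance builds the chain backwards from the sink bin, a block placed later in that function ends up further upstream in the data flow. Below is only a hedged sketch of one way to rearrange the snippet posted above so that data flows ... -> nvdsosd -> dsexample -> sink; it reuses the same field names as that snippet, which may differ in your tree.

  /* ... after: last_elem = instance_bin->sink_bin.bin; */

  if (config->dsexample_config.enable) {
    if (!create_dsexample_bin (&config->dsexample_config,
            &pipeline->dsexample_bin)) {
      goto done;
    }
    /* Add dsexample to the SAME instance bin that will hold osd_bin,
     * so the two elements share a common ancestor when linked. */
    gst_bin_add (GST_BIN (instance_bin->bin), pipeline->dsexample_bin.bin);

    /* dsexample -> sink_bin: dsexample sits just before the sink. */
    NVGSTDS_LINK_ELEMENT (pipeline->dsexample_bin.bin, last_elem);

    last_elem = pipeline->dsexample_bin.bin;
  }

  if (config->osd_config.enable) {
    if (!create_osd_bin (&config->osd_config, &instance_bin->osd_bin)) {
      goto done;
    }
    gst_bin_add (GST_BIN (instance_bin->bin), instance_bin->osd_bin.bin);

    /* osd_bin -> dsexample_bin, giving ... -> nvdsosd -> dsexample -> sink */
    NVGSTDS_LINK_ELEMENT (instance_bin->osd_bin.bin, last_elem);

    last_elem = instance_bin->osd_bin.bin;
  }

Whether an extra queue or nvvideoconvert is still needed in between depends on your configuration, as suggested above.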

Thanks for the reply.
I know what you mean. I can plot the boxes on the image myself in gstdsexample.
But OpenCV is now deprecated in DeepStream. So what API do I need to use to draw the boxes in the gstdsexample cpp file?
Thanks for suggesting the way to view the graph. That is quite useful :)

Please refer to this: you can still use OpenCV if you enable it. I did it in DeepStream 6.2 and 6.3.

By default, OpenCV is deprecated in DeepStream 6.1. However, you can enable OpenCV in plugins such as nvinfer (nvdsinfer) and dsexample (gst-dsexample) by setting WITH_OPENCV=1 in the Makefile of these components. Refer to the component README for more instructions.
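Purely as an illustration, once gst-dsexample is rebuilt with WITH_OPENCV=1 and you have the frame as a BGR cv::Mat (for example via the plugin's existing get_converted_mat () path), drawing and saving a box is plain OpenCV. The variables below (bgr, request_) are assumed to come from the snippets earlier in this thread and are not part of the stock plugin.

#include <opencv2/opencv.hpp>

/* 'bgr' is assumed to be a BGR cv::Mat of the full frame, and the box
 * coordinates come from your own detection result ('request_' above). */
cv::Rect box (request_.left_, request_.top_, request_.width_, request_.height_);
cv::rectangle (bgr, box, cv::Scalar (0, 255, 0), 3);   /* green border, width 3 */
cv::putText (bgr, request_.name_, cv::Point (box.x, box.y - 10),
    cv::FONT_HERSHEY_SIMPLEX, 0.8, cv::Scalar (255, 255, 255), 2);
cv::imwrite ("frame_with_box.jpg", bgr);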
