DeepStream 6.1.1 Multiple models for multiple inputs

Hi,
Is it possible to have the following configuration in DeepStream?

source0 (RTSP file - test0.mp4) → model-1 → detections

source1 (RTSP file - test1.mp4) → model-2 → detections

The output is tiled (1 row, 2 columns) with detections from the RTSP files.

Note, the intent of model-1 and model-2 is different, so they can't be integrated back to back.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)

• DeepStream Version

• JetPack Version (valid for Jetson only)

• TensorRT Version

• NVIDIA GPU Driver Version (valid for GPU only)

• Issue Type (questions, new requirements, bugs)

• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)

• Requirement details (This is for a new requirement. Include the module name, i.e. which plugin or which sample application, and the function description.)

It's a dGPU setup, details below:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.65.01    Driver Version: 515.65.01    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            Off  | 00000001:00:00.0 Off |                  Off |
| N/A   42C    P0    28W /  70W |     95MiB / 16384MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       931      G   /usr/lib/xorg/Xorg                 82MiB |
|    0   N/A  N/A      1182      G   /usr/bin/gnome-shell                9MiB |
+-----------------------------------------------------------------------------+


There is no test app; this is a requirement, and I am trying to understand whether DeepStream supports it.

Thanks.

Hi @rahul17

We have done something similar on DeepStream:

RTSP stream (thermal) → model-1 (human detection) → model-2 (average temperature) → detections 1 + 2

RTSP stream (RGBA) → meta transfer detections 1 + 2 → model-3 → detections 1 + 2 + 3 over RGBA stream

We created a custom GStreamer element called dsmetatransfer that transfers DeepStream meta from one stream to another.

You can do something similar with meta manipulation to join the meta from the 2 individual pipelines and transfer it to another pipeline where the 2 streams are batched. That way you can use the multistream tiler.

However, I think it would be easier just to create your own overlay element that joins the outputs from the two DeepStream pipelines and displays them.

One way or another, you will need to add something custom. With the default DeepStream components you can’t perform multiple primary inferences over different streams from the same batch.

@miguel.taylor thanks for sharing! Could you share more about your use case, including the whole pipeline with dsmetatransfer? Could you also elaborate on the dsmetatransfer plugin and how it transfers DeepStream meta from one stream to another? Thanks!

@rahul17 please refer to deepstream_parallel_inference_app, which is similar to your use case. It supports inference in parallel: for example, some sources go to model-1 and other sources go to model-2, and the dsmetamux plugin then muxes the meta.
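To illustrate the idea, the per-branch source mapping in deepstream_parallel_inference_app looks roughly like the fragment below. This is a simplified sketch based on the app's sample YAML configs; check the configs bundled with the app for the exact schema and key names.

```yaml
branch0:
  ## the pgie (primary inference) this branch runs
  pgie-id: 1
  ## the sources (by source id) fed to this branch
  src-ids: 0

branch1:
  pgie-id: 2
  src-ids: 1
```

With a mapping like this, source 0 is inferred by model-1 and source 1 by model-2, and dsmetamux merges the resulting meta back into one batch downstream.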

Thanks @miguel.taylor & @fanzh for your feedback! I will try these options and let you know.

@fanzh to elaborate on dsmetatransfer:

The element is based on GstAggregator, with 2 sink pads and 1 source pad:

  1. bypass pad (sink): We get the video buffer from this pad. If there is stored meta, we transfer it to the buffer.
  2. meta pad (sink): We keep a ref to the last buffer received so we can take the meta from it.
  3. source pad: Carries the video from the bypass pad and the meta from the meta pad. We scale all boxes and coordinates to fit the new resolution.
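The coordinate scaling mentioned in point 3 can be sketched as below. This is a minimal standalone sketch: the `Rect` struct and `scale_rect` helper are stand-ins for our internal code; in the real element the fields come from `rect_params` on `NvDsObjectMeta` (left/top/width/height in pixels).

```c
#include <assert.h>

/* Stand-in for NvDsObjectMeta.rect_params: box in pixel coordinates. */
typedef struct {
  float left, top, width, height;
} Rect;

/* Scale a detection box from the meta stream's resolution (src_w x src_h)
 * to the bypass stream's resolution (dst_w x dst_h). */
static void
scale_rect (Rect *r, int src_w, int src_h, int dst_w, int dst_h)
{
  float sx = (float) dst_w / (float) src_w;
  float sy = (float) dst_h / (float) src_h;

  r->left   *= sx;
  r->top    *= sy;
  r->width  *= sx;
  r->height *= sy;
}
```

For example, a box detected at (100, 50, 200, 100) on a 640x480 thermal frame maps to (300, 150, 600, 300) on a 1920x1440 RGBA frame.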

At the beginning we were transferring all DeepStream metadata:

  /* Transfer all DeepStream metas from the meta-pad buffer to outbuf */
  if (meta_pad != NULL) {
    gpointer state = NULL;
    GQuark nvdsmeta_quark = g_quark_from_static_string (NVDS_META_STRING);
    GstMeta *gst_meta;
    GstBuffer *metabuf = gst_aggregator_pad_pop_buffer (meta_pad);
    /* Copy every meta tagged with the DeepStream quark, using the
     * meta's own transform (copy) function */
    while ((gst_meta = gst_buffer_iterate_meta (metabuf, &state))) {
      if (gst_meta_api_type_has_tag (gst_meta->info->api, nvdsmeta_quark)) {
        GstMetaTransformCopy copy_data = { FALSE, 0, -1 };
        const GstMetaInfo *info = gst_meta->info;
        info->transform_func (outbuf, gst_meta, metabuf,
            _gst_meta_transform_copy, &copy_data);
      }
    }
    gst_buffer_unref (metabuf);
  } else {
    GST_WARNING_OBJECT (self,
        "No meta pad found, the buffer will be pushed without new meta");
  }

But then we switched to our own custom metadata. We serialize the values we need from the DS meta into JSON and send that JSON around using NvDsEventMsgMeta. We had to do this because some DeepStream elements copied only the DS meta and deleted our custom meta type. We also have separate custom elements that transform the serialized meta back into DeepStream meta so that it can be displayed with the overlay.
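A minimal sketch of the serialization step, to give the idea. The `Detection` struct, the JSON shape, and the `detection_to_json` helper here are hypothetical simplifications, not our exact schema; in the real element the values are read from `NvDsObjectMeta`, and the resulting string is carried in an NvDsEventMsgMeta attached as user meta.

```c
#include <stdio.h>

/* Hypothetical, simplified detection record; the real element reads
 * these values from NvDsObjectMeta. */
typedef struct {
  int class_id;
  float confidence;
  float left, top, width, height;
} Detection;

/* Serialize one detection into the caller-provided buffer as JSON.
 * Returns the number of characters written (excluding the NUL). */
static int
detection_to_json (const Detection *d, char *buf, size_t len)
{
  return snprintf (buf, len,
      "{\"class_id\":%d,\"confidence\":%.2f,"
      "\"bbox\":[%.1f,%.1f,%.1f,%.1f]}",
      d->class_id, d->confidence,
      d->left, d->top, d->width, d->height);
}
```

The string produced this way survives elements that drop unknown meta types, and a downstream deserializer element rebuilds DS meta from it for the overlay.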

There is no update from you for a period, assuming this is not an issue anymore.
Hence we are closing this topic. If need further support, please open a new one.
Thanks

@rahul17
Sorry for the late reply. Is this still a DeepStream issue to support? Thanks

@miguel.taylor Thanks for sharing! If you meet any DeepStream issue, please open a new topic to track it, thanks!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.