Blob tracking for a black and white video?

I have a pipeline that is generating a binary video: black background with small white blobs moving around. I would like to track those blobs, and I am looking for guidance in implementing this capability.

  1. Is there a known way to do this in DeepStream?

  2. In the nv-infer plugin, there is a cluster-mode configuration parameter for which one of the options is DBSCAN. Could this be used to perform clustering on the binary video frame and generate a bounding box? Or is it intended to cluster overlapping bounding boxes?

  3. In my pipeline, I have a custom plugin that uses an OpenCV GpuMat to generate the binary video frame. If there isn’t a way to generate bounding boxes using existing plugins, then I’m planning to download the binary video frame to the CPU, run a blob detector, attach the bounding boxes to the output metadata, and connect nvtracker to the pipeline, similar to what the gst-dsexample plugin does (a rough sketch of what I have in mind follows this list). Does this approach make sense, or is there a better way?

  4. I would like to save the tracks to file instead of putting them on the OSD. Is there a known way to do this, or do I have to write a custom plugin to write them to file?
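
For reference, this is roughly what I have in mind for step 3. It is only a sketch: the function name, the `min_area` filter, and the single "blob" class are my own choices, and it assumes the binary frame has already been downloaded from the GpuMat into an 8-bit `cv::Mat` and that the batch/frame metadata is available in the plugin (e.g. via `gst_buffer_get_nvds_batch_meta()`):

```cpp
// Sketch only: attach_blob_boxes() and min_area are my own names/choices.
#include <opencv2/imgproc.hpp>
#include "gstnvdsmeta.h"

// Finds white blobs in an 8-bit binary mask and attaches one NvDsObjectMeta per blob.
static void
attach_blob_boxes (const cv::Mat &binary_mask, NvDsBatchMeta *batch_meta,
                   NvDsFrameMeta *frame_meta, int min_area = 20)
{
  cv::Mat labels, stats, centroids;
  int n = cv::connectedComponentsWithStats (binary_mask, labels, stats,
                                            centroids, 8, CV_32S);

  // Label 0 is the background; blobs start at 1.
  for (int i = 1; i < n; i++) {
    if (stats.at<int> (i, cv::CC_STAT_AREA) < min_area)
      continue;                                   // drop tiny speckles

    NvDsObjectMeta *obj = nvds_acquire_obj_meta_from_pool (batch_meta);
    obj->class_id = 0;                            // single "blob" class
    obj->confidence = 1.0;
    obj->object_id = UNTRACKED_OBJECT_ID;         // tracker assigns real IDs downstream

    obj->rect_params.left   = stats.at<int> (i, cv::CC_STAT_LEFT);
    obj->rect_params.top    = stats.at<int> (i, cv::CC_STAT_TOP);
    obj->rect_params.width  = stats.at<int> (i, cv::CC_STAT_WIDTH);
    obj->rect_params.height = stats.at<int> (i, cv::CC_STAT_HEIGHT);

    nvds_add_obj_meta_to_frame (frame_meta, obj, NULL);
  }
}
```

The idea is that nvtracker downstream would then take these per-frame boxes and assign persistent object IDs.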

I am running DeepStream 5.0.1 on T4 and V100 GPUs.

Thanks in advance for any help or guidance you can offer!

  1. Is there a known way to do this in DeepStream? ==> No, there is no existing solution for this in DeepStream.
  2. In the nv-infer plugin, there is a cluster-mode configuration parameter for which one of the options is DBSCAN. Could this be used to perform clustering on the binary video frame and generate a bounding box? Or is it intended to cluster overlapping bounding boxes? ==> “cluster-mode” DBSCAN is for clustering the detector's output bounding boxes, not for clustering pixels in a raw frame, so it cannot generate boxes from your binary video (see the config fragment below for where the parameter sits).
  3. In my pipeline, I have a custom plugin that uses an OpenCV GpuMat to generate the binary video frame. If there isn’t a way to generate bounding boxes using existing plugins, then I’m planning to download the binary video frame to the CPU, run a blob detector, attach the bounding boxes to the output metadata, and connect the nvtracker to the pipeline. Does this approach make sense or is there a better way? This is similar to the gst-dsexample plugin. ==> You can use appsrc to feed the data into GStreamer and pass it to nvvideoconvert to convert it to an NVMM buffer so that it can be accepted by nvstreammux; you can wrap this in your application (see the pipeline sketch below).
  4. I would like to save the tracks to file instead of putting them on the OSD. Is there a known way to do this, or do I have to write a custom plugin to write them to file? ==> What information do you mean by “tracks”? (If you mean the object IDs and bounding boxes that nvtracker attaches, see the probe sketch below.)
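
To make answer 2 concrete: cluster-mode is a detector post-processing setting in the nvinfer configuration file, so it operates on the bounding boxes the model outputs rather than on image pixels. A hypothetical config fragment is shown below; the value meanings follow the DeepStream 5.0 nvinfer documentation:

```ini
[property]
# How nvinfer merges the detector's raw output rectangles:
# 0 = OpenCV groupRectangles, 1 = DBSCAN, 2 = NMS, 3 = DBSCAN + NMS hybrid, 4 = no clustering
cluster-mode=1
```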
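For answer 3, a minimal sketch of the element chain, assuming 1280x720 GRAY8 input at 30 fps (the caps values, batch size, and fakesink placeholder are assumptions; check `gst-inspect-1.0 nvvideoconvert` to confirm it accepts your input format). The real downstream branch (nvtracker, sink, etc.) would replace fakesink:

```cpp
// Sketch only: builds appsrc -> nvvideoconvert -> nvstreammux with gst_parse_launch().
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GError *err = NULL;
  GstElement *pipeline = gst_parse_launch (
      "nvstreammux name=mux batch-size=1 width=1280 height=720 ! fakesink "
      "appsrc name=src is-live=true "
      "caps=video/x-raw,format=GRAY8,width=1280,height=720,framerate=30/1 ! "
      "nvvideoconvert ! video/x-raw(memory:NVMM),format=NV12 ! mux.sink_0",
      &err);
  if (!pipeline) {
    g_printerr ("Failed to build pipeline: %s\n", err->message);
    return -1;
  }

  // The application pushes each binary frame to the "src" element with
  // gst_app_src_push_buffer(); nvvideoconvert copies it into NVMM memory so
  // nvstreammux (and anything downstream, e.g. nvtracker) can consume it.
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      (GstMessageType) (GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
  if (msg)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}
```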
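For question 4, assuming “tracks” means the per-object IDs and bounding boxes that nvtracker attaches to the frame metadata: a custom plugin is not strictly needed, since a buffer probe on the tracker's src pad (or the OSD's sink pad) can iterate the metadata and write it out. A minimal sketch, with the CSV layout, file handling, and probe placement as my own choices:

```cpp
// Sketch only: dumps one CSV row per tracked object from a buffer probe.
#include <cstdio>
#include <gst/gst.h>
#include "gstnvdsmeta.h"

static FILE *track_file = NULL;   // opened by the app before PLAYING, e.g. fopen ("tracks.csv", "w")

// Called for every buffer leaving nvtracker.
static GstPadProbeReturn
tracker_src_pad_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta || !track_file)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame; l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
    for (NvDsMetaList *l_obj = frame_meta->obj_meta_list; l_obj; l_obj = l_obj->next) {
      NvDsObjectMeta *obj = (NvDsObjectMeta *) l_obj->data;
      // frame number, tracker id, left, top, width, height
      fprintf (track_file, "%d,%" G_GUINT64_FORMAT ",%.1f,%.1f,%.1f,%.1f\n",
               frame_meta->frame_num, obj->object_id,
               obj->rect_params.left, obj->rect_params.top,
               obj->rect_params.width, obj->rect_params.height);
    }
  }
  return GST_PAD_PROBE_OK;
}
```

The probe would be attached after the pipeline is built with `gst_pad_add_probe (tracker_src_pad, GST_PAD_PROBE_TYPE_BUFFER, tracker_src_pad_probe, NULL, NULL)`.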