How to run ReIdentification in DeepStream

Hardware: Jetson Nano & dGPU
DeepStream: 6.0.1
JetPack: 4.6.1
TensorRT: 8

Hi, I am using PeopleNet with the NvDCF tracker in DeepStream. I'd like to pass the cropped boxes into the ReIdentification network. Is there a plugin for this? I don't see any ReID configs.

Please update to the latest DS release, 6.1.1.
Can you use the DeepSORT tracker?

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvtracker.html
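For context, in DS 6.1+ the DeepSORT tracker is selected purely through configuration. Below is a hedged sketch of the [tracker] group in a deepstream-app config; the library path and yml file name follow the 6.1.x sample layout and may differ on your install, and the ReID model used by the tracker is configured inside that yml file:

    # [tracker] group sketch for deepstream-app (DS 6.1.x layout assumed;
    # adjust paths and file names to your installation)
    [tracker]
    enable=1
    tracker-width=640
    tracker-height=384
    ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
    ll-config-file=config_tracker_DeepSORT.yml
    gpu-id=0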

But I am using a Jetson Nano, which supposedly does not support DS 6.1+.
Also, I'd like to do this for tracking across multiple cameras. Does the current plugin support that? I'm referring to the pipeline in the DeepStream portion of this video: Tracking Objects Across Multiple Cameras Made Easy with Metropolis Microservices | NVIDIA On-Demand

DeepStream does not support tracking across multiple cameras. Metropolis does; please check with Metropolis for that feature.

Yes, I know that, and Metropolis hasn't released the microservices yet. But the main point is not tracking across cameras. The main question is how to run ReIdentificationNet from TAO, which extracts embedding features, on DeepStream.


If you want to run the model in DeepStream, you can configure it as an SGIE (secondary GIE) in a config file. You may also need to process the raw tensor output in your application.
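As a minimal sketch, an nvinfer SGIE config for a TAO ReIdentificationNet could look like the following. The model file, key, input dimensions, and preprocessing values are placeholders; take the real ones from the model card. The important parts are process-mode=2 (run on objects from the PGIE), network-type=100, and output-tensor-meta=1 so the raw embedding tensor is attached to each object instead of being parsed inside nvinfer:

    # sgie_reid_config.txt -- sketch only; values marked "placeholder"
    # must come from the ReIdentificationNet model card
    [property]
    gpu-id=0
    tlt-encoded-model=reidentificationnet.etlt   # placeholder file name
    tlt-model-key=nvidia_tao                     # placeholder key
    infer-dims=3;256;128                         # placeholder input CHW
    net-scale-factor=0.01735207                  # placeholder preprocessing
    offsets=123.675;116.28;103.53                # placeholder preprocessing
    model-color-format=0
    batch-size=16
    network-mode=2                               # FP16
    process-mode=2                               # operate on PGIE objects
    operate-on-gie-id=1
    network-type=100                             # "other": no built-in parsing
    output-tensor-meta=1                         # attach raw tensor to object meta

To read the embeddings in the application, here is a hedged Python sketch (DeepStream Python bindings) of a pad probe that walks the object user meta and pulls out the raw tensor; EMBEDDING_DIM and the single-output-layer assumption should be checked against your model:

    import ctypes
    import numpy as np
    import pyds
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    EMBEDDING_DIM = 256  # assumption: check the model's actual output layer size

    def sgie_src_pad_buffer_probe(pad, info, u_data):
        """Read the ReID embedding attached to each object as raw tensor meta."""
        batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(info.get_buffer()))
        l_frame = batch_meta.frame_meta_list
        while l_frame is not None:
            frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
            l_obj = frame_meta.obj_meta_list
            while l_obj is not None:
                obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
                l_user = obj_meta.obj_user_meta_list
                while l_user is not None:
                    user_meta = pyds.NvDsUserMeta.cast(l_user.data)
                    if user_meta.base_meta.meta_type == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META:
                        tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
                        layer = pyds.get_nvds_LayerInfo(tensor_meta, 0)  # single output layer assumed
                        ptr = ctypes.cast(pyds.get_ptr(layer.buffer), ctypes.POINTER(ctypes.c_float))
                        embedding = np.ctypeslib.as_array(ptr, shape=(EMBEDDING_DIM,)).copy()
                        # e.g. store embedding keyed by (frame_meta.source_id, obj_meta.object_id)
                    l_user = l_user.next
                l_obj = l_obj.next
            l_frame = l_frame.next
        return Gst.PadProbeReturn.OK

Attach the probe to the SGIE's src pad, e.g. sgie.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, sgie_src_pad_buffer_probe, 0).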

Quoting the earlier reply: "Metropolis can support tracking across multi camera. Please check with Metropolis for the feature."

When will that be available?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks

Please check below:
