I have a DS 6.0.1 pipeline running a pgie, tracker and a secondary model. I am using the Python API.
My secondary model works well with classifier-async-mode=0, but if I set classifier-async-mode=1 I have two issues:
Some objects have no classifier metadata (despite being bigger than input-object-min-width and input-object-min-height, and despite interval=0).
Some objects have metadata that belongs to another object seen previously (e.g. in Frame 1 I had an object with attribute MYATTRIBUTE=VALUE1, and in Frame 2 I had two objects with MYATTRIBUTE=VALUE1 even though the correct value was clearly MYATTRIBUTE=VALUE2). I am sure the classifier itself is accurate, as this does not happen with classifier-async-mode=0.
Based on this, I am assuming I am not parsing the metadata correctly when classifier-async-mode=1.
I looked at the documentation and found the following:
So it seems that when classifier-async-mode=1, the classifier
“Attaches metadata after the inference results are available to next Gst Buffer in its internal queue”
However, I am not sure what this internal queue would be. I couldn’t find any other mention of it so I am not sure how and where this metadata is attached.
Furthermore, I don’t understand how I could possibly get metadata from past objects. I looked at the documentation and couldn’t find any attribute of NvDsClassifierMeta or NvDsLabelInfo that could tell me that some metadata refers to another object. Given an obj_meta of type NvDsObjectMeta, what I am doing is the following:
How can I know that the metadata for a given object has not been produced yet? And how can I wait for it? I have to report the results of my pipeline to the cloud. I’d like to wait for the metadata before uploading the results.
EDIT: based on this topic [Secondary GIE] Custom Classifier in sgie outputs only random entry in label.txt - #30 by rohitnairkp , I changed the setting secondary-reinfer-interval=0. It helps but there are still quite a few images wrongly classified (this does not happen when classifier-async-mode=0) and a lot of detections have no classifier metadata even when setting input-object-min-width=0 and input-object-min-height=0
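For reference, after that change the relevant [property] section of my sgie config contains roughly the following (only the keys discussed in this thread; everything else is omitted and the values are the ones from my tests):

```ini
[property]
# secondary classifier settings discussed in this thread
classifier-async-mode=1
secondary-reinfer-interval=0
interval=0
input-object-min-width=0
input-object-min-height=0
```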
I don’t need to use gst-nvmsbroker because I am reporting the metadata using a custom probe.
However, I am not sure how to wait for the metadata to be processed. When you say
old classification meta will be attach to object meta by object’s trackid
do you mean that the metadata will be attached once the object is detected again in a subsequent frame, or will the metadata be attached to the first detection of the object?
E.g. let’s consider two subsequent frames Frame1 (captured at time t1) and Frame2 (captured at time t2). The same car is detected in both frames.
Will the metadata be attached to the detection in the Frame2 or also in Frame1? If they will be attached to Frame1, how can I know that the secondary model is currently running so that I know that I should wait for the metadata?
The sgie’s classifier-async-mode needs to work with the tracker. Let’s take this pipeline for example:
pgie (car detector) + tracker + sgie (car color classification) + osd (draw).
At time t1, the sgie pushes inference task T1 to a thread and pushes the object meta downstream to the osd without waiting for the inference to end. At time t2, if T1 has ended, the sgie will attach the classification meta to the object meta by tracker id (two objects are considered the same if they have the same tracker id); if T1 has not ended, there is still no classification meta.
You can run the sample deepstream-test2 to verify.
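The timing above can be modelled with a toy sketch (plain Python, not DeepStream code; a fixed one-frame inference latency is assumed purely for illustration):

```python
# Toy model of classifier-async-mode=1: inference results become
# available later and are attached by tracker id, so the label seen on
# an object can be missing (first sighting) or stale (previous frame).

class AsyncSgieSim:
    def __init__(self, classify):
        self.classify = classify  # ground-truth classifier function
        self.cache = {}           # track_id -> last finished label
        self.pending = []         # inference tasks still "running"

    def process_frame(self, objects):
        """objects: list of (track_id, crop).
        Returns {track_id: attached label or None}."""
        # 1. tasks queued on the previous frame have now finished:
        #    store their results, keyed by tracker id
        for track_id, crop in self.pending:
            self.cache[track_id] = self.classify(crop)
        self.pending = []
        # 2. attach whatever is cached for each tracker id (possibly
        #    stale or missing) and queue a new task without waiting
        out = {}
        for track_id, crop in objects:
            out[track_id] = self.cache.get(track_id)  # None on first sight
            self.pending.append((track_id, crop))
        return out
```

This reproduces both symptoms from the first post: a track’s first frame carries no classifier meta, and when the true attribute changes between frames the attached label lags one frame behind.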
Hey @fanzh, thank you for your explanation. I still have a doubt: when you say “sgie will attach classification meta to object meta by trackerid”, do you mean that it will attach the metadata to the very same NvDsObjectMeta detected at time T1, or will it be another NvDsObjectMeta with the same tracker id as the original?
It will be another NvDsObjectMeta with the same tracker id. After the T1 inference ends, the classification output will be saved and added to the new object meta with the same tracker id.
The nvinfer plugin is open source; you can add logs in gst_nvinfer_process_objects to verify.
At time T1 an object is detected and the detection will be saved in an instance of NvDsObjectMeta that we’ll call nv_ds_object_meta_t1. The secondary model starts.
The secondary model keeps computing until the inference is complete.
At time T2 the same object is detected again. A new instance of type NvDsObjectMeta is created. We’ll call this instance nv_ds_object_meta_t2. The output from the secondary model is attached to nv_ds_object_meta_t2 because it has the same tracker id that nv_ds_object_meta_t1 had.
Is this correct?
At this point my last question is: will the output of the model ever be attached to nv_ds_object_meta_t1 or will it be attached only to nv_ds_object_meta_t2?
What does “an object is detected only once” mean? That an object exists in only one frame? In that case, the object does not come again, so it will not have attributes from the secondary models.
Yes, I meant when an object appears in only one frame. This is likely to happen when processing many video streams and therefore processing each of them at a low FPS.
Thank you, this is what I needed to know.
If I may, it would be great if, in future releases, metadata from secondary models could be attached to the first detection as well, even when async mode is enabled for the secondary model. I understand there could be delays, but one could keep a reference to the object metadata and wait for the secondary model metadata to be attached. Maybe you could introduce a callback to signal when the metadata has been produced.
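In the meantime, the workaround I am considering looks roughly like this: a probe-side buffer (PendingReports, on_object, end_of_frame and upload are my own hypothetical names, not DeepStream API) that holds each detection by tracker id until classifier metadata arrives in a later frame, or a frame-count timeout expires:

```python
# Hypothetical probe-side buffer: hold per-track reports until classifier
# metadata appears (attached by tracker id in a later frame) or a
# timeout expires. upload() stands in for my cloud-reporting code.

class PendingReports:
    def __init__(self, upload, max_frames=30):
        self.upload = upload          # callback: (track_id, report) -> None
        self.max_frames = max_frames  # give up waiting after this many frames
        self.waiting = {}             # track_id -> report dict

    def on_object(self, frame_no, track_id, bbox, label=None):
        """Called from the pad probe for every object in every frame."""
        if label is not None:
            # classifier meta finally attached: flush immediately
            report = self.waiting.pop(track_id, {"track_id": track_id})
            report.update(bbox=bbox, label=label, frame=frame_no)
            self.upload(track_id, report)
        else:
            # no classifier meta yet: remember (or refresh) the detection
            self.waiting[track_id] = {"track_id": track_id, "bbox": bbox,
                                      "frame": frame_no}

    def end_of_frame(self):
        """Age the buffer; upload unlabelled reports that waited too long."""
        expired = []
        for track_id, report in self.waiting.items():
            report["age"] = report.get("age", 0) + 1
            if report["age"] >= self.max_frames:
                expired.append(track_id)
        for track_id in expired:
            self.upload(track_id, self.waiting.pop(track_id))
```

The timeout covers exactly the case discussed above: a track seen in only one frame will never receive classifier meta, so its report is eventually uploaded without a label instead of waiting forever.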