• How to reproduce the issue? (This is for bugs. Include which sample app is used, the content of the configuration files, the command line used, and other details for reproducing.)
I was trying to infer a single image (one frame) with both a pgie and an sgie in a pipeline and to get the tensor meta output from the sgie. In sgie_pad_buffer_probe I checked the data in obj_meta and verified that the object data is received, but when I checked obj_meta->obj_user_meta_list, it contains no data. The list is empty, so I am not able to get the tensor meta output.
Steps to reproduce this issue:
Run the DeepStream sample app deepstream-infer-tensor-meta-test
Change h264parse to jpegparse at line 645
Run the app: ./deepstream-infer-tensor-meta-app -t infer /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.jpg
Hi, I am already using the image file /opt/nvidia/deepstream/deepstream-6.2/samples/streams/sample_720p.jpg provided in the docker image nvcr.io/nvidia/deepstream:6.2-triton, and I have already confirmed that objects are detected (persons, vehicles, etc.), but there is no tensor meta output from the secondary-gie.
Hi, I actually have a similar case of using a pgie and an sgie (the pgie detects an object and the sgie generates feature vectors), where my implementation needs the sgie to output the tensor meta. The solutions above work on the sample apps, but when I applied the same method in my implementation's configs, I still could not get the output from obj_meta->obj_user_meta_list.
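For completeness, the knob involved here: with a config-file-driven nvinfer sgie, the raw output tensors are only attached to the metadata when output-tensor-meta is enabled. A sketch of the relevant part of an sgie config file (the key names are the standard nvinfer ones; the rest of the file is omitted):

```ini
[property]
# Attach the raw inference output as NvDsInferTensorMeta entries
# on obj_meta->obj_user_meta_list
output-tensor-meta=1
# Run as a secondary GIE, operating on objects produced by the pgie
process-mode=2
```

In the sample app the equivalent setting is made programmatically on the nvinfer element; in a deepstream-app-style config it belongs in the sgie's [property] section.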
Hi, sorry for the late reply. I made some changes in deepstream-infer-tensor-meta-test to reproduce my issue as closely as possible. One issue I managed to replicate: when I change the sink from nveglglessink to fakesink, I could not get obj_meta->obj_user_meta_list. My actual use case links to a msgconvbroker, but connecting to msgconvbroker shows the same behavior as fakesink.
Hi, apologies for the confusion. You are right that there is no issue with the deepstream-infer-tensor-meta-test sample app after changing to fakesink. I actually made changes to the pgie and sgie too; both were referenced from deepstream-app. I have attached a zip file for your reference.
Hi, I am trying to run face detection in the pgie, with feature extraction in the sgie, on a single image, and to get the message (pgie metadata plus sgie tensor meta) output from the message converter broker. The app we are developing is similar to deepstream-app, and when we try to infer on a single image we face this issue. Any suggestions or advice on what to change in deepstream-app to resolve this?
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks