No Output tensor meta from sgie when running single image (1 frame)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 6.2
• TensorRT Version: 8.5.2-1+cuda11.8
• NVIDIA GPU Driver Version (valid for GPU only): 470.199.02
• Issue Type (questions, new requirements, bugs): questions

• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file content, the command line used, and other details for reproducing):

I was trying to infer an image (single frame) with both a pgie and an sgie in a pipeline and to get the tensor meta output from the sgie. In sgie_pad_buffer_probe I checked the data from obj_meta and verified that the object data is received, but when I checked obj_meta->obj_user_meta_list, it contained no data. The list is empty, so I am not able to get the tensor meta output.

Steps to reproduce this issue:

  • Run the DeepStream sample app deepstream-infer-tensor-meta-test:
    • Change h264parse to jpegparse at line 645
    • Run the app: ./deepstream-infer-tensor-meta-app -t infer /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.jpg

It’s not a problem. It's just that no objects are detected in the sample image.

You can try capturing an image with the command below, then use out.jpg as the input.

 ffmpeg -i sample_720p.h264 -ss 00:00:05 -r 1 -vframes 1 out.jpg

Hi, I am already using the image file /opt/nvidia/deepstream/deepstream-6.2/samples/streams/sample_720p.jpg provided in the docker image nvcr.io/nvidia/deepstream:6.2-triton, and I had already checked that there are objects detected (persons, vehicles, etc.), but there is still no tensor meta output from the secondary-gie.

Although objects are detected, they are too small to be processed by the sgie.

You can modify two parameters.

input-object-min-width=30
input-object-min-height=30

Or set this, to make it work for other classes as well.

operate-on-class-ids=0;1;2;3
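For reference, here is a sketch of how those keys sit together in an sgie config's [property] section. The key names are standard Gst-nvinfer configuration keys; the values and the surrounding keys are illustrative only and must match your own models:

```ini
# Illustrative fragment of a hypothetical sgie config
[property]
process-mode=2                 # 2 = operate on objects (secondary mode)
operate-on-gie-id=1            # only process objects produced by the pgie
operate-on-class-ids=0;1;2;3   # widen the class filter
input-object-min-width=30      # lower the size threshold so small
input-object-min-height=30     #   detections are not skipped
output-tensor-meta=1           # attach raw tensor output as user meta
```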

Here is the documentation.

Hi, I actually have a similar case using a pgie and an sgie (the pgie detects an object and then the sgie generates feature vectors), where I need the sgie to output the tensor meta in my implementation. The solutions above work on the sample app, but when I applied the same method to my implementation configs, I still could not get the output from obj_meta->obj_user_meta_list.

Did you modify the configuration file or the source code? Do you use your own models?

This problem can be related to many things, and I can only test on the sample data and code. Can you share your code and configuration files?

Hi, sorry for the late reply. I made some changes to deepstream-infer-tensor-meta-test to reproduce my issue as closely as possible. One issue I managed to replicate: when I change the sink from nveglglessink to fakesink, I can no longer get obj_meta->obj_user_meta_list. My use case is to link to a msgconvbroker, but connecting to a msgconvbroker seems to show the same behavior as fakesink.

This problem is usually not related to the sink, be it fakesink or msgconvbroker.

Is output-tensor-meta=1 set in your sgie configuration file?

Can you share your pgie and sgie configuration files?

Hi, apologies for the confusion. You are right that there is no issue with the deepstream-infer-tensor-meta-test sample app after changing to fakesink. I actually made changes to the pgie and sgie too. Both pgie and sgie were referenced from deepstream-app. I have attached a zip file for your reference.

deepstream-infer-tensor-meta-test.zip (27.6 KB)

Sorry for the late reply.

I tried your code and it looks like there are some problems in the pipeline.

The tensor meta of the pgie can normally be accessed like this:

    g_print ("Access frame meta\n");
    for (NvDsMetaList *l_user0 = frame_meta->frame_user_meta_list; l_user0 != NULL;
        l_user0 = l_user0->next) {
      NvDsUserMeta *user_meta = (NvDsUserMeta *) l_user0->data;
      if (user_meta->base_meta.meta_type != NVDSINFER_TENSOR_OUTPUT_META)
        continue;
      g_print ("Access user meta from tensor\n");
    }

But as you described, the tensor meta of the sgie cannot be accessed.
Maybe the use of tee caused some problems for the sgie.

What is your goal? Maybe I can give some better advice.

Hi, I am trying to run face detection in the pgie, with feature extraction in the sgie, on a single image, and get the message (pgie metadata, sgie tensor meta) output from the message converter broker. The app we develop is similar to deepstream-app, and when we try to infer on a single image, we face this issue. Any suggestions or advice on changes to deepstream-app to resolve this?

Sorry for the late reply.

You can try adding a probe function to the src pad of the sgie, because some plugins will not copy meta to the downstream element.


Hi, just want to check on the solution you mentioned above: is it adding a probe to the src pad circled in red?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Yes, you can try it.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.