Set network-type=100 in DeepStream, how to get data from nvinfer?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version: 6.1
• JetPack Version (valid for Jetson only): 4.6
• TensorRT Version: 7.3
• NVIDIA GPU Driver Version (valid for GPU only): 1
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. for which plugin or which sample application, and the function description.)

I deployed object detection and anomaly detection in a pipeline using DeepStream: object detection as the primary inference (PGIE) and anomaly detection as the secondary inference (SGIE). The secondary inference takes its input from objects with a specific label produced by the primary.
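For context, the pipeline shape is roughly the following gst-launch sketch (the file name, resolution, and config file paths here are placeholders, not my actual files):

```
gst-launch-1.0 uridecodebin uri=file:///path/to/input.mp4 ! m.sink_0 \
  nvstreammux name=m batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=pgie_detector_config.txt ! \
  nvinfer config-file-path=sgie_anomaly_config.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```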

The problems are as follows:
1) The secondary inference (anomaly detection) is neither a detector nor a classifier, so how should I set up the secondary inference config file? (A tentative config sketch follows this list.)

  1. The output layer info of the secondary model (anomaly detection) is an array with shape (c, h, w) = (1, 8, 70). I want a parsing function to handle this array and then output the values, in the same way a detector uses parse-bbox-func-name=parse_bbox_resnet and a classifier uses parse-classifier-func-name=parse_bbox_softmax. (A parser-signature sketch follows further below.)

  2. If I set network-type=0, the model is treated as a detector. I want to use a buffer pointer to get the output layers' info, but it is not clear to me how to access the buffer data through the output layer info. Also, the output data of the layers is stored in a buffer; what determines the order of the stored data?
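For reference, a minimal SGIE config sketch of the kind being asked about might look like the following. The key names are from the nvinfer plugin documentation; the model paths and class IDs are placeholders, not the actual setup:

```ini
[property]
gpu-id=0
# Treat the model as "other": nvinfer attaches the raw output tensors
# instead of parsing detections/classifications itself.
network-type=100
output-tensor-meta=1
# Run as secondary inference on objects from the primary detector.
process-mode=2
operate-on-gie-id=1
operate-on-class-ids=0
# Placeholder model paths.
onnx-file=anomaly_model.onnx
model-engine-file=anomaly_model.onnx_b1_gpu0_fp16.engine
batch-size=1
```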

Could somebody please point me in the right direction?
Any help is much appreciated!
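From what I can tell, the custom parsers mentioned in point 1 only apply when nvinfer itself parses the output (detector/classifier modes); with network-type=100 the application parses the attached tensor meta instead. A detector-style custom parser follows the NvDsInferParseCustomFunc interface from nvdsinfer_custom_impl.h; a sketch of what I mean (the function name NvDsInferParseCustomAnomaly and the handling of the (1,8,70) layer are hypothetical):

```cpp
#include <vector>
#include "nvdsinfer_custom_impl.h"

/* Hypothetical custom parser; name and logic are illustrative only. */
extern "C" bool NvDsInferParseCustomAnomaly (
    std::vector<NvDsInferLayerInfo> const &outputLayersInfo,
    NvDsInferNetworkInfo const &networkInfo,
    NvDsInferParseDetectionParams const &detectionParams,
    std::vector<NvDsInferObjectDetectionInfo> &objectList)
{
  /* outputLayersInfo[0].buffer points to the raw (1,8,70) float output. */
  const float *data = (const float *) outputLayersInfo[0].buffer;
  (void) data; /* ... interpret the 8x70 values and fill objectList ... */
  return true;
}
```

Such a function would be hooked up in the config via parse-bbox-func-name and custom-lib-path.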

Hi @user112251, you can refer to the sample source code to see how to get the data in pgie_pad_buffer_probe or sgie_pad_buffer_probe:

/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-infer-tensor-meta-test/
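The relevant pattern from that sample looks roughly like the sketch below (trimmed, error handling omitted; it assumes output-tensor-meta=1 is set on the SGIE so the raw tensors are attached to each object's user meta):

```cpp
#include <gst/gst.h>
#include "gstnvdsmeta.h"
#include "gstnvdsinfer.h"

static GstPadProbeReturn
sgie_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
  GstBuffer *buf = GST_BUFFER (info->data);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);

  for (NvDsMetaList * l_frame = batch_meta->frame_meta_list; l_frame;
      l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
    for (NvDsMetaList * l_obj = frame_meta->obj_meta_list; l_obj;
        l_obj = l_obj->next) {
      NvDsObjectMeta *obj_meta = (NvDsObjectMeta *) l_obj->data;
      /* SGIE tensor output is attached to the object's user meta list. */
      for (NvDsMetaList * l_user = obj_meta->obj_user_meta_list; l_user;
          l_user = l_user->next) {
        NvDsUserMeta *user_meta = (NvDsUserMeta *) l_user->data;
        if (user_meta->base_meta.meta_type != NVDSINFER_TENSOR_OUTPUT_META)
          continue;
        NvDsInferTensorMeta *tensor_meta =
            (NvDsInferTensorMeta *) user_meta->user_meta_data;
        for (unsigned int i = 0; i < tensor_meta->num_output_layers; i++) {
          NvDsInferLayerInfo *layer = &tensor_meta->output_layers_info[i];
          /* Host copy of the raw output. As far as I know, a
           * (c,h,w) = (1,8,70) FP32 layer is a flat float array in CHW
           * order: index = c*8*70 + h*70 + w. */
          float *data = (float *) tensor_meta->out_buf_ptrs_host[i];
          (void) layer;
          (void) data; /* ... parse the anomaly output here ... */
        }
      }
    }
  }
  return GST_PAD_PROBE_OK;
}
```

That also touches on question 2 above: the buffers hold each output layer exactly as the engine emits it, so the storage order follows the layer's own dimension layout.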

I referred to the sample source code, but I get obj_meta->obj_user_meta_list == NULL, and I do not know why.

Did you set the output-tensor-meta parameter in your config file?

Yeah, I set output-tensor-meta=1 in the secondary inference config file.
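One thing worth double-checking here (an assumption about the cause, not a confirmed diagnosis): the probe must be attached on the SGIE's src pad or further downstream, since the tensor meta only exists after the SGIE has run. A minimal attachment sketch, assuming an element handle named sgie:

```cpp
/* Attach the probe downstream of the SGIE; upstream of it,
 * obj_user_meta_list will not yet contain tensor output meta. */
GstPad *sgie_src_pad = gst_element_get_static_pad (sgie, "src");
gst_pad_add_probe (sgie_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
    sgie_pad_buffer_probe, NULL, NULL);
gst_object_unref (sgie_src_pad);
```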

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

You can try setting network-type to 0 in your secondary config file.
https://forums.developer.nvidia.com/t/how-to-look-at-the-tensor-output-in-deepstream-infer-tensor-meta-test/145985/4
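If you go that route, the change in the SGIE config is just this key, kept together with output-tensor-meta (a sketch; whether a detector parse function is also required in this mode is not confirmed here):

```ini
[property]
network-type=0
output-tensor-meta=1
```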

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.