Please provide complete information as applicable to your setup.
• Jetson Nano
• DeepStream Version 5.0
• JetPack Version 4.4
• TensorRT Version
• Issue Type - Question
Hey, so I have two models running in the pipeline.
- Detector
- Text recognition
I have written a custom parser function for the secondary model, where I store the output string mal in an NvDsInferAttribute variable:
attr.attributeLabel = mal;
attrList.push_back(attr);
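For context, this is roughly what the parser looks like (stripped down, with the actual decoding replaced by a placeholder; the function name is just my own, and I am assuming the standard NvDsInferClassiferParseCustomFunc prototype from nvdsinfer_custom_impl.h):

```cpp
#include <cstring>
#include <string>
#include <vector>
#include "nvdsinfer_custom_impl.h"

/* Custom classifier parser for the text-recognition SGIE.
 * The real decoding of outputLayersInfo is omitted here;
 * "mal" stands in for the decoded string. */
extern "C" bool NvDsInferParseCustomTextRecognition (
    std::vector<NvDsInferLayerInfo> const &outputLayersInfo,
    NvDsInferNetworkInfo const &networkInfo,
    float classifierThreshold,
    std::vector<NvDsInferAttribute> &attrList,
    std::string &descString)
{
  std::string mal = "decoded-text";   /* placeholder for the real decode step */

  NvDsInferAttribute attr;
  attr.attributeIndex = 0;
  attr.attributeValue = 0;
  attr.attributeConfidence = 1.0f;
  /* attributeLabel is a char*, so duplicate the string so it stays
   * valid after this function returns. */
  attr.attributeLabel = strdup (mal.c_str ());
  attrList.push_back (attr);

  descString = mal;
  return true;
}
```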
So, I am unable to figure out how I can access this string in the probe function of the main DeepStream pipeline.
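This is roughly what I am trying in the probe, assuming the attribute label ends up in the object's classifier_meta_list as an NvDsLabelInfo (which may well be the wrong place to look):

```cpp
#include <gst/gst.h>
#include "gstnvdsmeta.h"

static GstPadProbeReturn
sgie_src_pad_buffer_probe (GstPad *pad, GstPadProbeInfo *info, gpointer u_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame; l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
    for (NvDsMetaList *l_obj = frame_meta->obj_meta_list; l_obj; l_obj = l_obj->next) {
      NvDsObjectMeta *obj_meta = (NvDsObjectMeta *) l_obj->data;
      /* Classifier (SGIE) results attached to this object. */
      for (NvDsMetaList *l_class = obj_meta->classifier_meta_list; l_class; l_class = l_class->next) {
        NvDsClassifierMeta *class_meta = (NvDsClassifierMeta *) l_class->data;
        for (NvDsMetaList *l_label = class_meta->label_info_list; l_label; l_label = l_label->next) {
          NvDsLabelInfo *label_info = (NvDsLabelInfo *) l_label->data;
          g_print ("Recognized text: %s\n", label_info->result_label);
        }
      }
    }
  }
  return GST_PAD_PROBE_OK;
}
```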
- Are there any other steps to be followed for attaching metadata?
- I went through /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-infer-tensor-meta-test/deepstream_infer_tensor_meta_test.cpp, and it seems to parse the output tensors in the application itself. Can the same be achieved through a custom parser function, or is parsing in the application the only way?
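From what I understood from that sample, the raw output tensors are attached as user meta on each object (with output-tensor-meta enabled on the gie) and are then decoded in the probe, roughly like this (only the inner per-object part shown; the outer frame/object loops are the same as above):

```cpp
#include "gstnvdsmeta.h"
#include "gstnvdsinfer.h"   /* NvDsInferTensorMeta, NVDSINFER_TENSOR_OUTPUT_META */

/* Called from the per-object loop of the probe. */
static void
parse_text_from_tensor_meta (NvDsObjectMeta *obj_meta)
{
  for (NvDsMetaList *l_user = obj_meta->obj_user_meta_list; l_user; l_user = l_user->next) {
    NvDsUserMeta *user_meta = (NvDsUserMeta *) l_user->data;
    if (user_meta->base_meta.meta_type != NVDSINFER_TENSOR_OUTPUT_META)
      continue;

    NvDsInferTensorMeta *tensor_meta = (NvDsInferTensorMeta *) user_meta->user_meta_data;
    /* tensor_meta->out_buf_ptrs_host[i] holds the raw host buffer of
     * output layer i, which the sample application then decodes itself. */
  }
}
```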
Thanks in advance :)