I have a requirement where I need to run inference with a custom classifier. I built the TensorRT engine and am running it with deepstream-app for testing. This is the signature of my custom parser:
bool NvDsInferCRNN (std::vector<NvDsInferLayerInfo> const &outputLayersInfo,
                    NvDsInferNetworkInfo const &networkInfo,
                    float classifierThreshold,
                    std::vector<NvDsInferAttribute> &attrList,
                    std::string &descString);
I am able to parse the nvinfer tensor output correctly, and the final "descString" is also correct (I added several std::cout prints inside the parser function to validate this), but I am not able to display it on screen.
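For context, this is roughly what the body of my parser does. The actual CRNN/CTC decoding of the output tensor is stripped out here and replaced with a placeholder string; only the metadata handling is shown, following the pattern of the SDK sample classifier parser:

#include <cstring>
#include <string>
#include <vector>
#include "nvdsinfer_custom_impl.h"

extern "C" bool NvDsInferCRNN(std::vector<NvDsInferLayerInfo> const &outputLayersInfo,
                              NvDsInferNetworkInfo const &networkInfo,
                              float classifierThreshold,
                              std::vector<NvDsInferAttribute> &attrList,
                              std::string &descString)
{
    (void) networkInfo;          /* unused in this simplified sketch */
    (void) classifierThreshold;  /* unused in this simplified sketch */

    /* Decode the output tensor (outputLayersInfo[0].buffer) into the
     * recognized text; the real CRNN/CTC decoding is omitted here. */
    std::string decoded = "DECODED_TEXT";  /* placeholder for the CRNN result */

    /* Attach the result as an attribute; nvinfer turns the entries of
     * attrList into classifier label metadata downstream. */
    NvDsInferAttribute attr;
    attr.attributeIndex = 0;
    attr.attributeValue = 0;
    attr.attributeConfidence = 1.0f;
    attr.attributeLabel = strdup(decoded.c_str());  /* same strdup pattern as the SDK sample parser */
    attrList.push_back(attr);

    descString = decoded;
    return true;
}

/* Compile-time check that the function matches the custom classifier parser prototype. */
CHECK_CUSTOM_CLASSIFIER_PARSE_FUNC_PROTOTYPE(NvDsInferCRNN);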
When I test the primary classifier example (/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app-trtis/source1_primary_classifier.txt), the labels show up in the OSD, but nothing is drawn in my custom case. What am I missing?
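For reference, the relevant part of the nvinfer config I use for this classifier looks roughly like this (the engine path, custom library path, and output blob name below are placeholders, not my exact values):

[property]
gpu-id=0
model-engine-file=crnn.engine
# 1 = classifier
network-type=1
# 1 = operate on full frames (primary); 2 = operate on detected objects (secondary)
process-mode=1
gie-unique-id=1
output-blob-names=output
classifier-threshold=0.2
parse-classifier-func-name=NvDsInferCRNN
custom-lib-path=libnvds_infercustomparser_crnn.so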