Hi @snoop221 ,
The template is right. The problem is setting the right values, but we are not sure which values you used for the parameters below during training.
force-implicit-batch-dim= // does your engine use a fixed batch size?
model-color-format= // the input is RGB, not BGR, right?
operate-on-gie-id= // the "gie-unique-id" of the detector is 2, right?
Basically, for most of the properties you need to check what you used in training and what the model expects, and set the corresponding values in this config file. From the info you provided above, it is hard for us to tell what you used.
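For illustration only, here is a minimal sketch of a secondary-gie (classifier) nvinfer config with those properties filled in. Every value below is an assumption (IDs, dimensions, normalization numbers, class IDs) and must be replaced with whatever matches your training and your pipeline:

```ini
[property]
gpu-id=0
# normalization: must match what you used in training (see discussion below)
net-scale-factor=0.017354
offsets=123.675;116.28;103.53
model-color-format=0            # 0=RGB, 1=BGR; must match training
infer-dims=3;224;224            # example input dims
network-type=1                  # 1 = classifier
process-mode=2                  # 2 = SGIE, operate on detector objects
gie-unique-id=3                 # example unique id for this classifier
operate-on-gie-id=2             # unique id of your upstream detector
operate-on-class-ids=1          # which detector classes to classify
output-blob-names=predictions/Softmax
classifier-threshold=0.5
```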
I used a pretrained EfficientNet-B0. I resized images to 3x224x224 and normalized with mean=[0.485, 0.456, 0.406] and std=[0.229, 0.224, 0.225]. How do I compute the net-scale-factor and offsets?
5. Right. I classify bboxes from the detector. How do I choose the proper process-mode number for the classifier and the detector? What if I add one more classifier for bboxes from the detector?
What about the other properties? Did I choose operate-on-class-ids=1 correctly? What if I want to classify all classes from the detector? Also, what about classifier-async-mode? How do I use it? I'm a bit confused by the definition in the documentation.
About the post-processor for the classifier: I normalize and resize the image with the same parameters as above, then take the top class from the softmax and show its label. I think DeepStream has this kind of simple built-in post-processor for classifiers, so there is no need to write a custom one; I think output-blob-names=predictions/Softmax handles it. Correct me if I'm wrong.
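For reference, the top-1 logic described above is simple enough to sketch in a few lines. This is an illustrative stand-in for what a built-in classifier parser would do with a softmax output, not DeepStream's actual implementation; the function name, label list, and threshold are hypothetical:

```python
import numpy as np

def parse_classifier_output(probs, labels, threshold=0.5):
    """Pick the top class from a softmax output.

    Illustrative sketch of top-1 classifier post-processing:
    argmax over the probabilities, then drop the result if it
    falls below the confidence threshold.
    """
    probs = np.asarray(probs)
    top = int(np.argmax(probs))
    if probs[top] < threshold:
        return None  # below threshold: attach no label
    return labels[top], float(probs[top])

labels = ["car", "bicycle", "person"]  # hypothetical label file contents
print(parse_classifier_output([0.1, 0.7, 0.2], labels))
```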
y = net-scale-factor*(x-mean)
Here, mean is the corresponding per-channel mean value, read either from the mean file or from offsets[c].
So there is only one net-scale-factor, but there can be three different mean values, one per channel, set via offsets, e.g. offsets=77.5;21.2;11.8
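Given that formula, the torchvision-style normalization from training can be rewritten to match it. A small sketch of the arithmetic (note the caveat in the comments: since there is only one net-scale-factor but three different std values, a single scale is necessarily an approximation; averaging the per-channel scales is one reasonable choice):

```python
# torchvision Normalize does: y = (x/255 - mean[c]) / std[c]
# which rearranges to:        y = (1 / (255*std[c])) * (x - 255*mean[c])
# DeepStream's form is:       y = net-scale-factor * (x - offsets[c])
# so offsets[c] = 255*mean[c], and the per-channel scale is 1/(255*std[c]).
# nvinfer accepts only ONE net-scale-factor, so when the std values differ
# (as they do here) a single scale is an approximation; we average them.

mean = [0.485, 0.456, 0.406]
std = [0.229, 0.224, 0.225]

offsets = [255.0 * m for m in mean]           # roughly 123.675;116.28;103.53
scales = [1.0 / (255.0 * s) for s in std]     # exact per-channel scales
net_scale_factor = sum(scales) / len(scales)  # single-value approximation

print("offsets=" + ";".join(f"{o:g}" for o in offsets))
print(f"net-scale-factor={net_scale_factor:.6f}")
```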