Hi, I have trained a classifier with the jetson-inference package and exported it to an ONNX file, which I am now trying to use with DeepStream. I can run DeepStream with the config below and extract metadata with the Python bindings (pyds.so) by walking NvDsFrameMeta -> NvDsObjectMeta -> NvDsClassifierMeta -> NvDsLabelInfo. I have also changed the labels file to the classifier format label1;label2.
However, NvDsLabelInfo only ever reports label1 with a probability of 1.0, and its num_classes field is 0. Could you please advise? Thanks!
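For reference, this is roughly how I am walking the metadata in my pad-probe callback (a sketch, not my full app; pyds and gi are imported inside the function only so the snippet stands alone, and the probe placement is my own choice):

```python
def classifier_probe(pad, info, u_data):
    # Imported here only to keep this sketch self-contained.
    import pyds
    from gi.repository import Gst

    gst_buffer = info.get_buffer()
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))

    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            l_cls = obj_meta.classifier_meta_list
            while l_cls is not None:
                cls_meta = pyds.NvDsClassifierMeta.cast(l_cls.data)
                l_label = cls_meta.label_info_list
                while l_label is not None:
                    label_info = pyds.NvDsLabelInfo.cast(l_label.data)
                    # This is where I see result_label == "label1",
                    # result_prob == 1.0 and num_classes == 0.
                    print(label_info.result_label,
                          label_info.result_prob,
                          label_info.num_classes)
                    l_label = l_label.next
                l_cls = l_cls.next
            l_obj = l_obj.next
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK
```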
Config file:
[property]
gpu-id=0
net-scale-factor=1
onnx-file=models/resnet18.onnx
labelfile-path=labels.txt
force-implicit-batch-dim=0
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
# network-type: 1=classifier
network-type=1
gie-unique-id=1
classifier-threshold=0.5
# process-mode: 1 - inference on the whole frame, 2 - inference on crops from the primary detector
process-mode=1
I am running DeepStream 5.1 with JetPack 4.5.1 on a Xavier NX.
I am also trying to write a custom classification parser as you suggested, using the files at:
/opt/nvidia/deepstream/deepstream/sources/libs/nvdsinfer_customparser
The README suggests it should also work with ResNet-18. Currently, the parser reports:
numAttributes: 1
numClasses: 2
This looks OK, but the probability read as float probability = outputCoverageBuffer[c] is always exactly 1 or 0.
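One check I am doing outside DeepStream: if outputCoverageBuffer holds raw logits, softmax over them should give probabilities strictly between 0 and 1, so values pinned at exactly 1 or 0 would suggest the exported graph already ends in a softmax/argmax, or that a preprocessing mismatch is saturating the logits. A quick sketch of that check (the raw values below are placeholders, not my actual model outputs):

```python
import math

def softmax(logits):
    """Convert raw network outputs (logits) to probabilities."""
    # Subtract the max before exponentiating for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Placeholder values standing in for the two entries of outputCoverageBuffer.
raw = [2.3, -1.1]
probs = softmax(raw)
print(probs)  # two probabilities that sum to 1.0
```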
Hi, after more experimenting, it seems that net-scale-factor and offsets make the outputs more sensitive, but the results are still poor. I have read the docs and understand what these parameters mean, but I am unsure how to calculate them for my model.
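My current understanding is that nvinfer preprocesses each pixel as y = net-scale-factor * (x - offset). Assuming my jetson-inference model was trained with the standard torchvision ImageNet normalization (mean [0.485, 0.456, 0.406], std [0.229, 0.224, 0.225] on 0-1 inputs, which is an assumption about my training setup), I am deriving candidate config values like this; since the config takes only one scalar scale factor, I average the per-channel stds:

```python
# Map torchvision-style normalization, (x/255 - mean) / std,
# onto DeepStream's form, scale * (x - offset).
mean = [0.485, 0.456, 0.406]   # assumed training means (0-1 range)
std = [0.229, 0.224, 0.225]    # assumed training stds (0-1 range)

# Offsets are the means expressed in 0-255 pixel units.
offsets = [m * 255.0 for m in mean]

# Only a single scalar net-scale-factor is supported, so average the stds.
avg_std = sum(std) / len(std)
net_scale_factor = 1.0 / (255.0 * avg_std)

print("offsets=%.3f;%.3f;%.3f" % tuple(offsets))
print("net-scale-factor=%.7f" % net_scale_factor)
```

This gives offsets of roughly 123.675;116.28;103.53 and a net-scale-factor of roughly 0.0173521. It is only an approximation, since a single scale factor cannot express per-channel stds exactly, and the channel order (model-color-format) would also need to match training, which is another assumption on my part.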