I have trained an image classification model (network type Mobilenet_v2, TensorFlow backend) using the nvidia-tao toolkit. I was able to get the TLT files and convert them into ONNX files using the following commands:
tao model classification_tf2 train -e /workspace/tao-experiments/specs/spec.yaml --gpus 1
tao model classification_tf2 export -e /workspace/tao-experiments/specs/spec.yaml --gpus 1
I then generated the TensorRT engine file using the command:
tao deploy classification_tf2 gen_trt_engine -e /workspace/tao-experiments/specs/spec.yaml
When I run inference in DeepStream with the generated engine file, the classifier meta I get back is None. I have attached the SGIE config file and the DeepStream code I am using. Can anyone help me resolve this issue?
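For reference, this is roughly how I read the classifier meta in my probe. It is only a simplified sketch of the standard pyds pattern, not the exact attached code; the probe name and pipeline wiring are illustrative.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds


def sgie_src_pad_buffer_probe(pad, info, user_data):
    """Probe on the SGIE src pad that prints classifier results per object."""
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            # If the SGIE attached results to this object, this list is non-None.
            l_cls = obj_meta.classifier_meta_list
            while l_cls is not None:
                cls_meta = pyds.NvDsClassifierMeta.cast(l_cls.data)
                l_label = cls_meta.label_info_list
                while l_label is not None:
                    label_info = pyds.NvDsLabelInfo.cast(l_label.data)
                    print(label_info.result_label, label_info.result_prob)
                    try:
                        l_label = l_label.next
                    except StopIteration:
                        break
                try:
                    l_cls = l_cls.next
                except StopIteration:
                    break
            try:
                l_obj = l_obj.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
```

In my pipeline, obj_meta.classifier_meta_list is always None here, so the SGIE never seems to attach any NvDsClassifierMeta to the detected objects.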
myoutput.txt (2.7 MB)
I am attaching the log file from when I ran tao deploy classification_tf2 gen_trt_engine -e /workspace/tao-experiments/specs/spec.yaml
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks