Proper configuration for training and using the EfficientNet image classification model?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) → GPU
• DeepStream Version → 5.1
• JetPack Version (valid for Jetson only)
• TensorRT Version → 7.2
• NVIDIA GPU Driver Version (valid for GPU only) → 455.32.00
• Issue Type (questions, new requirements, bugs) → Question: how to properly set up the spec file for training an EfficientNet image classification model, and the config for running that model in DeepStream
• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name — for which plugin or for which sample application — and the function description.)

The inference results produced by DeepStream and by the TAO inference.py script are different. How should the spec files on both sides be configured so that both consistently produce the same inference result?

I have attached the spec file for training the EfficientNet image classification model and the spec file for running the classifier as a secondary inference engine (SGIE) in DeepStream.
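For readers without the attachments, here is a minimal sketch of what an SGIE nvinfer config for a TAO classification model typically looks like. The property names come from the Gst-nvinfer documentation; the paths, key, blob names, input dims, and preprocessing values are illustrative assumptions, not taken from the attached files:

```
[property]
gpu-id=0
# Preprocessing MUST match what TAO applied during training/inference.py,
# otherwise the two pipelines will disagree. Values below are placeholders.
net-scale-factor=0.017352
offsets=123.675;116.28;103.53
model-color-format=0             # 0=RGB, 1=BGR
tlt-encoded-model=<path to .etlt>
tlt-model-key=<your key>
labelfile-path=<path to labels.txt>
uff-input-blob-name=input_1
uff-input-dims=3;224;224;0
output-blob-names=predictions/Softmax
network-type=1                   # 1 = classifier
process-mode=2                   # 2 = secondary (operates on detected objects)
network-mode=0                   # 0=FP32, 1=INT8, 2=FP16
classifier-threshold=0.2
```

The input/output blob names shown are the ones commonly used by TAO classification exports, but verify them against your own exported model.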
sgie_efficientnet_tlt_config.txt (2.9 KB)

classification_dvd_spec.cfg (1.3 KB)

1. Can you elaborate on the difference? Is there any error in the DeepStream test result?
2. You can run deepstream-test2 to check how the SGIE works, replacing your model for testing. Please refer to the parameter introduction doc: Gst-nvinfer — DeepStream 6.1.1 Release documentation

  1. There’s no difference. I am using the default settings. There’s also no error in the DeepStream test result.
  2. Yep, I have tried following the config file in deepstream-test2 and have tried almost all the parameters in the docs. The classifier still does not output the correct result.

1. About “The inference result produced by Deepstream and Tao inference.py script is different”: is the result of TAO inference.py correct, and is the DeepStream result wrong?
2. If the result of TAO inference.py is correct, please check the parameters in DeepStream again. You can read the demos to learn how to test a TAO model in DeepStream; please refer to GitHub - NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream
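A frequent cause of TAO-vs-DeepStream mismatches is preprocessing. The sketch below shows how TAO’s torch-style normalization (ImageNet per-channel mean/std, which is an assumption about your training spec, not confirmed by it) maps onto nvinfer’s `net-scale-factor` and `offsets` parameters:

```python
# nvinfer preprocessing: y = net_scale_factor * (x - offset), per channel.
# TAO "torch" mode:      y = (x / 255 - mean) / std, per channel.
# Since nvinfer in DS 5.1 takes a single scalar net-scale-factor, the
# per-channel std must be approximated by its average. The two match when:
#   offset           = mean * 255
#   net_scale_factor = 1 / (std_avg * 255)

IMAGENET_MEAN = [0.485, 0.456, 0.406]  # RGB order; assumed, check your spec
IMAGENET_STD = [0.229, 0.224, 0.225]

offsets = [m * 255.0 for m in IMAGENET_MEAN]
std_avg = sum(IMAGENET_STD) / len(IMAGENET_STD)
net_scale_factor = 1.0 / (std_avg * 255.0)

print("offsets =", ";".join(f"{o:.3f}" for o in offsets))
print("net-scale-factor =", f"{net_scale_factor:.6f}")
```

If the values you derive this way differ from what is in your sgie config, that alone can explain divergent classifications. Also double-check `model-color-format` (RGB vs BGR) against the training pipeline.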

Moving this topic from the DeepStream forum to the TAO forum.

Hi @yuanxintaxon
For the TAO classification model, please refer to the settings in Issue with image classification tutorial and testing with deepstream-app - #21 by Morganh

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.