DeepStream configuration file (Secondary GIE) for TAO classification training

• Hardware Platform (Jetson / GPU)
Jetson AGX Xavier
• DeepStream Version
DeepStream 6
• JetPack Version (valid for Jetson only)
JetPack 4.6
• TensorRT Version
TensorRT 8
• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing the issue.)

  • Train the classification/classification.ipynb using Jupyter Notebook
  • Use DeepStream test2 sample for inference

• Issue Type (questions, new requirements, bugs)
I downloaded the TAO Toolkit Computer Vision Samples from NVIDIA NGC
I successfully trained the classification/classification.ipynb using Jupyter Notebook. However, there is no sample configuration file to deploy on DeepStream. I tried to write a configuration file but nothing work. There are parameters like net-scale-factor, offsets, model-color-format which maybe I put wrong. Could you please share with me the configuration file to run that sample on Deepstream as secondary model?

Which model do you prefer?

@Fiona.Chen If you take a closer look at the repository, you will notice that all the provided configuration files are for object detection as the primary GIE only; there are no samples for image classification as a primary or secondary GIE.
deepstream_tao_apps/configs at master · NVIDIA-AI-IOT/deepstream_tao_apps (github.com)
The only classifier config files I found are the Caffe-based ones in deepstream_test2, but there are none for TAO.

To change a PGIE into an SGIE, you only need to change the nvinfer config file: set “process-mode=2” (secondary mode) and set “operate-on-gie-id” to the gie-unique-id of the PGIE whose detected objects you want to classify.

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinfer.html#id2
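For reference, below is a minimal nvinfer SGIE config sketch for a TAO classification model such as the ResNet18 classifier produced by the notebook. The file names, the model key, the input/output layer names, the input dimensions, and the operate-on-gie-id / gie-unique-id values are assumptions; adjust them to match your exported .etlt, your training spec, and your pipeline.

```
[property]
gpu-id=0
# TAO classification preprocessing (Caffe-style BGR mean subtraction) -- verify against your training spec
net-scale-factor=1.0
offsets=103.939;116.779;123.68
model-color-format=1
# Assumed file names and key -- replace with your exported model, label file, and the key used in the notebook
tlt-encoded-model=final_model.etlt
tlt-model-key=YOUR_KEY
labelfile-path=labels.txt
# Assumed input/output layer names and dimensions for the ResNet18 classifier
infer-dims=3;224;224
uff-input-blob-name=input_1
output-blob-names=predictions/Softmax
batch-size=16
# 0=FP32, 1=INT8, 2=FP16
network-mode=0
# network-type=1 -> classifier
network-type=1
# process-mode=2 -> secondary GIE (operates on objects detected by the PGIE)
process-mode=2
# gie-unique-id of the primary detector in your pipeline (assumption)
operate-on-gie-id=1
gie-unique-id=2
classifier-threshold=0.2
```

nvinfer can build the TensorRT engine directly from the .etlt on first run (using tlt-model-key), or you can pre-generate an engine with tao-converter on the Xavier and point model-engine-file at it.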

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.