Update the docs for deepstream_tao_apps

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) dGPU, NVIDIA GeForce RTX 4090
• DeepStream Version Docker container nvcr.io/nvidia/deepstream:7.1-triton-multiarch
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only) 560.70
• Issue Type( questions, new requirements, bugs) Outdated documentation
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

On Windows 11 with WSL2, the Ubuntu-22.04 distro:

docker pull nvcr.io/nvidia/deepstream:7.1-triton-multiarch
xhost +
docker run --gpus all -it --rm --net=host --privileged -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY -w /opt/nvidia/deepstream/deepstream-7.1 nvcr.io/nvidia/deepstream:7.1-triton-multiarch
./user_deepstream_python_apps_install.sh -b
./user_additional_install.sh
cd sources/deepstream_python_apps/apps/deepstream-test3

The README of this app says:

To setup peoplenet model and configs (optional):
Please follow instructions in the README located here : /opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/README

Now, cat /opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/README and it says:

Please refer to https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps/deepstream_app_tao_configs/README.md for the details.

Let’s follow the link. It is broken: GitHub returns a 404, because the URL points at a repository path without a branch segment.

I’ve figured out that the right path is deepstream_tao_apps/deepstream_app_tao_configs/README.md at master · NVIDIA-AI-IOT/deepstream_tao_apps. But that README in turn references deepstream_reference_apps/deepstream_app_tao_configs/, which is incorrect, because that folder does not exist in the https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps.git repo right now. It was deleted in the DS_7.1 release.
Ok, let’s clone the 7.0 release instead by passing -b DS_7.0 and continue following the instructions. But then, when running ./prepare_triton_models.sh, trtexec fails.

My request is to update the instructions, so it is clear how to set up deepstream-test3 python app with peoplenet and triton. Thank you.

• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Thanks for pointing out the inconsistency; we will check and update.


You’re welcome. I am glad to contribute to DeepStream.

Thanks for your feedback. This is indeed a documentation issue.
For DS-7.1, here is how to run the deepstream-test3 Python app with PeopleNet and Triton.

Apply the following patch to config_triton_infer_primary_peoplenet.txt.

diff --git a/apps/deepstream-test3/config_triton_infer_primary_peoplenet.txt b/apps/deepstream-test3/config_triton_infer_primary_peoplenet.txt
index 9645398..b9706d4 100644
--- a/apps/deepstream-test3/config_triton_infer_primary_peoplenet.txt
+++ b/apps/deepstream-test3/config_triton_infer_primary_peoplenet.txt
@@ -28,10 +28,10 @@ infer_config {
       {name: "output_cov/Sigmoid:0"}
     ]
     triton {
-      model_name: "peoplenet"
+      model_name: "peopleNet"
       version: -1
       model_repo {
-        root: "/opt/nvidia/deepstream/deepstream/samples/triton_model_repo"
+        root: "/opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/triton"
         strict_model_config: true
       }
     }
@@ -51,7 +51,7 @@ infer_config {
   }
 
   postprocess {
-    labelfile_path: "/opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/labels_peoplenet.txt"
+    labelfile_path: "/opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/triton/peopleNet/labels.txt"
     detection {
       num_detected_classes: 4
       per_class_params {

Then fetch the TAO configs and prepare the Triton models:

git clone https://github.com/NVIDIA-AI-IOT/deepstream_tao_apps.git
cd deepstream_tao_apps/deepstream_app_tao_configs
cp -a * /opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/
cd /opt/nvidia/deepstream/deepstream/samples/configs/tao_pretrained_models/
./prepare_triton_models.sh 
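After prepare_triton_models.sh finishes, a quick way to confirm it produced a usable repo is to check the standard Triton model-repository layout: each model directory needs a config.pbtxt and at least one numbered version subdirectory. The sketch below demonstrates the check on a mock repo under a temp directory; the peopleNet name and model.plan file are assumptions for illustration. Point REPO at the real tao_pretrained_models/triton directory to check the actual setup.

```shell
# Mock repo standing in for .../tao_pretrained_models/triton (hypothetical layout)
REPO=$(mktemp -d)
mkdir -p "$REPO/peopleNet/1"
touch "$REPO/peopleNet/config.pbtxt" "$REPO/peopleNet/1/model.plan"

# Check each model directory for config.pbtxt and a numbered version dir
for model in "$REPO"/*/; do
  name=$(basename "$model")
  if [ -f "$model/config.pbtxt" ] && ls -d "$model"[0-9]* >/dev/null 2>&1; then
    echo "$name: OK"
  else
    echo "$name: missing config.pbtxt or version directory"
  fi
done
```

If a model prints the "missing" line, the prepare script likely failed partway (e.g., at the trtexec step mentioned above) and the engine file was never generated.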

cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/deepstream-test3

python3 deepstream_test_3.py --no-display -i file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 -g nvinferserver -c config_triton_infer_primary_peoplenet.txt
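One more consistency check worth doing before launching: the patched config sets num_detected_classes: 4, so the label file it points at should contain exactly four entries. The snippet below sketches that check on a mock label file (the class names are placeholders, not the real PeopleNet labels); set LABELS to the real triton/peopleNet/labels.txt path to check your setup.

```shell
# Mock label file standing in for triton/peopleNet/labels.txt
LABELS=$(mktemp)
printf 'class0\nclass1\nclass2\nclass3\n' > "$LABELS"   # hypothetical names

# Count non-empty lines and compare against num_detected_classes in the config
classes=$(grep -c . "$LABELS")
if [ "$classes" -eq 4 ]; then
  echo "labels match num_detected_classes=4"
else
  echo "mismatch: $classes labels vs num_detected_classes=4"
fi
```

A mismatch here usually shows up at runtime as wrong or missing class names on the detected objects.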

Now that is helpful! Just a few steps, and PeopleNet with Triton is usable, instead of many redirections with nothing working in the end. Except that it should be cd deepstream_tao_apps/deepstream_app_tao_configs after the git clone ... command.

Thank you very much!