Raw output is not saved!

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
NVIDIA Titan X
• DeepStream Version
5.1 latest Docker image (nvcr.io/nvidia/deepstream:5.1-21.02-triton)
• JetPack Version (valid for Jetson only)
• TensorRT Version
The default for this image, I believe: 7.2.1
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (For bugs: which sample app is used, the contents of the configuration files, the command line used, and other details needed to reproduce.)
• Requirement details (For new requirements: the module name, i.e. which plugin or sample application, and the function description.)

I have a strange problem: I set the infer-raw-output-dir parameter in the source file, but the network outputs are not saved to the given directory. Similarly, when I add that parameter to the DenseNet example, it also does not save the results to the given directory. In contrast, the Faster R-CNN example works, i.e. it saves .bin files in the given directory.
I am running the Docker image with:
sudo docker run --gpus all -it --rm -v /path/to/other/mount_dir/video:/video -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY -w /opt/nvidia/deepstream/deepstream-5.1 nvcr.io/nvidia/deepstream:5.1-21.02-triton

And I run the model with: deepstream-app -c source1_primary_mymodel.txt
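For reference, this is roughly how the parameter sits in my deepstream-app config, a sketch of the [primary-gie] section only; the file names and the output path are placeholders, and the directory must already exist and be writable inside the container:

```ini
[primary-gie]
enable=1
# nvinferserver (Triton) config for the model; file name is illustrative
config-file=config_infer_mymodel.txt
# directory for the raw .bin tensor dumps; must exist and be writable
infer-raw-output-dir=/video/raw_output
```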

Below I have attached the corresponding files; any help would be appreciated :)

source1_primary_mymodel.txt (2.1 KB) config_infer_mymodel.txt (849 Bytes) config.pbtxt (436 Bytes)


I've tested infer-raw-output-dir with the DeepStream sample models in /opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app-trtis in the nvcr.io/nvidia/deepstream:5.1-21.02-triton container, and it works well.

Please refer to our samples.


Does the DenseNet example also work in your experiments? I will try a clean install then. Thanks.

All the samples in /opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app-trtis work.