Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) - Tesla T4
• DeepStream Version - Deepstream-6.2
• JetPack Version (valid for Jetson only)
• TensorRT Version - 220.127.116.11
• NVIDIA GPU Driver Version (valid for GPU only) - 525.85.12
• Issue Type (questions, new requirements, bugs) - Question
• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)
My application is a simple one with the following pipeline:
Source → Scaling → Decoding → Inference → Sink
While I am able to observe the inferences as bounding boxes when I add the nvdsosd plugin, I wanted to ask how I can observe these inferences as raw data (for example, as print messages on the console itself)?
I tried enabling the raw-output-file flag in the config file, but it did not do anything. I am also confused about this particular flag: it is documented as the “Pathname of raw inference output file”, yet its type is Boolean.
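For context, this is roughly the fragment I tried (a sketch only; the group and key names are assumed from the deepstream-app reference configuration, and `raw-output-file-write` may be the Boolean I actually mean):

```ini
# Hypothetical deepstream-app config fragment (names assumed from the
# reference app). raw-output-file-write appears to be a Boolean toggle,
# not a pathname.
[primary-gie]
enable=1
config-file=config_infer_primary.txt
# When enabled, my understanding is that gst-nvinfer dumps raw tensor
# output to files in the working directory; no filename is given here.
raw-output-file-write=1
```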
I did find other answers as well, such as using an “osd_sink_pad_buffer_probe”. However, I am not sure how to add one if I run my application purely from the GStreamer command line (gst-launch-1.0), without writing any application code.
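From the samples, the probe approach seems to require application code along these lines (a sketch adapted from the deepstream_test1.py Python sample; the `pyds` names are assumed from the DeepStream Python bindings), which is why I do not see how to use it with a plain gst-launch-1.0 pipeline:

```python
# Sketch of the osd_sink_pad_buffer_probe approach, adapted from the
# deepstream_test1.py sample. Requires the DeepStream Python bindings
# (pyds); the import is guarded so the sketch reads standalone.
try:
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    import pyds
except (ImportError, ValueError):  # bindings only exist on a DeepStream install
    Gst = pyds = None

def osd_sink_pad_buffer_probe(pad, info, u_data):
    """Print each detection as raw text instead of only drawing it."""
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    # Walk batch meta -> frame meta -> object meta attached by nvinfer.
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj = pyds.NvDsObjectMeta.cast(l_obj.data)
            r = obj.rect_params
            print(f"frame={frame_meta.frame_num} class={obj.class_id} "
                  f"conf={obj.confidence:.2f} bbox=({r.left:.0f},{r.top:.0f},"
                  f"{r.width:.0f},{r.height:.0f})")
            l_obj = l_obj.next
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK

# The probe is attached to the sink pad of an element near the end of
# the pipeline, e.g.:
# nvosd.get_static_pad("sink").add_probe(
#     Gst.PadProbeType.BUFFER, osd_sink_pad_buffer_probe, 0)
```

As far as I can tell, this needs a small Python (or C) application to attach the probe, not just a command line.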
Can you please suggest what to do?