Set NvInfer 'network-type' NULL or N/A for raw tensor output

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
GPU
• DeepStream Version
6.0
• JetPack Version (valid for Jetson only)
• TensorRT Version
8
• NVIDIA GPU Driver Version (valid for GPU only)
510.06
• Issue Type (questions, new requirements, bugs)
question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details needed to reproduce the issue.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and a description of the function.)

Hi,

I am using a custom onnx model with NvInfer.
I am reading the raw tensor output by setting ‘output-tensor-meta=1’.
Currently ‘network-type’ has to be set to 0, 1, 2 or 3, i.e. Detector, Classifier, Segmentation or Instance Segmentation.
Is it possible to set the NvInfer ‘network-type’ to null or similar for custom models that fit none of the above categories, require no post-processing by the NvInfer plugin, and whose raw tensor outputs are only read from NvDsBatchMeta?
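
For context, this is roughly how I read the tensors today, via a probe on the nvinfer source pad (just a sketch; the probe name is mine and I assume FP32 outputs):

```c
#include <gst/gst.h>
#include "gstnvdsmeta.h"
#include "gstnvdsinfer.h"

/* Sketch of a src-pad probe on nvinfer that walks the batch meta and
 * reads the raw output tensors attached when output-tensor-meta=1. */
static GstPadProbeReturn
infer_src_pad_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame; l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;

    for (NvDsMetaList *l_user = frame_meta->frame_user_meta_list; l_user; l_user = l_user->next) {
      NvDsUserMeta *user_meta = (NvDsUserMeta *) l_user->data;
      if (user_meta->base_meta.meta_type != NVDSINFER_TENSOR_OUTPUT_META)
        continue;

      NvDsInferTensorMeta *tensor_meta =
          (NvDsInferTensorMeta *) user_meta->user_meta_data;

      /* One entry per output layer of the ONNX model. */
      for (guint i = 0; i < tensor_meta->num_output_layers; i++) {
        NvDsInferLayerInfo *layer = &tensor_meta->output_layers_info[i];
        /* Assuming FP32 output; layer->inferDims gives the tensor shape. */
        float *host_buf = (float *) tensor_meta->out_buf_ptrs_host[i];

        /* Custom post-processing on host_buf goes here. */
        g_print ("frame %d, layer %s, first value %f\n",
            frame_meta->frame_num, layer->layerName,
            host_buf ? host_buf[0] : 0.0f);
      }
    }
  }
  return GST_PAD_PROBE_OK;
}
```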

Thanks,

You can set “network-type=100”; 100 is for customized models, so nvinfer applies none of its built-in post-processing and you can consume the raw tensor output directly.
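
For example, the relevant part of the nvinfer config file would look something like this (the file names are placeholders for your own model):

```
[property]
onnx-file=model.onnx
model-engine-file=model.onnx_b1_gpu0_fp16.engine
# 100 = customized/other model; nvinfer does no post-processing
network-type=100
# attach the raw output tensors to the metadata
output-tensor-meta=1
```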
