Can nvinferserver-element infer against tensors made by nvdspreprocess?

Please provide complete information as applicable to your setup.

**• Hardware Platform (Jetson / GPU):** dGPU
**• DeepStream Version:** 6.0.1
**• JetPack Version (valid for Jetson only):** None
**• TensorRT Version:** 8.0.1
**• NVIDIA GPU Driver Version (valid for GPU only):** 495.29.05
**• Issue Type (questions, new requirements, bugs):** questions
**• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file content, the command line used, and other details for reproducing):** None
**• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description):** None


Can nvinferserver-element infer against tensors made by nvdspreprocess?

If not, can IInferCustomProcessor do the same preprocessing as libcustom2d_preprocess.so?
(My concern is that IInferCustomProcessor does not modify the "primaryInputs".)

1. Please refer to the official documentation, Gst-nvinferserver — DeepStream 6.1.1 Release documentation: nvinferserver cannot accept preprocessed tensors.
2. IInferCustomProcessor is for extra input processing only. If you want to process the primary inputs, please configure the preprocess parameters; you can refer to samples/configs/deepstream-app-triton/config_infer_primary_classifier_densenet_onnx.txt
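To make point 2 concrete, below is a minimal, illustrative sketch of the `preprocess` block in an nvinferserver protobuf-text config (the pattern used by the densenet sample referenced above). The model name, repo path, `scale_factor`, and `channel_offsets` are placeholders; adjust them to your own Triton model:

```protobuf
infer_config {
  unique_id: 1
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    triton {
      model_name: "densenet_onnx"      # placeholder; must match your Triton model
      version: -1
      model_repo {
        root: "../../triton_model_repo"  # placeholder path to your model repository
      }
    }
  }
  # Primary-input preprocessing is configured here, not in IInferCustomProcessor.
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_LINEAR
    maintain_aspect_ratio: 0
    normalize {
      scale_factor: 0.0078125                  # example scaling value
      channel_offsets: [127.5, 127.5, 127.5]   # example per-channel mean offsets
    }
  }
}
```

With this in place, nvinferserver performs scaling, format conversion, and normalization on the primary input itself, so a separate nvdspreprocess tensor is not needed for that purpose.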

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.