Please provide complete information as applicable to your setup.
• **Hardware Platform (Jetson / GPU):** dGPU
• **DeepStream Version:** 6.0.1
• **JetPack Version (valid for Jetson only):** None
• **TensorRT Version:** 8.0.1
• **NVIDIA GPU Driver Version (valid for GPU only):** 495.29.05
• **Issue Type (questions, new requirements, bugs):** questions
• **How to reproduce the issue? (for bugs: sample app used, configuration file contents, command line, and other reproduction details):** None
• **Requirement details (for new requirements: module name — target plugin or sample application — and function description):** None
Can the nvinferserver element run inference on tensors produced by nvdspreprocess?
If not, can IInferCustomProcessor replicate the preprocessing performed by libcustom2d_preprocess.so?
(My concern is that IInferCustomProcessor does not modify the "primaryInputs".)
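For context, the pipeline I have in mind looks like the sketch below. This is only an illustration of the intended topology, not a verified working command; the URI and the two config file names are placeholders, and whether nvinferserver actually consumes the tensor meta attached by nvdspreprocess is exactly what I am asking.

```
gst-launch-1.0 uridecodebin uri=file:///path/to/video.mp4 ! m.sink_0 \
  nvstreammux name=m batch-size=1 width=1280 height=720 ! \
  nvdspreprocess config-file=config_preprocess.txt ! \
  nvinferserver config-file-path=config_infer_triton.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```

The intent is that nvdspreprocess (with libcustom2d_preprocess.so) prepares the input tensors, and nvinferserver then skips its own preprocessing and infers directly on that tensor meta.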