Image normalization in deepstream pipeline

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson AGX Orin
• DeepStream Version 6.2
• JetPack Version (valid for Jetson only) 5.1
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

I have a question about image processing in the DeepStream pipeline.

As far as I understand, image normalization in the DeepStream pipeline works as follows:

Image input for PGIE = (read image - offsets) * net-scale-factor

But is the read image divided by 255 by default, or not? Model normalization is usually applied after dividing by 255.
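As a sketch of the question above: if a model was trained with divide-by-255 plus mean/std normalization, that can be folded into DeepStream's single `(x - offsets) * net-scale-factor` step. The mean/std values below are illustrative, not from any specific model:

```python
import numpy as np

# Illustrative training-time normalization parameters (assumed values).
mean, std = 0.5, 0.25

x = np.array([0.0, 64.0, 128.0, 255.0])  # raw 8-bit pixel values

# Framework-side preprocessing: divide by 255 first, then normalize.
expected = (x / 255.0 - mean) / std

# DeepStream nvinfer preprocessing is (x - offsets) * net-scale-factor,
# with no implicit /255, so fold the 255 into both parameters:
offsets = 255.0 * mean                   # 127.5, in the 0-255 pixel domain
net_scale_factor = 1.0 / (255.0 * std)  # ~0.01568627

got = (x - offsets) * net_scale_factor

assert np.allclose(expected, got)
```

In other words, `(x/255 - mean)/std` equals `(x - 255*mean) * 1/(255*std)`, so the divide-by-255 lives inside `net-scale-factor` and `offsets` rather than being a separate default step.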

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

The net-scale-factor is the dividing factor. You can also add some logging to our demo code for normalization; it is open source:

NvDsInferStatus InferPreprocessor::transform()
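To tie this together, a hedged example of how a plain [0, 1] scaling (i.e., just dividing by 255, no mean subtraction) would be expressed in an nvinfer config file; 0.00392156862745098 is 1/255:

```ini
[property]
# Scale raw 0-255 pixels to [0, 1]; no mean subtraction.
net-scale-factor=0.00392156862745098
offsets=0;0;0
```

Nothing here divides by 255 implicitly: if `net-scale-factor` were left at its default of 1, the network would receive raw 0-255 values minus the offsets.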
