Preprocessing of image in InferPreprocessor::transform


Thanks for your patience.

We tried to reproduce this with an MNIST model whose input size is 28x28 (<128).
However, the DeepStream output looks correct; we could not observe the issue.

Could you check this sample to see whether there is any difference in settings or conditions between our environments?
(This uses a PGIE as a classification pipeline, but it should still be valid for reproducing the buffer issue.)

If we have missed anything, could you modify the sample and reproduce the issue with this MNIST pipeline directly?
(It will be much easier for us to debug, since it has relatively few dependencies.)
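For context, the nvinfer preprocessing step (InferPreprocessor::transform) scales the frame to the network resolution and then normalizes each pixel as y = net-scale-factor * (x - offset), using the net-scale-factor and offsets values from the config file. A rough numpy sketch of that conversion, assuming a 28x28 grayscale MNIST network and a nearest-neighbor resize (DeepStream itself uses hardware scaling):

```python
import numpy as np

def preprocess(frame, net_w=28, net_h=28, net_scale_factor=0.00392157, offset=0.0):
    """Sketch of nvinfer-style preprocessing: resize to the network
    resolution, then apply y = net-scale-factor * (x - offset)."""
    src_h, src_w = frame.shape[:2]
    # nearest-neighbor resize to the network resolution (simplified stand-in
    # for DeepStream's hardware scaling)
    ys = np.arange(net_h) * src_h // net_h
    xs = np.arange(net_w) * src_w // net_w
    resized = frame[ys][:, xs].astype(np.float32)
    # per-pixel normalization
    return net_scale_factor * (resized - offset)

frame = np.full((56, 56), 255, dtype=np.uint8)  # dummy 56x56 grayscale frame
out = preprocess(frame)
print(out.shape)  # (28, 28)
```

With net-scale-factor = 1/255 and no offset, a 255-valued pixel maps to approximately 1.0, which matches the typical MNIST input range.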

Please download and extract the attached file (31.6 KB).

  • Put data.mp4 and label.txt under the /usr/src/tensorrt/data/mnist/ folder.
  • Put bug_200669225.txt and config_infer_secondary_mnist.txt under /opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app.
  • Then run the pipeline with the following commands:
$ cd /opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app
$ deepstream-app -c bug_200669225.txt