Output of DeepStream doesn't match the output of the ONNX model

Hello all!

  1. I have an ONNX model for ResNet50+SSD. I passed an image to it to run inference (outside DeepStream) and got output arrays of shapes 1x4x8732 and 1x81x8732, as expected.
  2. I then converted the above ONNX model to a TensorRT engine and passed it to DeepStream. When I run inference on the same image through DeepStream, I get output arrays of shapes 4x8732 and 81x8732.
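The shape difference between the two runs is only the leading batch dimension: DeepStream's tensor metadata exposes the per-frame tensor with the batch axis already stripped. A minimal sketch (with random stand-in data, since no model output is available here) of that relationship:

```python
import numpy as np

# Hypothetical stand-ins for the two outputs: the ONNX model returns a
# batched tensor (1x4x8732), while DeepStream exposes the per-frame
# tensor (4x8732) with the batch dimension already removed.
onnx_out = np.random.rand(1, 4, 8732).astype(np.float32)
ds_out = onnx_out[0]  # what DeepStream would hand over for frame 0

# Once the batch axis is squeezed out, the two layouts describe the same data.
assert np.array_equal(np.squeeze(onnx_out, axis=0), ds_out)
print(ds_out.shape)  # (4, 8732)
```

So the shapes themselves are consistent; only the element values need explaining.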

Problem:
The tensors output by DeepStream differ from those obtained in step 1 above.

I took the element at index [0][0][0] of the first array from step 1 (which was 0.7378552) and compared it with every number in the first array from step 2. There was no exact match; the closest values were 0.73835, 0.737011, 0.738657, 0.736894, 0.737505, 0.738648, 0.737501, 0.738449, 0.738631.
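Exact bit-for-bit matches rarely survive a change of inference backend: TensorRT may fuse and reorder floating-point operations (and use FP16, if enabled), so small drift is expected. Comparing with a tolerance rather than exact equality, using the values quoted above, shows they already agree to roughly 1e-3:

```python
import numpy as np

# Values quoted in this thread: the ONNX element and the nearby
# DeepStream candidates. None matches exactly, but all are close.
onnx_val = 0.7378552
ds_candidates = np.array([0.73835, 0.737011, 0.738657, 0.736894,
                          0.737505, 0.738648, 0.737501, 0.738449, 0.738631])

# Compare with an absolute tolerance instead of exact equality.
close = np.isclose(ds_candidates, onnx_val, atol=2e-3)
print(close.all())  # True: every candidate is within 2e-3 of the ONNX value
```

A gap of ~1e-3 can come from backend numerics alone; a gap like the one reported later in this thread (-0.14 vs 0.24) points at a pre-processing mismatch instead.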

Please let me know what the problem might be.
Thanks!

• Hardware Platform: Jetson Nano
• DeepStream Version: 5.0
• JetPack Version (valid for Jetson only): 4.4
• TensorRT Version: 7.1.3.0

Hi,

Do you feed the same input data into TensorRT?

DeepStream has its own decoder and pre-processing pipeline,
so the input may differ slightly from the version produced by other modules (e.g. PIL or OpenCV).

Thanks.

Yes, I give the path to the same input image in the DeepStream config file.

Okay. I’m using OpenCV to read the image and feed it to the ONNX model. The element at index [0][0][0] of the ONNX model's first output tensor is -0.1431275, whereas the corresponding element output by DeepStream is 0.244358. Isn't that a huge difference (my model outputs values between -3 and +3)?

Thanks!

Hi,

In general, OpenCV loads images in BGR format, and you may also be applying a custom mean and normalization factor.
Please update the DeepStream configuration to match your OpenCV pre-processing.

https://docs.nvidia.com/metropolis/deepstream/plugin-manual/index.html#page/DeepStream%20Plugins%20Development%20Guide/deepstream_plugin_details.html#wwpID0E04DB0HA

• net-scale-factor — Pixel normalization factor. Float, >0.0. Example: net-scale-factor=0.031
• offsets — Array of mean values of color components to be subtracted from each pixel. Array length must equal the number of color components in the frame. The plugin multiplies the mean values by net-scale-factor. Semicolon-delimited float array, all values ≥0. Example: offsets=77.5;21.2;11.8
• model-color-format — Color format required by the model. Integer (0: RGB, 1: BGR, 2: GRAY). Example: model-color-format=0
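DeepStream's nvinfer normalizes each pixel as y = net-scale-factor * (x - offset), while a typical OpenCV/NumPy pipeline computes y = (x/255 - mean) / std. Equating the two gives net-scale-factor = 1/(255 * std) and offsets = 255 * mean. A sketch of that conversion, assuming a single scalar std (nvinfer's scale factor is scalar, so per-channel std values cannot be mapped this way) and placeholder mean/std values you would replace with your model's own:

```python
import numpy as np

# Illustrative OpenCV-style preprocessing: y = (x / 255.0 - mean) / std
# (mean/std here are placeholders -- substitute your model's actual values).
mean = np.array([0.485, 0.456, 0.406])
std = 0.225  # must be a single scalar to map onto net-scale-factor

# DeepStream computes y = net_scale_factor * (x - offsets), so:
net_scale_factor = 1.0 / (255.0 * std)
offsets = 255.0 * mean

x = np.array([120.0, 64.0, 200.0])  # one RGB pixel, 0-255 range
opencv_y = (x / 255.0 - mean) / std
deepstream_y = net_scale_factor * (x - offsets)

# Both formulas produce identical normalized values.
assert np.allclose(opencv_y, deepstream_y)
print(f"net-scale-factor={net_scale_factor:.6f}")
print("offsets=" + ";".join(f"{o:.3f}" for o in offsets))
```

The printed values are what you would place in the nvinfer config, along with model-color-format set to match the channel order (1 for BGR if you pass OpenCV images to the ONNX model unconverted).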

If the issue persists, we will need to reproduce it in our environment first.
Please share the complete source and the steps to reproduce.

Thanks.