Accuracy issues while running a model inside and outside DeepStream with the same TRT model

You need to set net-scale-factor and offsets in the configuration file according to the formula below.

y = net-scale-factor * (x - mean)

x is the input pixel value. It is an 8-bit unsigned integer (uint8) with range [0, 255].
mean is the corresponding mean value, read either from the mean file or as offsets[c], where c is the channel to which the input pixel belongs, and offsets is the array specified in the configuration file. It is a float.
net-scale-factor is the pixel scaling factor specified in the configuration file. It is a float.
y is the corresponding output pixel value. It is a float.
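As a minimal sketch of what this formula does per channel, here is the same computation in NumPy. The specific values for net-scale-factor and offsets below are illustrative assumptions (1/255 scaling with ImageNet-style means); the real values must come from your own configuration file and match whatever normalization the model was trained with.

```python
import numpy as np

# Assumed example values, NOT universal defaults:
# net-scale-factor = 1/255 and ImageNet-style per-channel means as offsets.
net_scale_factor = 1.0 / 255.0
offsets = np.array([123.675, 116.28, 103.53], dtype=np.float32)  # one mean per channel c

# x: uint8 input pixels in [0, 255], shape (H, W, C)
x = np.full((2, 2, 3), 255, dtype=np.uint8)

# y = net-scale-factor * (x - mean), applied per channel via broadcasting
y = net_scale_factor * (x.astype(np.float32) - offsets)
```

If the accuracy differs between DeepStream and your standalone pipeline, a common cause is that the standalone code normalizes with different scale/mean values (or a different channel order) than the ones configured here.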

You can refer to our FAQ to tune some parameters first.
