DeepStream secondary GIE classification results are incorrect

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
jetson
• DeepStream Version
6.1.1
• JetPack Version (valid for Jetson only)
• TensorRT Version
8.4
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
The secondary GIE uses MobileNetV2; no softmax layer was added during MobileNetV2 training. When the TensorRT model is tested in Python, the inference results are correct, but the classification results from the secondary GIE in the C++ version of the pipeline are wrong. Below is the image preprocessing applied before running the TensorRT model in Python.

        import cv2
        import numpy as np

        # Per-channel mean/std used during training (RGB order)
        mean = np.array([123.675, 116.28, 103.53])
        std = np.array([58.395, 57.12, 57.375])

        # BGR -> RGB, resize to the 112x112 network input, then normalize
        images = [cv2.resize(cv2.cvtColor(img, cv2.COLOR_BGR2RGB), (112, 112)) for img in images]
        images = [(img - mean) / std for img in images]

        # NHWC -> NCHW for TensorRT
        images_array = np.array(images)
        batch_data = images_array.transpose((0, 3, 1, 2))
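Note that dividing by a per-channel std is equivalent to multiplying by three slightly different scale factors, so it cannot be expressed exactly as one scalar scale. A quick numeric check (not from the original post) of what those per-channel scales are:

```python
import numpy as np

# Reciprocals of the per-channel std used in the preprocessing above
std = np.array([58.395, 57.12, 57.375])
print(1.0 / std)  # ~[0.01712, 0.01751, 0.01743]
```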

Configuration parameters of the secondary GIE in the pipeline:

[property]
gpu-id=0
net-scale-factor=0.018
offsets=123.675;116.28;103.53
#onnx-file=../../models/vehicle_class/vehicle_type_cls.onnx
model-engine-file=../../models/vehicle_class/vehicle_type_cls.onnx_b4_gpu0_fp16.engine
mean-file=../../models/vehicle_class/cls.ppm
labelfile-path=../../models/vehicle_class/labels.txt
#force-implicit-batch-dim=1
#batch-size=1
model-color-format=1
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=2
is-classifier=1
process-mode=2
output-blob-names=output
classifier-async-mode=1
classifier-threshold=0.51
input-object-min-width=112
input-object-min-height=112
num-detected-classes=7
#infer-dims=3;112;112
network-input-order=0
#operate-on-gie-id=1
#operate-on-class-ids=2
classifier-type=vehicleCls
#scaling-filter=1
scaling-compute-hw=2
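For reference, the documented nvinfer preprocessing is y = net-scale-factor * (x - offsets), with a single scalar scale applied to all channels. A small sketch (the sample pixel value is arbitrary) compares that to the Python-side per-channel normalization, using the values from this config:

```python
import numpy as np

# Python-side preprocessing: y = (x - mean) / std, per channel
mean = np.array([123.675, 116.28, 103.53])
std = np.array([58.395, 57.12, 57.375])

# Values from the [property] section above (single scalar scale)
net_scale_factor = 0.018
offsets = np.array([123.675, 116.28, 103.53])

x = np.array([128.0, 128.0, 128.0])  # arbitrary sample RGB pixel
y_python = (x - mean) / std
y_nvinfer = net_scale_factor * (x - offsets)

print(y_python)
print(y_nvinfer)
print(np.max(np.abs(y_python - y_nvinfer)))  # small but nonzero
```

The two results do not match exactly, since a single net-scale-factor can only approximate the per-channel std from training.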

Is there something wrong with the configuration? Can anyone help me?
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

How do you get this value? Please refer to topic1 and topic2.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.