Inferring detectnet_v2 .trt model in python

Please modify

binding_to_type = {
    "input_1": np.float32,
    "output_bbox/BiasAdd": np.float32,
    "output_cov/Sigmoid": np.float32,
}

to

binding_to_type = {
    "Input": np.float32,
    "BatchedNMS": np.int32,
    "BatchedNMS_1": np.float32,
    "BatchedNMS_2": np.float32,
    "BatchedNMS_3": np.float32,
}
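For context, a map like this is typically used to allocate one host buffer per engine binding before running inference. Below is a minimal NumPy-only sketch of that allocation step; the shapes (a 384x1248 input and keepTopK = 200) and the `allocate_host_buffers` helper are illustrative assumptions, not taken from the original post:

```python
import numpy as np

binding_to_type = {
    "Input": np.float32,
    "BatchedNMS": np.int32,       # number of detections kept per image
    "BatchedNMS_1": np.float32,   # boxes
    "BatchedNMS_2": np.float32,   # scores
    "BatchedNMS_3": np.float32,   # class ids
}

# Hypothetical shapes for illustration; query them from your engine in practice.
binding_to_shape = {
    "Input": (1, 3, 384, 1248),
    "BatchedNMS": (1, 1),
    "BatchedNMS_1": (1, 200, 4),
    "BatchedNMS_2": (1, 200),
    "BatchedNMS_3": (1, 200),
}

def allocate_host_buffers():
    """Allocate one host array per binding, with the dtype from binding_to_type."""
    return {
        name: np.zeros(binding_to_shape[name], dtype=dtype)
        for name, dtype in binding_to_type.items()
    }

buffers = allocate_host_buffers()
```

With real TensorRT you would copy these host arrays to device memory (e.g. with pycuda) before/after `execute_v2`, but the dtype-per-binding idea is the same.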

Also, please refer to the preprocessing approach in Discrepancy between results from tlt-infer and trt engine - #8 by Morganh
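As a rough sketch of the usual TLT-style preprocessing (convert an HWC uint8 frame to a batched CHW float32 tensor), the snippet below uses only NumPy; the 1/255 scale factor and channel order are assumptions for illustration, so match the exact normalization from the linked post for your model:

```python
import numpy as np

def preprocess(image_hwc: np.ndarray, scale: float = 1.0 / 255.0) -> np.ndarray:
    """HWC uint8 image -> (1, C, H, W) float32 tensor.

    The 1/255 scale is a placeholder; use the normalization from the
    linked tlt-infer discussion for your particular model.
    """
    chw = image_hwc.astype(np.float32).transpose(2, 0, 1) * scale
    return np.expand_dims(chw, axis=0)

# Example with a dummy 384x1248 RGB frame
dummy = np.random.randint(0, 256, size=(384, 1248, 3), dtype=np.uint8)
batch = preprocess(dummy)
```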

Refer to the postprocessing approach in https://github.com/NVIDIA-AI-IOT/deepstream_tlt_apps/blob/master/nvdsinfer_customparser_yolov3_tlt/nvdsinfer_custombboxparser_yolov3_tlt.cpp
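In Python, the same idea as that C++ parser can be sketched with NumPy: read the keep count from `BatchedNMS`, then walk the box/score/class arrays and filter by confidence. The layouts below (keep count, `[keepTopK, 4]` boxes, scores, class ids) mirror the referenced parser, but the normalized box coordinates and the 0.3 threshold are illustrative assumptions:

```python
import numpy as np

def parse_batched_nms(num_dets, boxes, scores, classes, conf_thresh=0.3):
    """Collect (class_id, score, box) tuples for one image from BatchedNMS outputs.

    num_dets: int32 array holding the kept-detection count
    boxes:    float32 array of shape (keepTopK, 4)
    scores:   float32 array of shape (keepTopK,)
    classes:  float32/int array of shape (keepTopK,)
    """
    detections = []
    for i in range(int(num_dets[0])):
        if scores[i] < conf_thresh:
            continue  # drop low-confidence detections, as the C++ parser does
        x1, y1, x2, y2 = boxes[i]
        detections.append((int(classes[i]), float(scores[i]), (x1, y1, x2, y2)))
    return detections

# Dummy outputs for two kept detections; only the first passes the threshold.
num_dets = np.array([2], dtype=np.int32)
boxes = np.array([[0.1, 0.2, 0.4, 0.6], [0.5, 0.5, 0.9, 0.9]], dtype=np.float32)
scores = np.array([0.85, 0.15], dtype=np.float32)
classes = np.array([0, 1], dtype=np.float32)
dets = parse_batched_nms(num_dets, boxes, scores, classes)
```

If the boxes are in normalized coordinates, multiply by the original image width/height to recover pixel boxes, as the referenced parser does.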
