PeopleNet post-processing logic to run with Triton Inference Server

I trained PeopleNet on a custom dataset, renamed the trained (TLT-exported) model from "model.trt" to "model.plan", and can load it with the inference server. My gRPC client application also receives inference results for the two output tensors ("output_cov/Sigmoid", "output_bbox/BiasAdd"). However, to get proper bounding boxes and confidence scores from these raw outputs, I need the post-processing logic, and I could not find it anywhere in the Transfer Learning Toolkit. Can anyone help with PeopleNet's post-processing logic?
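For reference, PeopleNet is a DetectNet_v2-style detector: "output_cov/Sigmoid" is a per-class coverage (confidence) map on a downsampled grid, and "output_bbox/BiasAdd" holds four box-offset channels per class. A typical decode is: threshold the coverage map, convert each surviving grid cell's offsets into a box around the cell center, then cluster with NMS (or DBSCAN). Below is a minimal NumPy sketch of that idea — the stride (16), center offset (0.5), bbox normalization scale (35.0), and thresholds are assumptions you must verify against your own training spec, not confirmed PeopleNet values:

```python
# Hypothetical DetectNet_v2-style decode for PeopleNet raw outputs.
# STRIDE / OFFSET / BBOX_NORM are assumptions -- check your TLT training spec.
import numpy as np

STRIDE = 16.0     # assumed backbone output stride (input px per grid cell)
OFFSET = 0.5      # assumed grid-cell center offset
BBOX_NORM = 35.0  # assumed bbox normalization scale used at training time

def decode(cov, bbox, conf_thresh=0.4):
    """Turn raw maps into boxes.
    cov:  (num_classes, H, W)    -- "output_cov/Sigmoid"
    bbox: (num_classes*4, H, W)  -- "output_bbox/BiasAdd"
    Returns (boxes Nx4 as [x1, y1, x2, y2], scores N, class_ids N)."""
    num_classes, gh, gw = cov.shape
    bbox = bbox.reshape(num_classes, 4, gh, gw)
    ys, xs = np.meshgrid(np.arange(gh), np.arange(gw), indexing="ij")
    cx = (xs + OFFSET) * STRIDE  # grid-cell centers in input pixels
    cy = (ys + OFFSET) * STRIDE
    boxes, scores, classes = [], [], []
    for c in range(num_classes):
        mask = cov[c] > conf_thresh
        if not mask.any():
            continue
        # Box channels are interpreted as offsets from the cell center.
        x1 = cx[mask] - bbox[c, 0][mask] * BBOX_NORM
        y1 = cy[mask] - bbox[c, 1][mask] * BBOX_NORM
        x2 = cx[mask] + bbox[c, 2][mask] * BBOX_NORM
        y2 = cy[mask] + bbox[c, 3][mask] * BBOX_NORM
        boxes.append(np.stack([x1, y1, x2, y2], axis=1))
        scores.append(cov[c][mask])
        classes.append(np.full(int(mask.sum()), c))
    if not boxes:
        return np.zeros((0, 4)), np.zeros(0), np.zeros(0, dtype=int)
    return np.concatenate(boxes), np.concatenate(scores), np.concatenate(classes)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression; returns indices of boxes to keep."""
    order = scores.argsort()[::-1]
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        xx1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        yy1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        xx2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        yy2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.maximum(0, xx2 - xx1) * np.maximum(0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + areas - inter + 1e-9)
        order = rest[iou <= iou_thresh]
    return keep
```

In a Triton gRPC client you would reshape the two output tensors to the `(C, H, W)` and `(C*4, H, W)` layouts above, call `decode`, then filter with `nms` before mapping boxes back to the original image resolution.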

Environment:
TLT setup: "tlt-streamanalytics:v2.0_dp_py2"
Inference Server: "nvcr.io/nvidia/tensorrtserver:20.02-py3"
