Skip postprocessing when using nvinfer

Hello, I would like to use nvinfer to run a TensorRT model and get its raw output, which I will process in a Python probe function. I don't need to run any postprocessing function, nor do I know C++. Is there a way to skip the postprocessing? It seems that nvinfer expects a parsing function. I'd like to explicitly tell nvinfer not to do any postprocessing; otherwise the plugin tries to run its default parsing and raises the following error:

0:00:06.330034453    82      0x409ad20 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::parseBoundingBox() <nvdsinfer_context_impl_output_parsing.cpp:59> [UID = 1]: Could not find output coverage layer for parsing objects
0:00:06.330071451    82      0x409ad20 ERROR                nvinfer gstnvinfer.cpp:632:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:735> [UID = 1]: Failed to parse bboxes

I know that this can be done with nvinferserver, but I'd like to know whether it is also possible with nvinfer.
I am currently using DeepStream 6.0 on a Tesla T4.
Thank you!

Please refer to the deepstream-ssd-parser sample in NVIDIA-AI-IOT/deepstream_python_apps on GitHub.
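For anyone reading along: in that sample, the probe walks `frame_meta.frame_user_meta_list`, keeps entries whose meta type is `NVDSINFER_TENSOR_OUTPUT_META`, casts them to `pyds.NvDsInferTensorMeta`, and then reads each output layer's buffer into NumPy. Below is a minimal sketch of just the buffer-to-NumPy step; since `pyds` is only available inside a DeepStream environment, a locally allocated float buffer stands in for `layer_info.buffer`, and the layer size of 4 is an arbitrary assumption.

```python
import ctypes
import numpy as np

# In a real probe, the pointer would come from the attached tensor meta:
#   layer_info = pyds.get_nvds_LayerInfo(tensor_meta, i)
#   ptr = ctypes.cast(pyds.get_ptr(layer_info.buffer),
#                     ctypes.POINTER(ctypes.c_float))
# Here we fake a small float32 output layer to show only the conversion.
fake_output = (ctypes.c_float * 4)(0.1, 0.2, 0.3, 0.4)
ptr = ctypes.cast(fake_output, ctypes.POINTER(ctypes.c_float))

# In a real probe, num_elements is the product of layer_info.inferDims.
num_elements = 4

# Wrap the raw buffer as a NumPy array and copy it, so the data survives
# after the GstBuffer is released downstream.
arr = np.ctypeslib.as_array(ptr, shape=(num_elements,)).copy()
print(arr)
```

The `.copy()` matters: without it the array is just a view over memory that nvinfer owns, which is not safe to keep past the probe callback.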

It seems I was able to do it by setting the model as a classifier. Thank you!
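For later readers, the relevant nvinfer config keys would look roughly like the fragment below. The key names are from the Gst-nvinfer documentation; treating this particular model this way is the workaround described above, not an officially documented "no postprocessing" mode.

```
[property]
# Treat the model as a classifier (1) so nvinfer does not attempt
# bounding-box parsing on outputs it cannot interpret.
network-type=1
# Attach the raw output tensors to the frame metadata so a Python
# probe can read them downstream.
output-tensor-meta=1
```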

