Accessing tensor meta in DeepStream using the nvinfer plugin

Description

We are trying to access tensor meta from DeepStream using the nvinfer plugin, but we always get Not a Number (NaN) values in the tensor meta. Our Triton Inference Server uses the TensorFlow SavedModel backend. We also tried inference with the Triton client, and those results are fine. The TensorRT and ONNX backends give the same results.

Environment

TensorRT Version: 8.0.1
GPU Type: dGPU
CUDA Version: CUDA 11.3

All the files needed to reproduce the issue are shared in this link

Hi,

This looks more related to DeepStream. We are moving this post to the DeepStream forum to get better help.

Thank you.

Sorry for the late response. Is this still an issue that needs support? Thanks

Sorry! What do you mean by "we got Not a Number in tensormeta always"? And where did you get the tensor meta?

Did you enable “output_tensor_meta=true”?

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinferserver.html
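For reference, in the Gst-nvinferserver text config this flag sits in the output_control section; a minimal fragment, assuming the protobuf-text config format from the doc above (the surrounding infer_config fields will differ per setup):

```
infer_config {
  # ... backend / model settings for the Triton server ...
}
output_control {
  # Attach raw output tensors as NvDsInferTensorMeta to the buffer
  output_tensor_meta : true
}
```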

Yes, I had set output_tensor_meta to true.

Can you check this?

When I read the tensor meta inside Python code, the returned NumPy array is full of np.nan values.
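As a quick sanity check, it helps to distinguish an all-NaN buffer (which usually points to reading the wrong memory or layout) from a few stray NaNs (which usually points to the model or its inputs). A minimal NumPy sketch; the array here is just a stand-in for whatever buffer is read out of the tensor meta:

```python
import numpy as np

def summarize_tensor(arr: np.ndarray) -> dict:
    """Report how much of an output tensor is NaN."""
    nan_mask = np.isnan(arr)
    return {
        "shape": arr.shape,
        "nan_count": int(nan_mask.sum()),
        "all_nan": bool(nan_mask.all()),
    }

# Stand-in for a tensor buffer read out of NvDsInferTensorMeta:
buf = np.full((1, 4), np.nan, dtype=np.float32)
print(summarize_tensor(buf))  # all_nan -> True
```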

Sorry, the repo files are invalid. Could you share the repo files again?

"When I read the tensor meta inside Python code" ===> I don't mean Python or C++; I mean where the probe is added, on the nvinfer sink/src pad or some other place. Can you share that piece of code?
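For context, the usual pattern in the Python bindings is to attach a buffer probe on the inference element's src pad and walk the frame user meta for NVDSINFER_TENSOR_OUTPUT_META. A sketch assuming the standard pyds bindings (pyds and GStreamer are only importable inside a DeepStream environment, so they are imported lazily here; the FP32 layer dtype is an assumption):

```python
import ctypes

def tensor_meta_probe(pad, info, user_data):
    """Buffer probe for the nvinfer/nvinferserver src pad: walks frame
    user meta and prints the raw output tensors attached by the plugin.
    Assumes the pyds DeepStream Python bindings and FP32 output layers."""
    import numpy as np
    import pyds  # available only inside a DeepStream environment
    from gi.repository import Gst

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(info.get_buffer()))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_user = frame_meta.frame_user_meta_list
        while l_user is not None:
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
            if (user_meta.base_meta.meta_type
                    == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META):
                tensor_meta = pyds.NvDsInferTensorMeta.cast(
                    user_meta.user_meta_data)
                for i in range(tensor_meta.num_output_layers):
                    layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
                    # View the raw host buffer as a float array
                    ptr = ctypes.cast(pyds.get_ptr(layer.buffer),
                                      ctypes.POINTER(ctypes.c_float))
                    arr = np.ctypeslib.as_array(
                        ptr, shape=(layer.inferDims.numElements,))
                    print(layer.layerName, arr[:8])
            l_user = l_user.next
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK
```

If the tensors printed here are already NaN, the problem is upstream of the Python parsing; if they look sane, the bug is in how the buffer is reinterpreted afterwards.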

Shared the files again via WeTransfer.