• Hardware Platform (Jetson / GPU): Jetson Nano 4GB Dev Kit
• DeepStream Version: 5.1
• JetPack Version (valid for Jetson only): 4.5.1 rev1
• TensorRT Version: 7.1.3
In the Gst-nvinfer — DeepStream 6.1.1 Release documentation, I read that a preprocessed input tensor attached as metadata can be used as the input for the primary inference plugin (nvinfer).
Is there an example implementation of this? I have looked through the sample apps and didn't find anything similar; am I missing something?
How do I attach an input tensor as metadata, and how will the nvinfer plugin use it for inference?
Hello @AastaLLL, I have looked at that example, but it is about attaching the output data of the previous (primary) inference plugin as input for secondary inference plugins.
My question is the reverse of that: how do I attach preprocessed image data as metadata so that it can be used for inference? Isn't this what the input-tensor-meta property is for? How do I use it?
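For context, based on my reading of the Gst-nvdspreprocess and Gst-nvinfer documentation, I would expect the wiring to look roughly like the sketch below. This is my assumption, not a tested pipeline: the config file names are placeholders, and I am not sure the nvdspreprocess element is even available on DeepStream 5.1 (the docs I linked are for 6.1.1).

```shell
# Hypothetical sketch, not verified: nvdspreprocess performs the custom
# preprocessing and attaches the tensor as metadata; nvinfer is told to
# consume that tensor instead of preprocessing internally via
# input-tensor-meta=1. Config file paths are placeholders.
gst-launch-1.0 \
  filesrc location=sample.h264 ! h264parse ! nvv4l2decoder ! mux.sink_0 \
  nvstreammux name=mux batch-size=1 width=1280 height=720 ! \
  nvdspreprocess config-file=config_preprocess.txt ! \
  nvinfer config-file-path=config_infer_primary.txt input-tensor-meta=1 ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```

Is this the intended usage, and if so, is there a sample app or config that demonstrates it?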