How to use the output tensor of a classifier, get the result out of it, and what the OSD probe function looks like in that case

Please provide complete information as applicable to your setup.

• Hardware Platform: GPU
• DeepStream Version: 6.1.1
• TensorRT Version: 8.4
• NVIDIA GPU Driver Version: 525

Suppose I want to use the classifier’s output tensor to get the result from the classifier. I enable “output-tensor-meta=1”; is there any other option I have to enable?

Can anyone provide OSD pad probe code and show how to get the information from the tensor?

Yes, you only need to enable “output-tensor-meta=1”.
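
For example, this is a minimal sketch of the relevant part of the Gst-nvinfer config file for the classifier; every other key of your existing classifier config stays as it is:

```
[property]
# ... keep your existing classifier keys (model file, labels, network-type, etc.)
# attach the raw output tensors of the model as NvDsInferTensorMeta user meta
output-tensor-meta=1
```

The same switch is also exposed as a GObject property on the nvinfer element, so `classifier.set_property("output-tensor-meta", True)` in the application code has the same effect.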

Are you using C or Python for the probe?


You can refer to our open-source sample code: sources\apps\sample_apps\deepstream-infer-tensor-meta-test.

Sorry, I forgot to mention that I’m using Python as the programming language.

You can refer to the link below: pgie_src_pad_buffer_probe
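
In case it helps, here is a minimal sketch of such a probe in Python. It assumes the classifier runs as a secondary GIE, so (as in the deepstream-infer-tensor-meta-test C sample) the tensor meta is attached to each object’s obj_user_meta_list; for a primary-mode classifier you would walk frame_meta.frame_user_meta_list instead. The function and helper names here are mine, not part of the sample:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds


def _iter_meta(glist, cast):
    """Walk a pyds GList, casting each node's data with the given cast()."""
    while glist is not None:
        yield cast(glist.data)
        try:
            glist = glist.next
        except StopIteration:
            break


def sgie_src_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    for frame_meta in _iter_meta(batch_meta.frame_meta_list, pyds.NvDsFrameMeta.cast):
        for obj_meta in _iter_meta(frame_meta.obj_meta_list, pyds.NvDsObjectMeta.cast):
            for user_meta in _iter_meta(obj_meta.obj_user_meta_list, pyds.NvDsUserMeta.cast):
                if user_meta.base_meta.meta_type != \
                        pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META:
                    continue
                tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
                # walk the raw output layers attached by the classifier
                for i in range(tensor_meta.num_output_layers):
                    layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
                    print("object", obj_meta.object_id, "output layer:", layer.layerName)
                    # how to read layer.buffer is model specific -- see below
    return Gst.PadProbeReturn.OK
```

You can attach it to the classifier’s src pad or to the OSD sink pad, since the metadata travels downstream with the buffer, e.g. `sgie.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, sgie_src_pad_buffer_probe, 0)`.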


I got the last output layer! But my question is how to get the prediction output from that layer for a classifier;
I could not find any information about this in Python.
Please provide the information about this if you can; I want to get the result from the tensor.

If you set output-tensor-meta=1, you need to parse the tensor yourself based on the output of your own model. You need to write the parsing code yourself, like in ssd_parser.py.
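
As a sketch of what that parsing can look like for a classifier (the helper name is mine, and it assumes the layer holds FP32 values), you can read the floats out of a raw layer buffer with the same pyds.get_detections() call that ssd_parser.py uses:

```python
import pyds


def layer_floats(tensor_meta, layer_name, num_elements):
    """Read num_elements FP32 values from the named output layer, or None.

    layer_name and num_elements must match your own model's output.
    """
    for i in range(tensor_meta.num_output_layers):
        layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
        if layer.layerName == layer_name:
            return [pyds.get_detections(layer.buffer, j) for j in range(num_elements)]
    return None
```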


I understand. But for classification, suppose my last layer name is “softmax_1” and this is a gender classifier.
From this, how can I access the output, I mean the predicted values, the (male, female) probabilities?

You need to parse the softmax_1 layer’s buffer (layer.buffer) yourself. You need to know the output format of your own model’s buffer.
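
For example, assuming softmax_1 holds two FP32 probabilities and reusing the hypothetical layer_floats() helper sketched above (the label order is an assumption and must match your own label file), the last step inside the probe could be:

```python
# hypothetical label order; it has to match the label file of your own model
GENDER_LABELS = ["female", "male"]

probs = layer_floats(tensor_meta, "softmax_1", len(GENDER_LABELS))
if probs is not None:
    best = max(range(len(probs)), key=lambda k: probs[k])
    print("predicted: {} ({:.2f})".format(GENDER_LABELS[best], probs[best]))
```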


Yeah, I got it.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.