Access all probabilities of all classes in Python


I’m running inference on a Jetson Nano and everything works fine. Now I want to access the probabilities of all (three) classes in my Python code.

gst_buffer = info.get_buffer()
if not gst_buffer:
    print("Unable to get GstBuffer ")
batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))

l_frame = batch_meta.frame_meta_list
while l_frame is not None:
    try:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
    except StopIteration:
        break

    l_obj = frame_meta.obj_meta_list
    while l_obj is not None:
        try:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
        except StopIteration:
            break

        # Iterate through all classifiers of this object.
        l_cls = obj_meta.classifier_meta_list
        while l_cls is not None:
            try:
                cls_meta = pyds.NvDsClassifierMeta.cast(l_cls.data)
            except StopIteration:
                break

            # Iterate through all label info entries of this classifier.
            l_info = cls_meta.label_info_list
            while l_info is not None:
                try:
                    label_meta = pyds.NvDsLabelInfo.cast(l_info.data)
                except StopIteration:
                    break
                l_info = l_info.next

            l_cls = l_cls.next
        l_obj = l_obj.next
    l_frame = l_frame.next

My question: at which step, and how, can I access the probabilities of all classes?
Thanks in advance!

You can access the metadata in the loop while l_obj is not None:
This loop is entered only when an object is detected, so if an object is detected, a confidence score will be present. Otherwise all confidence scores will be 0, as nothing was detected.

You can check the confidence score of each class via its class ID.
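To make the lookup concrete: the per-class classifier results live on the label-info entries of each object. Here is a minimal sketch of that traversal, written against stand-in objects since pyds only exists on a DeepStream install; with the real metadata you would first cast each node with pyds.NvDsLabelInfo.cast(l_info.data). The fields result_class_id and result_prob are the standard NvDsLabelInfo fields; the helper name is mine.

```python
from types import SimpleNamespace

def collect_label_results(l_info):
    """Walk a label_info_list-style linked list and collect each entry's
    (result_class_id, result_prob) pair. With real DeepStream metadata you
    would cast first: label_meta = pyds.NvDsLabelInfo.cast(l_info.data)."""
    results = []
    while l_info is not None:
        label_meta = l_info.data  # stand-in for pyds.NvDsLabelInfo.cast(...)
        results.append((label_meta.result_class_id, label_meta.result_prob))
        l_info = l_info.next
    return results

# Mock two linked label entries just to exercise the traversal.
second = SimpleNamespace(
    data=SimpleNamespace(result_class_id=2, result_prob=0.15), next=None)
first = SimpleNamespace(
    data=SimpleNamespace(result_class_id=0, result_prob=0.85), next=second)
print(collect_label_results(first))
```

Note that NvDsLabelInfo normally carries only the top result per classifier output, not the full softmax vector; getting all three probabilities is what the tensor-meta discussion further down is about.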


Well, now I know the correct loop in which I can access it, but I still lack the information on how to access it. I guess obj_meta.something?

I assume you want to check the confidence scores of all objects, meaning you want the low-confidence values too; otherwise the detector throws out low-probability detections every time.
The default confidence threshold in the config .txt file is fairly high, so change it to a low value.


Note: you will find this in the primary inference config .txt file.
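For example, a fragment like the following (section and key names as in the Gst-nvinfer configuration reference; note that newer DeepStream releases use pre-cluster-threshold where older ones used threshold):

```
[class-attrs-all]
# Keep almost everything so low-confidence results survive.
threshold=0.01
```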

Now the detector will keep all the predictions and also output every garbage prediction.
Inside the loop while l_obj is not None:
just print the class ID with obj_meta.class_id and the confidence with obj_meta.confidence.

Some objects will still not be output, so you can assume their confidence score is zero or very close to zero, since you set the bar so low by lowering the threshold values. You can print that out explicitly (it is a fair assumption).
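Put together, the body of that loop could look like the sketch below. obj_meta.class_id and obj_meta.confidence are the standard NvDsObjectMeta fields; the helper name and the "below threshold" wording are mine.

```python
def report_object(obj_meta):
    """Print one object's top class ID and confidence.
    class_id is -1 when no valid result is attached, in which case the
    confidence can be treated as (close to) zero."""
    if obj_meta.class_id < 0 or obj_meta.confidence <= 0.0:
        line = f"class_id={obj_meta.class_id} confidence~0 (below threshold)"
    else:
        line = f"class_id={obj_meta.class_id} confidence={obj_meta.confidence:.3f}"
    print(line)
    return line
```

Called once per iteration of while l_obj is not None:, this prints one line per object in the frame.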

I don’t understand why you would want to do this, as the object detector will output all sorts of false positives, but if you want to experiment, go ahead.

Thank you! First of all, I am not performing detection but classification. I only have 3 classes and simply want to look at all three probabilities. As far as I can see and as I understood it, l_obj is only one object, right? So the while loop will only run once, leaving me with one ID (which is -1 for whatever reason) and one confidence, which is 0.
I also get the error gstnnvtracker: obj 1 Class mismatch! -1 -> 65535, but I have always had this error even though everything seems to work fine.

What model are you using to run these detections?

I’m using Mobilenet

So I tried to set output-tensor-meta = 1 in the config file and access the tensor data via:

l_user = frame_meta.frame_user_meta_list

but l_user is None and via:

obj = obj_meta.obj_user_meta_list

and again obj is None. What am I doing wrong? This problem is still not solved for me.
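For reference, here is roughly how the attached tensor meta is read once the list is non-None, following the pattern in the deepstream_python_apps tensor-meta samples (pyds.NvDsUserMeta, pyds.NvDsInferTensorMeta, pyds.get_nvds_LayerInfo). This assumes output-tensor-meta=1 is set on the classifier's nvinfer config, the probe is attached downstream of that element, and the model has a single output layer of three scores; the import is guarded because pyds only exists on a DeepStream install.

```python
import ctypes

import numpy as np

try:
    import pyds  # only present on a DeepStream install (e.g. the Jetson)
except ImportError:
    pyds = None

def class_scores_from_obj(obj_meta, n_classes=3):
    """Walk obj_meta.obj_user_meta_list and return the raw output-layer
    scores as a NumPy array, or None if no tensor meta is attached."""
    l_user = obj_meta.obj_user_meta_list
    while l_user is not None:
        user_meta = pyds.NvDsUserMeta.cast(l_user.data)
        if user_meta.base_meta.meta_type == pyds.NVDSINFER_TENSOR_OUTPUT_META:
            tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
            # Assuming a single output layer holding one score per class.
            layer = pyds.get_nvds_LayerInfo(tensor_meta, 0)
            ptr = ctypes.cast(pyds.get_ptr(layer.buffer),
                              ctypes.POINTER(ctypes.c_float))
            return np.ctypeslib.as_array(ptr, shape=(n_classes,)).copy()
        l_user = l_user.next
    return None
```

If obj_user_meta_list is always None, as in your case, the usual causes are that the probe sits upstream of the classifier element or that output-tensor-meta was set on a different nvinfer config than the one doing the classification.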

Not a definite fix, but go to the config .txt file and make sure you change the property is-classifier = 0 (the default) to is-classifier = 1.
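Something like the fragment below (key names per the Gst-nvinfer configuration reference; newer DeepStream releases mark a classifier with network-type=1 instead of is-classifier):

```
[property]
is-classifier=1
# on newer releases use: network-type=1
classifier-threshold=0.0
```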

The reason you run into issues is that DeepStream uses object detection for primary inference and classification for secondary inference.
Please note this is something I have observed, not something the devs have said per se.

If you are looking for pure classification and already work with MobileNet, I would suggest TensorFlow or TensorFlow Lite. TensorFlow Lite is faster and used primarily for inference.
Use this link, then download the starter model, install the tflite interpreter, and just get started. This should take you about an hour if you are a complete beginner.
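As a rough sketch of that route (the model path, input preprocessing, and helper names are assumptions, not from this thread; the Interpreter API is the standard tflite_runtime one, and the import is guarded since it may not be installed):

```python
import numpy as np

try:
    from tflite_runtime.interpreter import Interpreter  # pip install tflite-runtime
except ImportError:
    Interpreter = None  # not installed in this environment

def softmax(scores):
    """Turn raw output scores into probabilities that sum to 1."""
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

def classify(model_path, image):
    """Run one preprocessed image through a TFLite classifier and return
    a probability for every class, not just the top one."""
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], image.astype(inp["dtype"]))
    interpreter.invoke()
    return softmax(interpreter.get_tensor(out["index"])[0])
```

With only 3 classes, the returned array is exactly the three probabilities you were after.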

Yes you can run it on the Jetson

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.