LPRNet - unable to get the vector of probabilities for every class for the LPRNet classifier in DeepStream

I have written a Python version of GitHub - NVIDIA-AI-IOT/deepstream_lpr_app: Sample app code for LPR deployment on DeepStream, and I want to get the vector of probabilities for all classes of the secondary classifier.

I followed the tutorial at deepstream_python_apps/custom_parser_guide.md at master · NVIDIA-AI-IOT/deepstream_python_apps · GitHub and checked the forum issue "Get the full vector of probabilities for every class for one classifier in DeepStream".

But I am unable to retrieve the frame user meta list; the following always returns None:

l_user = frame_meta.frame_user_meta_list
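The traversal I am attempting looks roughly like this (a sketch based on the custom parser guide; the `pyds` bindings and the attachment of this probe to the sgie src pad are assumed):

```python
def sgie_src_pad_buffer_probe(pad, info, u_data):
    # Lazy imports so the sketch can be read standalone; both packages
    # are assumed to be installed (gi / pyds DeepStream bindings).
    from gi.repository import Gst
    import pyds

    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)

        # This is where I get stuck: the list below is always None.
        # (Note: for a secondary classifier running on objects, tensor
        # meta may be attached to obj_meta.obj_user_meta_list instead.)
        l_user = frame_meta.frame_user_meta_list
        while l_user is not None:
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
            if user_meta.base_meta.meta_type == \
                    pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META:
                tensor_meta = pyds.NvDsInferTensorMeta.cast(
                    user_meta.user_meta_data)
                # ... read the class probabilities from tensor_meta here ...
            try:
                l_user = l_user.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
```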

Please help me solve the issue.

• Hardware Platform (Jetson / GPU)
Jetson NX
• DeepStream Version
5.0
• JetPack Version (valid for Jetson only)
4.6-b199
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi @user78169 , could you attach a simplified version of the code you have written?

Hi @yuweiw . Thank you very much for the reply. Please find the source code in the attached file.
infepy.txt (22.9 KB)

Hi @user78169 , I cannot run your code in my env.
Did you try to get the data from the deepstream_lpr_app with the C code? If you can get the vector from the C code in our demo, the cause may be a Python binding issue, which you can check on your side. Thanks

hi @yuweiw Thank you for the reply.

I haven’t tried with C code.
I can run the Python code, but the issue is that I am not able to get the class probabilities.

@yuweiw may I upload the working directory to GitHub and provide you the link?

We suggest you try with the C code first and get the data you want; then you can transfer it to Python.
You can share your GitHub link, but if your project has too much code, it may take a very long time to debug. Thanks

@yuweiw Thank you for the suggestion. But I couldn’t fix the issue yet.

I have uploaded the working code in the following GitHub repository.

https://github.com/nixondutt/licence_plate_debug

This time it should work in your environment. All the necessary code is in the inference.py file.

Please help me solve the issue.
Thank you.

Hi @user78169 , yes, the project works in my env. From the log and your simplified code: you do not add any user meta in your code, nor do you set output-tensor-meta in your config file, so there is nothing in the user_meta structure.
You can refer to the link below to learn how to use output-tensor-meta:
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvinfer.html#id2
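The raw tensor output is enabled with a single key in the nvinfer config file; a minimal sketch of the relevant part of the classifier's [property] group (the other keys stand for your existing config):

```ini
[property]
# ... your existing keys (model-engine-file, labelfile-path, etc.) ...
# Attach this model's raw output tensors to the metadata as
# NvDsInferTensorMeta so they can be read in a pad probe:
output-tensor-meta=1
```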
You can also refer to the DeepStream demo code at:

/opt/nvidia/deepstream/deepstream-6.1/sources/apps/sample_apps/deepstream-user-metadata-test/deepstream_user_metadata_app.c
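In Python, once output-tensor-meta is set, the attached NvDsInferTensorMeta can be read inside your probe. A minimal sketch, assuming the pyds bindings; the layer index, value count, and whether your model emits probabilities or raw logits are assumptions you must verify against your own model:

```python
import ctypes
import math

def layer_to_scores(tensor_meta, layer_index, num_values):
    """Copy one output layer of a pyds.NvDsInferTensorMeta into a list.

    layer_index and num_values are assumptions -- verify them against
    your own model's outputs (e.g. by printing layer.layerName).
    """
    import pyds  # lazy import: only needed once real meta is passed in
    layer = pyds.get_nvds_LayerInfo(tensor_meta, layer_index)
    # Interpret the host output buffer as a flat array of C floats.
    ptr = ctypes.cast(pyds.get_ptr(layer.buffer),
                      ctypes.POINTER(ctypes.c_float))
    return [ptr[i] for i in range(num_values)]

def softmax(scores):
    """Plain-Python softmax, in case the model emits raw logits."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

Inside the probe you would call `layer_to_scores(tensor_meta, 0, num_classes)` and, if the output is not already normalized, pass the result through `softmax` to get the per-class probability vector.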