• Hardware Platform (Jetson / GPU)
Jetson Orin AGX Dev Kit
RTX A4000, Driver Version 515.86.01, Dell XPS, Intel i7-10700 CPU
• DeepStream Version
6.1.1
• JetPack Version (valid for Jetson only)
5.0.2
We have a post-processing library that runs with a secondary inference (SGIE); with it, we are able to access the SGIE's output tensor in a source-pad probe. We are using a Python application.
The issue is that obj_meta.obj_user_meta_list shows as None when we implement the same post-processing logic in a source-pad probe on the primary inference instead of using the postproc library.
The object meta gets placed correctly, including rect_params, confidence, class_id, object_id, the object label, text_params, bbox info, and unique_component_id.
At the frame level, the number of objects is set correctly, bInferDone is set to 1, and the source width/height are correct. We can access the frame- and object-level metadata in the secondary inference's source-pad probe, as shown:
obj_user_meta_list None, height 375.0 width 162.0 class_id 1
top 233.0 left 271.0
bbox info: left 271.0406188964844
bbox info: top 233.3119354248047
bbox info: width 162.89999389648438
bbox info: height 375.16741943359375
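For reference, here is a sketch of the probe that produces the output above. Names like sgie_src_pad_buffer_probe are ours, and the code obviously needs DeepStream's pyds bindings and GStreamer on the host to actually run:

```python
# Sketch of our SGIE src-pad buffer probe (Python app, pyds bindings).
# The imports are done lazily so the file can be loaded on a machine
# without DeepStream; in the real app they sit at module level.
def sgie_src_pad_buffer_probe(pad, info, u_data):
    import pyds                    # DeepStream Python bindings
    from gi.repository import Gst  # GStreamer

    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            print("obj_user_meta_list", obj_meta.obj_user_meta_list,
                  "height", obj_meta.rect_params.height,
                  "width", obj_meta.rect_params.width,
                  "class_id", obj_meta.class_id)
            # This is where we expect per-object user meta of type
            # NVDSINFER_TENSOR_OUTPUT_META, but the list is None.
            l_user = obj_meta.obj_user_meta_list
            while l_user is not None:
                user_meta = pyds.NvDsUserMeta.cast(l_user.data)
                if (user_meta.base_meta.meta_type
                        == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META):
                    tensor_meta = pyds.NvDsInferTensorMeta.cast(
                        user_meta.user_meta_data)
                    # ... read the SGIE output layers from tensor_meta ...
                try:
                    l_user = l_user.next
                except StopIteration:
                    break
            try:
                l_obj = l_obj.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
```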
As you can see, obj_user_meta_list is None. We know the models are working and the configuration files are correct because, as mentioned, we are able to extract obj_user_meta_list in the post-processor library implementation.
So my question is: what's missing here? Is the secondary inference looking for some kind of frame_user_meta or obj_user_meta data?
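For completeness, our understanding (an assumption on our side, based on the gst-nvinfer property list) is that attaching raw tensor output to each object's obj_user_meta_list is gated by the SGIE config, roughly:

```
[property]
# Attach NvDsInferTensorMeta to the metadata (per object for an SGIE)
output-tensor-meta=1
# process-mode=2 marks this nvinfer instance as a secondary GIE
process-mode=2
```

Please correct us if this is not the relevant switch.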