We are probing the source pad of the secondary inference engine (SGIE) and are able to read the class ID and the width/height.
However, obj_user_meta_list is None, and we expect it to hold the [1x512] raw output tensor. What could be wrong? We do get class 1, for example, and the SGIE should be operating on class 1, but we are not getting the output tensor via frame_meta_list > obj_meta_list > obj_user_meta_list.
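For context, here is roughly the shape of our probe, trimmed down to a minimal sketch in the style of the tensor-meta examples in deepstream_python_apps. It assumes `output-tensor-meta=1` is set in the SGIE config (nvinfer only attaches NvDsInferTensorMeta when that property is enabled) and a single 512-float output layer; the official samples also wrap each cast and `.next` in try/except StopIteration, omitted here for brevity:

```python
import ctypes
import numpy as np
import pyds
from gi.repository import Gst

def sgie_src_pad_buffer_probe(pad, info, u_data):
    """Pull the SGIE's raw [1x512] tensor off each object's user meta."""
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            l_user = obj_meta.obj_user_meta_list
            while l_user is not None:
                user_meta = pyds.NvDsUserMeta.cast(l_user.data)
                if (user_meta.base_meta.meta_type
                        == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META):
                    tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
                    # Assumes the model has a single output layer of 512 floats.
                    layer = pyds.get_nvds_LayerInfo(tensor_meta, 0)
                    ptr = ctypes.cast(pyds.get_ptr(layer.buffer),
                                      ctypes.POINTER(ctypes.c_float))
                    embedding = np.ctypeslib.as_array(ptr, shape=(512,)).copy()
                    print(obj_meta.class_id, embedding[:4])
                l_user = l_user.next
            l_obj = l_obj.next
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK
```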
Thank you @fanzh, the example was in fact exactly what we were trying to accomplish, and the solution was to use a larger batch-size.
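For anyone finding this later, a sketch of the relevant settings. `batch-size`, `output-tensor-meta`, and `config-file-path` are standard nvinfer properties; the element name, config path, and the value 16 are just illustrative:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

sgie = Gst.ElementFactory.make("nvinfer", "secondary-inference")
sgie.set_property("config-file-path", "sgie_config.txt")  # hypothetical path
# Equivalently, batch-size= and output-tensor-meta= under [property] in the config file.
sgie.set_property("batch-size", 16)            # large enough to cover objects per batch
sgie.set_property("output-tensor-meta", True)  # attach raw tensors to obj_user_meta_list
```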
Would it be possible to achieve the same thing by performing the post-processing in a probe on the source pad of the primary inference plugin instead of using a post-processor library? I can open another ticket if needed.
Yes, this is a new issue: instead of a post-processor library, we are using a source-pad probe to do the post-processing, like the example from the deepstream_python_apps repo.
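Roughly what that probe looks like for us, again as a sketch. Note that for a full-frame PGIE the raw tensors land in frame_meta.frame_user_meta_list rather than in per-object meta; the `pgie` element name is an assumption, and the actual decoding of the output layers into boxes/scores is model-specific, so it is left as a comment:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def pgie_src_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        # Full-frame PGIE tensors are attached at frame level.
        l_user = frame_meta.frame_user_meta_list
        while l_user is not None:
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
            if (user_meta.base_meta.meta_type
                    == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META):
                tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
                layers = {}
                for i in range(tensor_meta.num_output_layers):
                    layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
                    layers[layer.layerName] = layer
                # Model-specific decoding of `layers` into detections goes here;
                # results can be added back to the frame with
                # pyds.nvds_acquire_obj_meta_from_pool() and
                # pyds.nvds_add_obj_meta_to_frame().
            l_user = l_user.next
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK

# Attach to the primary nvinfer's src pad ("pgie" assumed defined earlier).
pgie_src_pad = pgie.get_static_pad("src")
pgie_src_pad.add_probe(Gst.PadProbeType.BUFFER, pgie_src_pad_buffer_probe, 0)
```

This mirrors the deepstream-ssd-parser sample from deepstream_python_apps, where nvinfer's built-in parsing is bypassed and the raw tensors are decoded in Python.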