I have a detector linked to a tracker, which is linked to a classifier.
The detector detects characters.
The tracker tracks them and assigns a unique ID to each one.
The classifier classifies each character into the appropriate class ('A', 'B', ...).
I just want the full output vector of the classifier's last (softmax) layer.
It has dimensions [1, 26] and contains [probability of belonging to 'A', probability of belonging to 'B', ...].
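Once you have a pointer to that layer's raw output (for example the buffer exposed by an `NvDsInferLayerInfo`), the 26 float32 values can be copied out with `ctypes`. A minimal sketch; the demo buffer below is synthetic, standing in for the real layer output:

```python
import ctypes

NUM_CLASSES = 26  # one probability per letter 'A'..'Z'

def read_softmax_vector(buffer_ptr, num_classes=NUM_CLASSES):
    """Copy num_classes float32 values from a raw buffer address
    (e.g. the buffer field of an NvDsInferLayerInfo) into a list."""
    arr = ctypes.cast(buffer_ptr, ctypes.POINTER(ctypes.c_float * num_classes)).contents
    return list(arr)

# Demo with a synthetic buffer standing in for the real layer output
raw = (ctypes.c_float * NUM_CLASSES)()
raw[1] = 1.0  # pretend the classifier is certain it saw a 'B'
vec = read_softmax_vector(ctypes.addressof(raw))
best = max(range(NUM_CLASSES), key=lambda i: vec[i])
print(len(vec), chr(ord('A') + best))  # 26 B
```

In a real probe you would pass the layer's buffer address instead of `ctypes.addressof(raw)`; everything else stays the same.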
The example does not work; it generates this error:
(python3:16826): GStreamer-WARNING **: 09:09:46.960: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstlibav.so': /usr/lib/aarch64-linux-gnu/libgomp.so.1: cannot allocate memory in static TLS block
Unable to create Encoder
If the following error is encountered:
/usr/lib/aarch64-linux-gnu/libgomp.so.1: cannot allocate memory in static TLS block
Preload the offending library:
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1
Traceback (most recent call last):
  File "deepstream_ssd_parser.py", line 458, in <module>
    sys.exit(main(sys.argv))
  File "deepstream_ssd_parser.py", line 363, in main
    encoder.set_property("bitrate", 2000000)
AttributeError: 'NoneType' object has no attribute 'set_property'
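The AttributeError means the element factory returned None for the encoder (its plugin failed to load, which ties back to the libgstlibav.so warning above). One way to fail at the point of creation rather than later is a small guard helper. This is a sketch of the pattern only; the dictionary-backed factory below is a stand-in so the example is self-contained, and in a real app you would pass `Gst.ElementFactory.make` instead:

```python
def make_element(factory_make, kind, name):
    """Create a pipeline element and fail loudly here, instead of
    raising AttributeError later when set_property is called on None."""
    elem = factory_make(kind, name)
    if elem is None:
        raise RuntimeError(f"Unable to create {name} ({kind}): is its plugin loadable?")
    return elem

# Stand-in factory: pretend only the NVIDIA H.264 encoder plugin loaded.
available = {"nvv4l2h264enc": "encoder-element"}
fake_make = lambda kind, name: available.get(kind)

encoder = make_element(fake_make, "nvv4l2h264enc", "encoder")
print(encoder)  # encoder-element

try:
    make_element(fake_make, "avenc_mpeg4", "encoder")  # plugin missing
except RuntimeError as err:
    print("caught:", err)
```

With the real `Gst.ElementFactory.make`, the RuntimeError would point you straight at the unloadable plugin instead of a bitrate line 100 lines later.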
Go to the samples directory and run the following command.
$ ./prepare_ds_trtis_model_repo.sh
All the sample models should be downloaded/generated into
samples/trtis-model-repo directory.
I ran this sample. It is very heavy on the GPU (it is based on the Triton server).
The l_user variable is not None; it has this value:
l_user <pyds.GList object at 0x7eccd66730>
But I ran another sample, deepstream-test2-tensor-meta (which runs on TensorRT, like my example), and there the l_user variable can be retrieved. It prints this message on the screen:
Inside l_user = obj_meta.obj_user_meta_list Loop
By the way, it shows these warning messages while running my app:
Unknown or legacy key specified 'output_tensor_meta' for group [property]
Unknown or legacy key specified 'is-classifier' for group [property]
Yes, that is fixed, but it did not resolve the issue. I needed to add another probe function, attached to the sink pad of the plugin that comes right after the classifier. It works now: l_user is no longer always None; it returns data related to the detected character.
Thanks for your help.
I have another issue presented here: Read the buffer data recuperated from NvDsInferLayerInfo (in Python)
Yes, it is solved. The solution is to put another SGIE probe on the sink pad of the element that directly follows the classifier plugin. In my case that was the sink pad of nvvidconv:
vidconvsinkpad = nvvidconv.get_static_pad("sink")
if not vidconvsinkpad:
    sys.stderr.write(" Unable to get sink pad of nvvidconv \n")
vidconvsinkpad.add_probe(Gst.PadProbeType.BUFFER, sgie_sink_pad_buffer_probe, 0)
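Inside the probe, the work is walking obj_user_meta_list, a GList-style linked list, to reach the classifier's tensor output. In a real app each node's payload goes through the pyds casts (pyds.NvDsUserMeta.cast and pyds.NvDsInferTensorMeta.cast, as in the deepstream-test2 tensor-meta sample); the sketch below mimics the list nodes with plain Python objects so only the traversal logic is shown:

```python
class GListNode:
    """Stand-in for a pyds GList node: .data payload, .next link."""
    def __init__(self, data, next=None):
        self.data = data
        self.next = next

def collect_user_meta(l_user):
    """Walk an obj_user_meta_list-style linked list, collecting payloads."""
    found = []
    while l_user is not None:
        # Real code would cast here: pyds.NvDsUserMeta.cast(l_user.data)
        found.append(l_user.data)
        l_user = l_user.next
    return found

# Demo: two user-meta entries attached to one tracked object
metas = GListNode("tensor-output-meta", GListNode("classifier-meta"))
print(collect_user_meta(metas))  # ['tensor-output-meta', 'classifier-meta']
```

The key point from the fix above is that this list is only populated at a pad downstream of the classifier; probing an earlier pad is why l_user kept coming back None.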