Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only)
• TensorRT Version 7.2.1
• NVIDIA GPU Driver Version (valid for GPU only) 450.102.04
• Issue Type( questions, new requirements, bugs) Question
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
Add the following in deepstream-app.c:

NVGSTDS_ELEM_ADD_PROBE(pipeline->common_elements.primary_tensor_buffer_probe_id,
                       pipeline->common_elements.primary_gie_bin.bin, "src",
                       pgie_pad_buffer_probe, GST_PAD_PROBE_TYPE_BUFFER,
                       &pipeline->common_elements);
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
Hi,
I want to use the tensor meta data output in deepstream-app.c.
I have checked the "deepstream-infer-tensor-meta-test" sample for this.
I have enabled "output-tensor-meta=1" in the config file and copied the "pgie_pad_buffer_probe()" function into deepstream-app.c.
My question is:
Is setting "output-tensor-meta=1" by itself enough to get the tensor data out of the GstBuffer inside "pgie_pad_buffer_probe()", or do I also need to attach the probe like this -
if (config->primary_gie_config.enable) {
    NVGSTDS_ELEM_ADD_PROBE(pipeline->common_elements.primary_tensor_buffer_probe_id,
                           pipeline->common_elements.primary_gie_bin.bin, "src",
                           pgie_pad_buffer_probe, GST_PAD_PROBE_TYPE_BUFFER,
                           &pipeline->common_elements);
}
inside "deepstream-app.c"?
If I do attach the probe this way, I get a "Segmentation fault"; and if I don't, "pgie_pad_buffer_probe()" is never called.
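For reference, here is a minimal sketch of what the probe body can look like, modeled on the deepstream-infer-tensor-meta-test sample. It assumes output-tensor-meta=1 is set on the PGIE, that the tensor meta is attached as frame-level user meta (the full-frame PGIE case), and that the output layers are FP32; it needs the DeepStream 5.0 headers (gstnvdsmeta.h, gstnvdsinfer.h) to compile, so treat it as a starting point rather than a drop-in implementation:

```c
/* Sketch only: based on the deepstream-infer-tensor-meta-test sample.
 * Assumes output-tensor-meta=1 on the nvinfer element and FP32 outputs. */
static GstPadProbeReturn
pgie_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList * l_frame = batch_meta->frame_meta_list; l_frame;
      l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;

    /* Tensor output meta is attached as frame-level user meta. */
    for (NvDsMetaList * l_user = frame_meta->frame_user_meta_list; l_user;
        l_user = l_user->next) {
      NvDsUserMeta *user_meta = (NvDsUserMeta *) l_user->data;
      if (user_meta->base_meta.meta_type != NVDSINFER_TENSOR_OUTPUT_META)
        continue;

      NvDsInferTensorMeta *tensor_meta =
          (NvDsInferTensorMeta *) user_meta->user_meta_data;
      for (guint i = 0; i < tensor_meta->num_output_layers; i++) {
        NvDsInferLayerInfo *layer = &tensor_meta->output_layers_info[i];
        /* Raw host-side buffer for this output layer (FP32 assumed). */
        float *data = (float *) tensor_meta->out_buf_ptrs_host[i];
        g_print ("layer %u: %s, first value %f\n", i,
            layer->layerName, data ? data[0] : 0.0f);
      }
    }
  }
  return GST_PAD_PROBE_OK;
}
```

If the probe is attached before the pipeline elements are actually created (i.e. before create_pipeline() has populated primary_gie_bin), the bin pointer is NULL and NVGSTDS_ELEM_ADD_PROBE will segfault, which may explain the crash you are seeing.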
Thanks