Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) T4
• DeepStream Version 6.1
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.2
• NVIDIA GPU Driver Version (valid for GPU only) 11.6
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs; include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements; include the module name, i.e. for which plugin or which sample application, and the function description.)
I'm using the post-processor plugin described in Gst-nvdspostprocess in DeepStream — DeepStream 6.1.1 Release documentation.

In /opt/nvidia/deepstream/deepstream-6.1/sources/gst-plugins/gst-nvdspostprocess (inside my container, 39026ff1831b) I modified the config_detector.yml file and ran the build; libnvdsgst_postprocess.so was generated. I also built the library under gst-nvdspostprocess/postprocesslib_impl, which generated libpostprocess_impl.so.

I have linked the postprocess element into the pipeline, and I'm running inference with a custom YOLOv4 TLT model, but I'm not able to get the inference output (metadata).
Below is how the pipeline is linked:
srcpad.link(sinkpad)
streammux.link(queue2)
queue2.link(pgie)
pgie.link(nvvidconv1)
nvvidconv1.link(filter1)
filter1.link(tiler)
tiler.link(nvvidconv)
nvvidconv.link(postprocess)
postprocess.link(nvosd)
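For completeness, here is roughly how the postprocess element is created and configured before linking (a sketch, not the exact code from my app; I'm assuming the property names postprocesslib-name and postprocesslib-config-file from the Gst-nvdspostprocess README, and the paths are from my container):

postprocess = Gst.ElementFactory.make("nvdspostprocess", "postprocess")
if not postprocess:
    sys.stderr.write(" Unable to create nvdspostprocess \n")

# Custom parsing library built under postprocesslib_impl (path assumed)
postprocess.set_property(
    "postprocesslib-name",
    "/opt/nvidia/deepstream/deepstream-6.1/sources/gst-plugins/"
    "gst-nvdspostprocess/postprocesslib_impl/libpostprocess_impl.so")
# Modified YAML config (helmet.yaml is attached below)
postprocess.set_property("postprocesslib-config-file", "helmet.yaml")
pipeline.add(postprocess)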
And here is how the probe is attached:
postprocess_sink_pad = postprocess.get_static_pad("sink")
if not postprocess_sink_pad:
    sys.stderr.write(" Unable to get sink pad of postprocess \n")
    return
else:
    postprocess_sink_pad.add_probe(Gst.PadProbeType.BUFFER, tiler_sink_pad_buffer_probe, 0)
Inside the probe, l_obj is always None, so I never see any object metadata.
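For reference, tiler_sink_pad_buffer_probe walks the batch metadata in the standard pyds pattern (a minimal sketch of the relevant part; the full function is in the attached app_PostProcessor.py):

import pyds

def tiler_sink_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    # Retrieve the batch metadata attached to the Gst buffer
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list  # <-- this is always None for me
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            print(obj_meta.class_id, obj_meta.confidence)
            try:
                l_obj = l_obj.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK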
Attached are the labels file, the nvinfer config file, the postprocess YAML file, and the app:
helmetLabels.txt (27 Bytes)
config_infer_postprocess.txt (1.1 KB)
helmet.yaml (1.1 KB)
app_PostProcessor.py (33.7 KB)