Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 6.0.1
• JetPack Version (valid for Jetson only): 4.6
Hi, I am using the deepstream_imagedata-multistream.py sample application. In this application I can get the frame, perform some operations on it, and save it.
My question is: is there any way to show this frame as output through the sink element of the pipeline?
For example, I want to draw bounding boxes for only one class and then show that frame as output through the sink.
Currently I can only draw the bounding boxes and save the frame; I cannot show it as the output of the pipeline.
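For context, what I do now follows the sample's probe. This is a simplified sketch; gst_buffer, frame_meta, and obj_meta come from the probe loops exactly as in deepstream_imagedata-multistream.py:

```python
import cv2
import numpy as np
import pyds

# Inside the buffer probe, per frame and per object (as in the sample):
n_frame = pyds.get_nvds_buf_surface(hash(gst_buffer), frame_meta.batch_id)  # RGBA view of the frame
frame_copy = np.array(n_frame, copy=True, order='C')        # work on a copy of the mapped surface
frame_copy = cv2.cvtColor(frame_copy, cv2.COLOR_RGBA2BGRA)

rect = obj_meta.rect_params
top, left = int(rect.top), int(rect.left)
cv2.rectangle(frame_copy, (left, top),
              (left + int(rect.width), top + int(rect.height)),
              (0, 0, 255, 0), 2)

# This only saves the annotated frame to disk; it never reaches the sink.
cv2.imwrite(f"frame_{frame_meta.batch_id}.jpg", frame_copy)
```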
What if, instead of using nvdsosd, you create a custom GStreamer element?
Take your image buffer, map it directly into a GpuMat, and then draw the boxes with a CUDA kernel.
You can refer to dsexample for how to build a custom GStreamer element.
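If the goal is only to have the pipeline itself render boxes for a single class, there is also a lighter-weight alternative to a full custom element: prune the object metadata before it reaches nvdsosd, so the OSD (and therefore the sink) only ever draws that class. This is a sketch in Python against the sample's probe structure, not the GpuMat/CUDA approach described above; the class id and the probe placement are placeholders you would adapt:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

WANTED_CLASS_ID = 0  # placeholder: the one class you want rendered

def filter_classes_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            try:
                next_obj = l_obj.next      # advance before removing the node
            except StopIteration:
                next_obj = None
            if obj_meta.class_id != WANTED_CLASS_ID:
                # Removed metadata is never drawn by nvdsosd, so the sink
                # only shows the class we kept.
                pyds.nvds_remove_obj_meta_from_frame(frame_meta, obj_meta)
            l_obj = next_obj
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

# Attach upstream of nvdsosd, e.g. on the OSD sink pad:
# osdsinkpad = nvosd.get_static_pad("sink")
# osdsinkpad.add_probe(Gst.PadProbeType.BUFFER, filter_classes_probe, 0)
```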
OK. You can also consider using the nvdsvideotemplate plugin. With this plugin you can process the image data into any other data you want. We have many demos that use this plugin, such as deepstream-emotion-app.
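For reference, a minimal sketch of dropping nvdsvideotemplate into the Python pipeline. The library path and property values here are placeholders; the actual per-frame processing lives in your own custom library built against the plugin's customlib interface (deepstream-emotion-app is the reference for that part), and the element names follow the imagedata sample:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# Hypothetical library name and properties; replace with your own build.
vtemplate = Gst.ElementFactory.make("nvdsvideotemplate", "custom-processing")
vtemplate.set_property("customlib-name", "./libcustom_videoimpl.so")  # placeholder .so path
vtemplate.set_property("customlib-props", "key:value")                # passed through to the lib

# Insert between the inference element and the OSD so the custom library
# sees frames and metadata before nvdsosd renders them.
pipeline.add(vtemplate)
pgie.link(vtemplate)
vtemplate.link(nvvidconv)
```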