gst-launch-1.0 jpeg processing with nvinfer - how to configure nvmsgbroker for json output to console

Hello,
I’m configuring a pipeline for DeepStream classification of JPEG images. I use the YOLOv3 model from the DeepStream SDK example (converted to a TensorRT engine for better performance, of course) and want to classify the objects in a JPEG file into text output (and save the original JPEG file, not a JPEG with boxes and labels drawn on it). Below is my gst-launch-1.0 command:

gst-launch-1.0 giosrc location=http://server-name/image.jpg ! tee name=t ! queue ! filesink location=~/test/`date +%F-%H-%M-%S`.jpg t. ! nvinfer config-file-path=./config_infer_primary_yoloV3.txt batch-size=1 ! nvmsgbroker

Which arguments does nvmsgbroker need in order to redirect the JSON output to the console? And is that actually possible?

Thanks!

You can refer to sources/objectDetector_Yolo + deepstream-app.
deepstream-app also supports image input.
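For reference, here is a minimal C sketch of a pipeline nvinfer can actually negotiate with: nvinfer only accepts batched NVMM buffers, so the JPEG has to be decoded, converted and pushed through nvstreammux first; raw JPEG bytes will not link. Element names assume DeepStream 4.0; sample.jpg, the 1280x720 mux resolution and the config path are placeholders, and the fakesink stands in for whatever consumes the results (a probe can print them, see further down the thread).

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GError *error = NULL;
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* Decode the JPEG, convert it to NVMM NV12 and batch it through
   * nvstreammux, which is what nvinfer requires on its sink pad. */
  pipeline = gst_parse_launch (
      "filesrc location=sample.jpg ! jpegdec ! nvvideoconvert ! "
      "video/x-raw(memory:NVMM),format=NV12 ! m.sink_0 "
      "nvstreammux name=m batch-size=1 width=1280 height=720 ! "
      "nvinfer config-file-path=config_infer_primary_yoloV3.txt ! "
      "fakesink", &error);
  if (!pipeline) {
    g_printerr ("Parse error: %s\n", error->message);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until the single image has been processed (EOS) or an
   * error occurs, then shut the pipeline down. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}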

You can add “gst-dsexample” (sources/gst-plugins/gst-dsexample/README) to dump the original buffer and the metadata (inference results).
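To illustrate, such a dump could walk the batch metadata attached to each GstBuffer along the lines of the sketch below. The metadata API matches DeepStream 4.0 (gstnvdsmeta.h); the helper name print_objects and its call site (e.g. inside gst_dsexample_transform_ip()) are hypothetical.

#include <gst/gst.h>
#include "gstnvdsmeta.h"

/* Walk the DeepStream batch metadata attached to a buffer and print
 * every detected object to the console. Call this from e.g.
 * gst_dsexample_transform_ip() on the buffer it receives. */
static void
print_objects (GstBuffer *buf)
{
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  NvDsMetaList *l_frame, *l_obj;

  if (!batch_meta)
    return;

  for (l_frame = batch_meta->frame_meta_list; l_frame;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;

    for (l_obj = frame_meta->obj_meta_list; l_obj; l_obj = l_obj->next) {
      NvDsObjectMeta *obj = (NvDsObjectMeta *) l_obj->data;
      g_print ("frame %d: %s (%.2f) at [%.0f, %.0f, %.0f x %.0f]\n",
          (gint) frame_meta->frame_num, obj->obj_label, obj->confidence,
          obj->rect_params.left, obj->rect_params.top,
          obj->rect_params.width, obj->rect_params.height);
    }
  }
}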

I cannot see any output from the dsexample plugin in the console. I looked at the source code and couldn’t find any lines that print to the console (apart from debug code). On the other hand, the dsexample plugin’s sources contain things I absolutely don’t need, such as displaying metadata labels on top of a video.

As far as I can see, the CLI is quite impractical for real-world DeepStream applications, right?
Is coding in C++ the only way to work with the DeepStream SDK, or are there other ways?

redirect the JSON output to the console

nvmsgconv, which transforms the metadata into JSON, has its source code in sources/gst-plugins/gst-nvmsgconv; you can add a print there and rebuild a new libnvdsgst_msgconv.so.
The same goes for the dsexample plugin.
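For instance, a one-line print could go right where the plugin generates the payload, roughly as below. The surrounding variable names (self, eventMsg, payload) are an assumption based on the 4.0 gstnvmsgconv.c sources; the payload is not NUL-terminated, hence the %.*s.

/* Inside gst_nvmsgconv_transform_ip(), after the JSON payload has
 * been generated for an event, dump it to the console. */
NvDsPayload *payload = self->msg2p_generate (self->msg2pctx, eventMsg, 1);
if (payload) {
  g_print ("%.*s\n", (gint) payload->payloadSize,
      (const gchar *) payload->payload);
}

After rebuilding, replace the installed libnvdsgst_msgconv.so with the new one and the JSON will appear on stdout each time a message is converted.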

displaying metadata labels on top of a video

We have nvdsosd to display labels. Please refer to osd_sink_pad_buffer_probe() in deepstream-test1.
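Roughly, such a probe looks like the sketch below: it counts the objects in each frame and attaches a text overlay that nvdsosd renders onto the video. The metadata calls assume DeepStream 4.0; the label text and offsets are arbitrary, and a g_print of the same counter would send it to the console instead of the screen.

#include <gst/gst.h>
#include "gstnvdsmeta.h"

/* test1-style probe for the nvdsosd sink pad: count the objects in
 * each frame and attach a text overlay for nvdsosd to draw. */
static GstPadProbeReturn
osd_sink_pad_buffer_probe (GstPad *pad, GstPadProbeInfo *info,
    gpointer u_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  NvDsMetaList *l_frame, *l_obj;

  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  for (l_frame = batch_meta->frame_meta_list; l_frame;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
    NvDsDisplayMeta *display_meta;
    NvOSD_TextParams *txt;
    guint num_objects = 0;

    for (l_obj = frame_meta->obj_meta_list; l_obj; l_obj = l_obj->next)
      num_objects++;

    /* Acquire a display meta from the pool and fill in one label. */
    display_meta = nvds_acquire_display_meta_from_pool (batch_meta);
    display_meta->num_labels = 1;
    txt = &display_meta->text_params[0];
    txt->display_text = g_strdup_printf ("Objects = %u", num_objects);
    txt->x_offset = 10;
    txt->y_offset = 12;
    txt->font_params.font_name = "Serif";
    txt->font_params.font_size = 10;
    txt->font_params.font_color = (NvOSD_ColorParams) { 1.0, 1.0, 1.0, 1.0 };
    txt->set_bg_clr = 1;
    txt->text_bg_clr = (NvOSD_ColorParams) { 0.0, 0.0, 0.0, 1.0 };
    nvds_add_display_meta_to_frame (frame_meta, display_meta);
  }
  return GST_PAD_PROBE_OK;
}

Attach it with gst_pad_add_probe (osd_sink_pad, GST_PAD_PROBE_TYPE_BUFFER, osd_sink_pad_buffer_probe, NULL, NULL) on the nvdsosd element’s sink pad.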

coding in C++ the only way to work with the DeepStream SDK

Yes. I think we need to perfect the functionality and the extensions as a first step; C/C++ engineers will find the 4.0 SDK very helpful. In the next version we will provide Python bindings.