Custom Bindings object to send custom payload in Deepstream 6.2

Can’t compile deepstream nvmsgconv file on Jetson.
root@ubuntu:/opt/nvidia/deepstream/deepstream-6.2/sources/libs/nvmsgconv# make install
protoc -I./deepstream_schema --cpp_out=./deepstream_schema/ Makefile
Makefile: File does not reside within any path specified using --proto_path (or -I). You must specify a --proto_path which encompasses this file. Note that the proto_path must be an exact prefix of the .proto file names -- protoc is too dumb to figure out when two paths (e.g. absolute and relative) are equivalent (it's harder than you think).
make: *** [Makefile:53: deepstream_schema/schema.pb.cc] Error 1

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson AGX Orin 32GB Developer Kit
• DeepStream Version: 6.2
• JetPack Version (valid for Jetson only): 5.1.1
Is there a tutorial or example for using custom binding objects embedded in a custom payload sent over Kafka?

schema.proto (6.9 KB)

This is a bug in version 6.2. It will be fixed in the next version.

You can put schema.proto into /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv/deepstream_schema

Then you can compile successfully

You can refer to the deepstream-test4 sample.
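For illustration, the deepstream-test4 pattern adapted to a custom object looks roughly like this (a sketch only: `alloc_custom_object()` and the `image_b64` field stand in for whatever your own custom bindings expose, and the copy/release callbacks are only stubbed here):

```python
import sys
import pyds

MAX_TIME_STAMP_LEN = 32

def meta_copy_func(data, user_data):
    # Deep-copy callback (see deepstream-test4 for the full version): it must
    # also copy extMsg and every string the custom object owns.
    ...

def meta_free_func(data, user_data):
    # Release callback: must free ts, the strings inside extMsg, then extMsg itself.
    ...

def attach_custom_msg_meta(batch_meta, frame_meta):
    # Allocate the event message meta that nvmsgconv serializes into the payload.
    msg_meta = pyds.alloc_nvds_event_msg_meta()
    msg_meta.sensorId = 0
    msg_meta.frameId = frame_meta.frame_num
    msg_meta.ts = pyds.alloc_buffer(MAX_TIME_STAMP_LEN + 1)
    pyds.generate_ts_rfc3339(msg_meta.ts, MAX_TIME_STAMP_LEN)

    # Custom object from your own bindings (hypothetical allocator and field name).
    custom_obj = pyds.alloc_custom_object()      # assumption: exposed by your custom bindings
    custom_obj.image_b64 = "<base64 string>"     # assumption: your string field
    msg_meta.extMsg = custom_obj
    msg_meta.extMsgSize = sys.getsizeof(custom_obj)

    # Hand the meta to DeepStream with copy/release callbacks, as in deepstream-test4.
    user_event_meta = pyds.nvds_acquire_user_meta_from_pool(batch_meta)
    if user_event_meta:
        user_event_meta.user_meta_data = msg_meta
        user_event_meta.base_meta.meta_type = pyds.NvDsMetaType.NVDS_EVENT_MSG_META
        pyds.user_copyfunc(user_event_meta, meta_copy_func)
        pyds.user_releasefunc(user_event_meta, meta_free_func)
        pyds.nvds_add_user_meta_to_frame(frame_meta, user_event_meta)
```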

Thanks

Thanks @junshengy. I added schema.proto and I am now able to send custom objects.
My Kafka library breaks when I send a huge string for a field (frame image bytes encoded as a base64 string). Is there an upper limit on STRING_PROPERTY size?
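For reference, that string is produced along these lines (a sketch only; the OpenCV JPEG step and the RGBA surface mapping are illustrative assumptions, not my exact code):

```python
import base64
import cv2
import numpy as np
import pyds

def frame_to_b64(gst_buffer, frame_meta):
    # Map the frame into numpy (the pipeline must output RGBA; on Jetson the
    # buffer surface must be in unified memory for get_nvds_buf_surface to work).
    frame = pyds.get_nvds_buf_surface(hash(gst_buffer), frame_meta.batch_id)
    bgr = cv2.cvtColor(np.asarray(frame), cv2.COLOR_RGBA2BGR)
    ok, jpeg = cv2.imencode(".jpg", bgr)
    return base64.b64encode(jpeg.tobytes()).decode("ascii") if ok else ""
```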

What does STRING_PROPERTY mean?

Someone else did something similar before: link

Can you share the stack trace and error log? I'm not sure if it's related to DeepStream.

Is your message bigger than 1 MB? If so, you may need additional configuration items for Kafka.
You can refer to it.
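Besides the broker's message.max.bytes and the topic's max.message.bytes, the consumer side typically also needs a larger fetch limit. A minimal sketch with confluent-kafka (broker address, topic, group id, and the byte values are placeholders):

```python
from confluent_kafka import Consumer

# Consumer tuned for large messages (values are illustrative, not prescriptive).
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "ds-meta-consumer",
    "auto.offset.reset": "earliest",
    "fetch.message.max.bytes": 10485880,  # raise the per-partition fetch limit above the 1 MB default
})
consumer.subscribe(["ds-meta"])

msg = consumer.poll(10.0)
if msg is not None and msg.error() is None:
    print(len(msg.value()), "bytes received")
consumer.close()
```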

I increased the Kafka message size. The message is around 5 MB, but it still fails with this error:
Error: gst-library-error-quark: GStreamer encountered a general supporting library error. (1): /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.cpp(522): legacy_gst_nvmsgbroker_render (): /GstPipeline:pipeline0/GstNvMsgBroker:nvmsg-broker:

I am running Python bindings code in a custom app similar to deepstream-test4, but instead of sending the vehicle and person objects I am sending custom objects. I can successfully send messages to the Kafka server as long as the messages are small. Is there any limitation on the DeepStream side for string properties of custom objects?

By STRING_PROPERTY I mean the string-typed field of my custom object (the one carrying the base64-encoded image).

Thanks @junshengy for helping out.

I solved the problem. Here is what it took:
Kafka side: update the server (broker), topic, and consumer configuration to allow large messages.
DeepStream app changes:
proto-cfg = "message.max.bytes=10485880" had to be added to cfg_kafka.txt,
I had to explicitly free the pointer for the field holding the huge string,
and I had to make sure that buffer is freed even though it is part of extMsg: because it is a huge string, freeing extMsg alone does not free it and leaks that memory (see the sketch below).
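For reference, that explicit free looks roughly like the deepstream-test4 release callback, extended to free the big string inside the custom object before releasing extMsg (a sketch; `CustomObject` and `image_b64` stand in for your own bindings and field):

```python
import pyds

def meta_free_func(data, user_data):
    # Release callback registered with pyds.user_releasefunc().
    user_meta = pyds.NvDsUserMeta.cast(data)
    srcmeta = pyds.NvDsEventMsgMeta.cast(user_meta.user_meta_data)

    # Buffers allocated with pyds.alloc_buffer() are freed with pyds.free_buffer().
    pyds.free_buffer(srcmeta.ts)

    if srcmeta.extMsgSize > 0:
        obj = pyds.CustomObject.cast(srcmeta.extMsg)  # assumption: your custom binding type
        pyds.free_buffer(obj.image_b64)               # free the huge base64 string itself...
        pyds.free_gbuffer(srcmeta.extMsg)             # ...because freeing extMsg alone leaves it behind
        srcmeta.extMsgSize = 0
```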

