Lightweight message queue

Currently I am using Kafka as an MQ for sending status messages. To elaborate: once recording is done, I send a message to the inference program to kick-start image processing on the already recorded/downloaded video.
Though Kafka is serving the purpose, it sometimes occupies around 500 MB of memory. Is there any other lightweight, reliable message queue solution?
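For context, the pattern we need is just a one-way status notification between two processes. A minimal broker-less sketch of that pattern using only Python's standard library (`multiprocessing.connection`) might look like this; the port, auth key, and message fields are illustrative, not our actual code:

```python
# Broker-less status channel: the inference side listens on a local
# socket, the recording side connects and sends one "done" message.
# All names here (port, authkey, message fields) are hypothetical.
import threading
from multiprocessing.connection import Listener, Client

ADDRESS = ("localhost", 6005)   # assumed free local port
AUTHKEY = b"status-channel"     # shared secret for both ends

received = []

# Bind before the client connects so there is no startup race.
listener = Listener(ADDRESS, authkey=AUTHKEY)

def inference_side():
    # Blocks until the recorder reports completion, then stores the message.
    with listener.accept() as conn:
        received.append(conn.recv())

t = threading.Thread(target=inference_side)
t.start()

# Recording side: announce that the video is ready for processing.
with Client(ADDRESS, authkey=AUTHKEY) as conn:
    conn.send({"event": "recording_done", "path": "/data/video_001.mp4"})

t.join()
listener.close()
print(received[0]["event"])  # recording_done
```

This carries a single status event with essentially zero memory footprint, though unlike Kafka it offers no persistence or replay if the listener is down when the message is sent.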

Hi,
A general use case with the DeepStream SDK is to run deep learning inference on a Jetson platform and send data to a Linux PC running a Kafka server. Not sure if that matches your use case. It seems you installed the Kafka server on a Jetson Nano?

Hi @DaneLLL

We installed the Kafka broker and ZooKeeper. Do you think we could use any other option in place of Kafka?

Thanks

Hi,
Please check
https://docs.nvidia.com/metropolis/deepstream/5.0DP/plugin-manual/index.html#page/DeepStream%20Plugins%20Development%20Guide/deepstream_plugin_details.3.14.html#wwpID0E0BQ0HA

There are adapters for Azure and AMQP. You may take these into consideration.

Hi,

We are not using the message queue to integrate with the cloud; rather, we are using it to track the status of recording as well as inference. Since we are not able to run inference in real time, we changed our logic to run inference once recording completes. This is precisely where we use Kafka: to communicate between the components.
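If the recorder and the inference trigger could eventually run inside one process, the broker could be dropped entirely and a standard-library queue could carry the "recording done" event. A minimal sketch of that hand-off logic, with purely illustrative names:

```python
# In-process hand-off: the recorder puts a "recording done" event on a
# stdlib queue; a worker thread picks it up and triggers inference.
# All names and paths here are hypothetical, for illustration only.
import queue
import threading

status_q = queue.Queue()
processed = []

def inference_worker():
    # Consumes status events until it sees the shutdown sentinel.
    while True:
        msg = status_q.get()
        if msg is None:          # sentinel: no more recordings
            break
        # ... run image processing on msg["path"] here ...
        processed.append(msg["path"])
        status_q.task_done()

worker = threading.Thread(target=inference_worker)
worker.start()

# Recorder side: announce each finished recording, then shut down.
status_q.put({"event": "recording_done", "path": "/data/clip_001.mp4"})
status_q.put({"event": "recording_done", "path": "/data/clip_002.mp4"})
status_q.put(None)
worker.join()
print(processed)  # ['/data/clip_001.mp4', '/data/clip_002.mp4']
```

This only works when both components share one process; for separate processes, a small socket-based channel or a lighter broker (e.g. an MQTT broker such as Mosquitto) would fill the same role with a far smaller footprint than Kafka plus ZooKeeper.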