Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 6.2
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs) Question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for a new requirement. Include the module name, i.e. which plugin or sample application it concerns, and the function description.)
Hi, I am trying to save the inference results to Redis with the msg broker sink.
To do so, I did the following:
- I followed all the instructions on this page to set up Redis: Gst-nvmsgbroker — DeepStream 6.2 Release documentation. Then I created a consumer group named mygroup on a stream key named metadata (roughly as sketched right after this list).
- In deepstream_app_config, I set type=6 under the sink group and also added the msg-conv-config, msg-broker-proto-lib, and msg-broker-conn-str properties.
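For reference, the consumer group was created with redis-cli roughly like this (the '$' start ID and the MKSTREAM option are my own choices, not something the DeepStream docs prescribe):
# create the stream key "metadata" if it does not exist yet, and a consumer group "mygroup" on it
redis-cli -h localhost -p 6379 XGROUP CREATE metadata mygroup '$' MKSTREAM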
The entire [sink0] group now looks like this:
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 6=MsgConvBroker
type=6
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0
msg-conv-config=cfg_redis.txt
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.2/lib/libnvds_redis_proto.so
msg-broker-conn-str=localhost;6379
where cfg_redis.txt looks like this:
[message-broker]
hostname=localhost
port=6379
payloadkey=metadata
consumergroup=mygroup
consumername=myname
streamsize=10000
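For context, this is roughly how I intended to read the results back, using the consumergroup and consumername values from the config above (this assumes the adapter publishes to a stream whose key is the payloadkey value metadata, which is just my reading of the setup and may be wrong):
# read up to 10 new entries as consumer "myname" of group "mygroup" ('>' is quoted so the shell does not treat it as a redirect)
redis-cli -h localhost -p 6379 XREADGROUP GROUP mygroup myname COUNT 10 STREAMS metadata '>'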
Then I ran the app and it finished successfully, but it does not appear to have saved any inference results to Redis.
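For what it's worth, I launched it with deepstream-app -c <my deepstream_app_config file> and then checked on the Redis side like this (again assuming the stream key is metadata):
# number of entries in the stream
redis-cli -h localhost -p 6379 XLEN metadata
# peek at the first few entries, if any exist
redis-cli -h localhost -p 6379 XRANGE metadata - + COUNT 5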
Am I missing something?