Unable to connect to broker library within a Triton docker container

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
x86, ubuntu 20.04, GeForce 1660Ti, CUDA 11.1
• DeepStream Version
5.1
• JetPack Version (valid for Jetson only)
• TensorRT Version
7.2.3-1
• NVIDIA GPU Driver Version (valid for GPU only)
460.32.03
• Issue Type( questions, new requirements, bugs)
** ERROR: main:655: Failed to set pipeline to PAUSED
Quitting
ERROR from sink_sub_bin_sink1: Could not configure supporting library.
Debug info: gstnvmsgbroker.c(388): legacy_gst_nvmsgbroker_start (): /GstPipeline:pipeline/GstBin:sink_sub_bin1/GstNvMsgBroker:sink_sub_bin_sink1:
unable to connect to broker library
ERROR from sink_sub_bin_sink1: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Debug info: gstbasesink.c(5265): gst_base_sink_change_state (): /GstPipeline:pipeline/GstBin:sink_sub_bin1/GstNvMsgBroker:sink_sub_bin_sink1:
Failed to start
App run failed
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
deepstream-test5 works when I run it locally, and the Kafka consumer receives the detection results. However, it does not work when I use the Triton Inference Server inside a Docker container.

I built a custom Docker image “custom_triton” based on 5.1-21.02-triton and ran the container as follows:

Dockerfile:
FROM nvcr.io/nvidia/deepstream:5.1-21.02-triton
ADD config_new.txt /opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app-trtis/
ADD dstest5_msgconv_sample_config.txt /opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app-trtis/
WORKDIR /opt/nvidia/deepstream/deepstream-5.1/
CMD ["/bin/bash"]

sudo docker run --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 --gpus all \
  -p 8554:8554 -it --rm \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -v /opt/nvidia/deepstream/deepstream-5.1/samples/trtis_model_repo:/opt/nvidia/deepstream/deepstream-5.1/samples/trtis_model_repo \
  -e DISPLAY=$DISPLAY custom_triton:latest

Here is my configuration file:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=kitti-trtis

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=2
uri=file://../../streams/sample_1080p_h264.mp4
num-sources=1
#drop-frame-interval=2
gpu-id=0
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=localhost;9092;guest_zyhe
topic=guest_zyhe
#Optional:
#msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[sink1]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File
type=1
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink2]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
sync=0
#iframeinterval=10
bitrate=2000000
output-file=out.mp4
source-id=0

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
live-source=0
batch-size=1
batched-push-timeout=40000
width=1920
height=1080
enable-padding=0
nvbuf-memory-type=0

[primary-gie]
enable=1
#(0): nvinfer; (1): nvinferserver
plugin-type=1
#infer-raw-output-dir=trtis-output
batch-size=1
interval=0
gie-unique-id=1
config-file=config_infer_primary_detector_ssd_inception_v2_coco_2018_01_28.txt

[tests]
file-loop=1

• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi,
For Kafka, the connection string has the format: host;port
Please change it accordingly.

msg-broker-conn-str=localhost;9092;guest_zyhe
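To illustrate the difference between the two forms, here is a minimal sketch in Python; the `split_conn_str` helper is hypothetical (not part of DeepStream) and only mirrors the documented `host;port` format, with the topic supplied separately via the `topic=` key:

```python
# Hypothetical helper to sanity-check a msg-broker-conn-str value before
# launching the app. The Kafka adapter expects "host;port"; the topic goes
# in the separate "topic=" config key rather than the connection string.
def split_conn_str(conn_str):
    parts = conn_str.split(";")
    if len(parts) == 2:
        host, port = parts
        return {"host": host, "port": int(port), "topic": None}
    if len(parts) == 3:  # older three-field form: host;port;topic
        host, port, topic = parts
        return {"host": host, "port": int(port), "topic": topic}
    raise ValueError("expected host;port or host;port;topic, got: " + conn_str)

print(split_conn_str("localhost;9092"))
```

With `msg-broker-conn-str=localhost;9092`, the topic must then come from `topic=guest_zyhe` in the `[sink0]` section.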

Hi,

Thanks for your reply. I have changed it to
msg-broker-conn-str=localhost;9092
topic=guest_zyhe

The issue remains the same. Any suggestions?

Where did you install the Kafka broker? Can you use the IP of the host where the Kafka broker is installed?

I followed the quick start guide to install and run the Kafka broker. I then followed another thread (https://forums.developer.nvidia.com/t/using-kafka-protocol-for-retrieving-data-from-a-deepstream-pipeline/67626/4) to run the Kafka broker in a Docker container, and started the DeepStream container on the same network as Kafka. The “unable to connect to broker library” error is now gone, but my Kafka consumer still receives nothing from DeepStream. My commands are shown below:

For Kafka:
sudo docker run -d --name zookeeper -p 2181:2181 --network kafka_net zookeeper:latest
sudo docker run -d --name kafka -p 9092:9092 --network kafka_net --env ZOOKEEPER_IP=zookeeper ches/kafka
sudo docker run --rm --network kafka_net ches/kafka \
  kafka-topics.sh --create --topic USER_CREATED_TOPIC --replication-factor 1 --partitions 1 --zookeeper zookeeper:2181

Start the consumer:
sudo docker run --rm --network kafka_net ches/kafka \
  kafka-console-consumer.sh --topic USER_CREATED_TOPIC --from-beginning --bootstrap-server kafka:9092

For DeepStream:
sudo docker run --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 --gpus all \
  -p 8554:8554 --net kafka_net -it --rm \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -v /opt/nvidia/deepstream/deepstream-5.1/samples/trtis_model_repo:/opt/nvidia/deepstream/deepstream-5.1/samples/trtis_model_repo \
  -e DISPLAY=$DISPLAY -e ZOOKEEPER_IP=zookeeper custom_triton:latest

And my msg-broker-conn-str is changed to kafka;9092;USER_CREATED_TOPIC

I believe the DeepStream container can find the Kafka server, but the consumer does not receive anything. Could you give me some hints?
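Before digging into the pipeline itself, it may help to confirm that the broker port is actually reachable from inside the DeepStream container. A minimal sketch using only the Python standard library; the `kafka`/`9092` values are the ones from this thread and assume the container is attached to the `kafka_net` network (where the `kafka` container name resolves via Docker DNS):

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # "kafka" only resolves from containers on the kafka_net Docker network.
    print("broker reachable:", can_connect("kafka", 9092))
```

If this prints False, the problem is networking (wrong network, advertised listeners, or port mapping) rather than the DeepStream message broker plugin.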

Please enable logging via sources/tools/nvds_logger/setup_nvds_logger.sh and run again to check the log for more info. You can set the log level to 7 for debug.