Error running the test5 app with the Azure IoT Edge config file on DeepStream 6

I am trying to run the test5 app with the test5_config_file_src_infer_azure_iotedge.txt config file, but I am getting the errors below:
(deepstream-test5-app:1): GLib-CRITICAL **: 09:18:54.676: g_strrstr: assertion ‘haystack != NULL’ failed
** ERROR: main:1455: Failed to set pipeline to PAUSED
Quitting
ERROR from sink_sub_bin_sink2: Could not initialize supporting library.
Debug info: gstnvmsgbroker.cpp(373): legacy_gst_nvmsgbroker_start (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
unable to open shared library
ERROR from sink_sub_bin_sink2: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Debug info: gstbasesink.c(5265): gst_base_sink_change_state (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
Failed to start
App run failed

I tried with the devel and samples dockers too; both give the same errors.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)

Here are the details:
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 6.0
• JetPack Version (valid for Jetson only): not applicable (GPU platform)
• TensorRT Version: 8.0.1
• NVIDIA GPU Driver Version (valid for GPU only): 470.57
• Issue Type (questions, new requirements, bugs): bug
• How to reproduce the issue?: run the test5 app with the Azure config file from the docker container (see the command sketch below)
• Requirement details: issue with the message broker library
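
For reference, the reproduction inside the container is roughly the following (paths assume the default DeepStream 6.0 layout and may differ in your setup):

cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test5
./deepstream-test5-app -c configs/test5_config_file_src_infer_azure_iotedge.txt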

Can deepstream-test1-app run correctly on your platform?

Hi @Fiona.Chen and @kayccc, deepstream-test1-app works correctly inside the docker container. deepstream-test5-app also works properly without the message broker config.
While using the IoT Edge config for test5, it gives the error below:

(deepstream-test5-app:1): GLib-CRITICAL **: 09:34:43.692: g_strrstr: assertion 'haystack != NULL' failed
** ERROR: <main:1455>: Failed to set pipeline to PAUSED
Quitting
ERROR from sink_sub_bin_sink2: Could not initialize supporting library.
Debug info: gstnvmsgbroker.cpp(373): legacy_gst_nvmsgbroker_start (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
unable to open shared library
ERROR from sink_sub_bin_sink2: GStreamer error: state change failed and some element failed to post a proper error message with the reason for the failure.
Debug info: gstbasesink.c(5265): gst_base_sink_change_state (): /GstPipeline:pipeline/GstBin:sink_sub_bin2/GstNvMsgBroker:sink_sub_bin_sink2:
Failed to start
App run failed

Any update on this? I am also facing the same issue with the DeepStream 6 docker container. I have checked that the libnvds_azure_edge_proto.so and libnvds_msgconv.so files are present in /opt/nvidia/deepstream/deepstream-6.0/lib. Except for the message broker sink, all other sinks work fine in the test5 app.
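
For example, the presence check can be done with something like:

ls -l /opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_azure_edge_proto.so \
      /opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_msgconv.so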

Hi @Fiona.Chen and @kayccc, I tried the nvcr.io/nvidia/deepstream:6.0-iot, nvcr.io/nvidia/deepstream:6.0-samples, and nvcr.io/nvidia/deepstream:6.0-devel docker images, but all of them give the same error.
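
A typical launch for these images looks like the following (flags are assumptions and may need adjusting for your environment):

docker run --gpus all -it --rm --network host nvcr.io/nvidia/deepstream:6.0-devel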

@Fiona.Chen and @kayccc, do you need any extra details from my side?
I have installed everything on the machine as per the DS 6 requirements.

With the deepstream-test4 app and the libnvds_azure_proto.so library, I got slightly different error messages, but still related to gstnvmsgbroker and "unable to open shared library". My error was due to wrong IoT Hub and IoT Edge device configs.

To solve it, I followed this guide up to the "Verify successful configuration" step. You will need to do some extra steps to deploy your DeepStream container as a module on your Edge device.

Then, use your device's primary connection string from IoT Edge as the connection string in your cfg file.
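
For the test5 app, the connection string goes in the message broker sink section of the config file. A minimal sketch, with placeholders in angle brackets and the library path assuming the DeepStream 6.0 defaults (note: the Edge variant, libnvds_azure_edge_proto.so, normally takes its identity from the IoT Edge runtime rather than a connection string):

[sink1]
enable=1
# type 6 = message broker sink
type=6
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_azure_proto.so
# device's primary connection string from IoT Hub
msg-broker-conn-str=HostName=<my-hub>.azure-devices.net;DeviceId=<my-device>;SharedAccessKey=<my-key>
topic=<my-topic>
msg-conv-config=dstest5_msgconv_sample_config.txt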

Hi,
I referred to the same guide for the IoT installation, and I have the latest version of iotedge.
I also added the primary connection string to the cfg file.
But I am still facing the same error.

The error message shows that the proto_lib cannot be opened correctly. Can you try to run the deepstream-test5 case with root privilege?
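For example (assuming the default test5 app directory):

sudo ./deepstream-test5-app -c configs/test5_config_file_src_infer_azure_iotedge.txt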

@Fiona.Chen I am running deepstream-test5-app inside docker with the IoT Edge message broker sink, so we can't run the app separately with root privileges.

Perhaps the Azure server is not configured and running correctly, and DeepStream failed to connect to it.

I tested the same Azure server with DeepStream 5 and it runs correctly; the connection between Azure IoT Edge and DS 5 works properly.

So the Azure server is correctly configured, but I am facing issues with DS 6.

Please use the commands below to get more logs.
#check edge runtime status
systemctl status iotedge

#List the modules running
sudo iotedge list

#check output from the modules
sudo iotedge logs test_async

Can you try "ldd libnvds_azure_edge_proto.so" in the container?

It seems libcurl3 is missing in the container. You may try to install libcurl3 and check whether that resolves this issue.
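
Something like the following inside the container may help (package name varies with the base image; on newer Ubuntu releases libcurl4 / libcurl4-openssl-dev replaces libcurl3):

# check which shared object the proto library fails to resolve
ldd /opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_azure_edge_proto.so | grep "not found"

# install the missing libcurl
apt-get update && apt-get install -y libcurl3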