Deepstream-test5 doesn't receive MQTT message to trigger smart video recording

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 6.3
• TensorRT Version 8.6.1.6-1+cuda12.0
• NVIDIA GPU Driver Version (valid for GPU only) 535.104.12

• Issue Type( questions, new requirements, bugs) bugs
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

Problem Statement: When I publish an MQTT message to trigger smart video recording on the deepstream-test5 reference graph, the app runs without errors, but no video is recorded. Here’s what I have done to troubleshoot so far:

The original deepstream-test5 reference graph works on my GPU, but it uses the Kafka protocol. I want to use the MQTT protocol instead, so I made the following two changes in the graph:
  • Proto-lib: path to ‘libnvds_mqtt_proto.so’
  • Conn-str: 127.0.0.1;1883

The app runs without any complaints, but when the MQTT message was published to the Mosquitto broker, no video was recorded. I used the Paho MQTT client to publish messages, and I confirmed that the message was sent successfully using the command mosquitto_sub -t test-topic-sr.
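For reference, here is roughly what my publisher looks like (a minimal sketch, assuming paho-mqtt 1.x and the broker on localhost:1883; the payload follows the smart-record start-recording format shown later in this thread):

# Minimal Paho publisher sketch: publish one smart-record start message.
import json
from datetime import datetime, timezone

import paho.mqtt.publish as publish

payload = {
    "command": "start-recording",
    "start": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    "sensor": {"id": "HWY_20_AND_LOCUST__WBA__4_11_2018_4_59_59_379_AM_UTC-07_00"},
}

# publish.single() handles connect/publish/disconnect in one call.
publish.single("test-topic-sr", json.dumps(payload), hostname="127.0.0.1", port=1883)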

I have also confirmed that the deepstream-test5 client is connected to the Mosquitto broker using the command journalctl -xefu mosquitto.service.

Next, I used Wireshark to monitor the network connection, and I can see that the deepstream-test5 client sent a SUBSCRIBE request.

So far, everything seems to be working fine. Finally, I enabled the debug flag in the deepstream-test5 “message broker cloud2device receiver” plugin. I expected to see “Received Message Topic: ‘test-topic-sr’ Message: …” printed in the terminal, but it never appeared. My hypothesis is that, for some reason, the MQTT message was never received by the “message broker cloud2device receiver” plugin, and I wonder why. Any help would be appreciated. Thank you.

Side note (not sure if it matters): I installed the MQTT protocol adapter following the instructions provided, but later upgraded Mosquitto to version 2.0.18.

Sorry for the late response. We are investigating the problem. Have you tried the deepstream-test5 C/C++ sample? Can the deepstream-test5 C/C++ sample enable smart recording via the MQTT message?

For DeepStream 6.3, the corresponding Mosquitto version is 1.6.15. Can you use the same version?

Thank you for the reply. I tried Mosquitto 1.6.15 but it didn’t solve the issue. Please find the details below:

  • I installed mosquitto-1.6.15 and copied libmosquitto* to the deepstream/lib directory.
  • I killed the running Mosquitto broker and started the new one.
  • I have a Paho client connected to the broker to publish the topic ‘test-topic-sr’, shown below at 1710452459.
  • I also have a DeepStream app client connected to the broker to subscribe to that topic, shown below at 1710452470.
  • Since the DeepStream client subscribed but never received anything, it was eventually kicked out by the broker, shown below at 1710452561.
1710452449: mosquitto version 1.6.15 starting
1710452449: Using default config.
1710452449: Opening ipv4 listen socket on port 1883.
1710452449: Opening ipv6 listen socket on port 1883.
1710452449: mosquitto version 1.6.15 running
1710452449: New connection from 127.0.0.1 on port 1883.
1710452449: New client connected from 127.0.0.1 as mosq-QqrKKjDBhsPMPKCb16 (p2, c1, k60).
1710452459: New connection from 127.0.0.1 on port 1883.
1710452459: New client connected from 127.0.0.1 as python-mqtt-817 (p2, c1, k60).  # paho 
1710452470: New connection from 127.0.0.1 on port 1883.
1710452470: New client connected from 127.0.0.1 as uniqueID (p2, c1, k60).  # deepstream 
1710452561: Client uniqueID has exceeded timeout, disconnecting.
  • The Paho client is publishing ‘test-topic-sr’, confirmed by calling mosquitto_sub with the output shown below:
    test-topic-sr {"command": "start-recording", "start": "2024-03-14T21:43:19Z", "sensor": {"id": "HWY_20_AND_LOCUST__WBA__4_11_2018_4_59_59_379_AM_UTC-07_00"}}

  • When I run the graph on x86_64, here is the output from the terminal.
    I can tell the client sent CONNECT and SUBSCRIBE, but there is no log showing it received the topic.

Graphs: mqtt_deepstream-test5.yaml
Target: ../common/target_x86_64.yaml
===================================================================
Running mqtt_deepstream-test5.yaml
===================================================================
[INFO] Writing manifest to /tmp/ds.mqtt_deepstream-test5/manifest.yaml 
2024-03-14 14:41:10.616 INFO  gxf/gxe/gxe.cpp@182: Creating context
2024-03-14 14:41:10.774 INFO  gxf/gxe/gxe.cpp@107: Loading app: '/home/lam/nvgraph/deepstream-test5/mqtt_deepstream-test5.yaml'
2024-03-14 14:41:10.774 INFO  gxf/std/yaml_file_loader.cpp@170: Loading GXF entities from YAML file '/home/lam/nvgraph/deepstream-test5/mqtt_deepstream-test5.yaml'...
2024-03-14 14:41:10.784 INFO  gxf/gxe/gxe.cpp@259: Initializing...
2024-03-14 14:41:10.794 INFO  extensions/nvdsbase/nvds_scheduler.cpp@270: This program is linked against GStreamer 1.16.3 

2024-03-14 14:41:10.795 INFO  extensions/nvdsmuxdemux/nvstreammux.hpp@27: initialize: nvstreammux streammux

2024-03-14 14:41:10.795 INFO  extensions/nvdstracker/nvtrackerbin.hpp@23: initialize: nvtrackerbin tracker

2024-03-14 14:41:10.795 INFO  ./extensions/nvdsbase/tee.hpp@23: initialize: tee tee_1

2024-03-14 14:41:10.796 INFO  extensions/nvdsinference/nvinferbin.hpp@24: initialize: nvinferbin object_detector

2024-03-14 14:41:10.796 INFO  extensions/nvdsinference/nvinferbin.hpp@24: initialize: nvinferbin car_color_classifier

2024-03-14 14:41:10.796 INFO  extensions/nvdsinference/nvinferbin.hpp@24: initialize: nvinferbin car_make_classifier

2024-03-14 14:41:10.796 INFO  extensions/nvdsinference/nvinferbin.hpp@24: initialize: nvinferbin vehicle_type_classifier

2024-03-14 14:41:10.796 INFO  extensions/nvdsbase/nvdsbuffersyncbin.hpp@23: initialize: nvdsbuffersyncbin buffer_sync

2024-03-14 14:41:10.797 INFO  extensions/nvdsvisualization/nvtilerbin.hpp@37: initialize: nvtilerbin tiler

2024-03-14 14:41:10.797 INFO  extensions/nvdsvisualization/nvosdbin.hpp@24: initialize: nvosdbin onscreen_display

2024-03-14 14:41:10.797 INFO  extensions/nvdsoutputsink/nvvideorenderersinkbin.hpp@24: initialize: nvvideorenderersinkbin video_renderer

2024-03-14 14:41:10.797 WARN  extensions/nvdsinferenceutils/kitti_dump.cpp@32: [NvDsKittiDump: kitti_dump_detector] /tmp/detector-kitti-out is not accessible. Disabling kitti dump.
2024-03-14 14:41:10.797 WARN  extensions/nvdsinferenceutils/kitti_dump.cpp@32: [NvDsKittiDump: kitti_dump_tracker] /tmp/tracker-kitti-out is not accessible. Disabling kitti dump.
2024-03-14 14:41:10.798 INFO  ./extensions/nvdsbase/tee.hpp@23: initialize: tee tee_2

[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
[mosq_mqtt_log_callback] Client uniqueID sending SUBSCRIBE (Mid: 1, Topic: test-topic-sr, QoS: 0, Options: 0x00)
2024-03-14 14:41:10.800 INFO  gxf/gxe/gxe.cpp@266: Running...
2024-03-14 14:41:10.800 INFO  extensions/nvdsbase/nvds_scheduler.cpp@121: Scheduling 13 elements and 14 components
2024-03-14 14:41:10.800 INFO  extensions/nvdssource/multi_uri_src_bin.cpp@343: create_element: NvDsMultiSrcInput multiple_source_input

2024-03-14 14:41:10.800 INFO  extensions/nvdssource/multi_uri_src_bin.cpp@389: bin_add: bin multiple_source_input

2024-03-14 14:41:10.801 INFO  extensions/nvdsmuxdemux/nvstreammux.hpp@37: create_element: nvstreammux streammux

2024-03-14 14:41:10.825 INFO  extensions/nvdsmuxdemux/nvstreammux.hpp@61: bin_add: nvstreammux streammux

2024-03-14 14:41:10.825 INFO  extensions/nvdstracker/nvtrackerbin.hpp@31: create_element: nvtrackerbin tracker

2024-03-14 14:41:10.867 INFO  extensions/nvdstracker/nvtrackerbin.hpp@55: bin_add: nvtrackerbin tracker

2024-03-14 14:41:10.867 INFO  ./extensions/nvdsbase/tee.hpp@31: create_element: tee tee_1

2024-03-14 14:41:10.867 INFO  ./extensions/nvdsbase/tee.hpp@55: bin_add: tee tee_1

2024-03-14 14:41:10.867 INFO  extensions/nvdsinference/nvinferbin.hpp@32: create_element: nvinferbin object_detector

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@56: bin_add: nvinferbin object_detector

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@32: create_element: nvinferbin car_color_classifier

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@56: bin_add: nvinferbin car_color_classifier

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@32: create_element: nvinferbin car_make_classifier

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@56: bin_add: nvinferbin car_make_classifier

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@32: create_element: nvinferbin vehicle_type_classifier

2024-03-14 14:41:10.868 INFO  extensions/nvdsinference/nvinferbin.hpp@56: bin_add: nvinferbin vehicle_type_classifier

2024-03-14 14:41:10.869 INFO  extensions/nvdsbase/nvdsbuffersyncbin.hpp@31: create_element: nvdsbuffersyncbin buffer_sync

2024-03-14 14:41:10.869 INFO  extensions/nvdsbase/nvdsbuffersyncbin.hpp@61: bin_add: nvdsbuffersyncbin buffer_sync

2024-03-14 14:41:10.869 INFO  extensions/nvdsvisualization/nvtilerbin.hpp@45: create_element: nvtilerbin tiler

2024-03-14 14:41:10.869 INFO  extensions/nvdsvisualization/nvtilerbin.hpp@69: bin_add: nvtilerbin tiler

2024-03-14 14:41:10.869 INFO  extensions/nvdsvisualization/nvosdbin.hpp@32: create_element: nvosdbin onscreen_display

2024-03-14 14:41:10.870 INFO  extensions/nvdsvisualization/nvosdbin.hpp@56: bin_add: nvosdbin onscreen_display

2024-03-14 14:41:10.870 INFO  extensions/nvdsoutputsink/nvvideorenderersinkbin.hpp@32: create_element: nvvideorenderersinkbin video_renderer

2024-03-14 14:41:10.870 INFO  extensions/nvdsoutputsink/nvvideorenderersinkbin.hpp@54: bin_add: nvvideorenderersinkbin video_renderer

2024-03-14 14:41:10.870 INFO  ./extensions/nvdsbase/tee.hpp@31: create_element: tee tee_2

2024-03-14 14:41:10.870 INFO  ./extensions/nvdsbase/tee.hpp@55: bin_add: tee tee_2

Runtime keyboard controls for tiler:
z<row-idx><col-idx> : Expand source at row-idx,col-idx in the tile.
z : Go back to the tiled view when in single source mode.
Text disabled. Use keyboard/mouse commands to toggle source expand and toggle text display.
0:00:05.267570224 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 4]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1988> [UID = 4]: deserialized trt engine from :/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/sec.vehicletypes.resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       
1   OUTPUT kFLOAT predictions/Softmax 6x1x1           

0:00:05.344560449 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 4]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2091> [UID = 4]: Use deserialized engine model: /tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/sec.vehicletypes.resnet18.caffemodel_b16_gpu0_int8.engine
0:00:05.347250004 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<nvinfer_bin_nvinfer> [UID 4]: Load new model:/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/config_infer_secondary_vehicletypes.txt sucessfully
0:00:10.137090629 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 5]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1988> [UID = 5]: deserialized trt engine from :/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/sec.carcolor.resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       
1   OUTPUT kFLOAT predictions/Softmax 12x1x1          

0:00:10.230174166 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 5]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2091> [UID = 5]: Use deserialized engine model: /tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/sec.carcolor.resnet18.caffemodel_b16_gpu0_int8.engine
0:00:10.232881887 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<nvinfer_bin_nvinfer> [UID 5]: Load new model:/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/config_infer_secondary_carcolor.txt sucessfully
0:00:15.053377932 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 6]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1988> [UID = 6]: deserialized trt engine from :/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/sec.carmake.resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       
1   OUTPUT kFLOAT predictions/Softmax 20x1x1          

0:00:15.175338778 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 6]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2091> [UID = 6]: Use deserialized engine model: /tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/sec.carmake.resnet18.caffemodel_b16_gpu0_int8.engine
0:00:15.178262958 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<nvinfer_bin_nvinfer> [UID 6]: Load new model:/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/config_infer_secondary_carmake.txt sucessfully
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
0:00:20.005549058 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1988> [UID = 1]: deserialized trt engine from :/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/primary.resnet10.caffemodel_b4_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:20.086332471 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<nvinfer_bin_nvinfer> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2091> [UID = 1]: Use deserialized engine model: /tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/primary.resnet10.caffemodel_b4_gpu0_int8.engine
0:00:20.087023938 66640 0x7fd0f0cd7ad0 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<nvinfer_bin_nvinfer> [UID 1]: Load new model:/tmp/ds.mqtt_deepstream-test5/deepstream/sample_models/config_infer_primary.txt sucessfully
NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.
Running...
****** NvDsScheduler Runtime Keyboard controls:
p: Pause pipeline
r: Resume pipeline
q: Quit pipeline

**PERF:  
**PERF:  
2024-03-14 14:41:30.874 INFO  extensions/nvdsbase/nvds_scheduler.cpp@398: NvDsScheduler Pipeline ready

2024-03-14 14:41:30.875 INFO  extensions/nvdsbase/nvds_scheduler.cpp@383: NvDsScheduler Pipeline running

**PERF:  

**PERF:  FPS 0 (Avg)	FPS 3 (Avg)	
**PERF:  31.85 (31.85)	32.16 (32.16)	
**PERF:  30.00 (30.56)	30.00 (30.66)	
**PERF:  30.00 (30.33)	30.00 (30.39)
  • I’ve also tried DeepStream 6.4 (with Ubuntu 22.04, GStreamer 1.20.3, NVIDIA driver 535, CUDA 12.2, TensorRT 8.6.1.6, Mosquitto 2.0.15, reference_graph-6.4, Graph Composer 3.1.0), but the issue remained the same.

Have you tried the deepstream-test5 C/C++ sample? Can the deepstream-test5 C/C++ sample enable smart recording via the MQTT message?

Yes, I tried that too, but I found more issues. Here are the details:

  • I created an RTSP stream from files using the following command:
cvlc --loop sample_push.mov ":sout=#gather:rtp{sdp=rtsp://:8654/push}" :network-caching=1500 :sout-all :sout-keep
VLC media player 3.0.9.2 Vetinari (revision 3.0.9.2-0-gd4c1aefe4d)
[00005647d8c30620] dummy interface: using the dummy interface module...
  • I confirmed that the URI is valid by playing the stream with VLC.

  • Then I updated the URI in the sources_rtsp.csv file, attached here:
    sources_rtsp.txt (127 Bytes)

  • Then I ran the deepstream-test5 C/C++ app, and I am seeing two issues:

  1. “No supported stream was found”
  2. Client doesn’t even send SUBSCRIBE (it only sends CONNECT)

The full terminal output is attached below:

(base) HP-Z4-G4-Workstation:/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs$ ../deepstream-test5-app -c test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.yml 
[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine open error
0:00:05.208506407 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 6]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:00:05.311209208 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 6]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:00:05.311238867 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 6]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine opened error
0:00:35.421360362 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 6]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       
1   OUTPUT kFLOAT predictions/Softmax 20x1x1          

0:00:35.505874719 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<secondary_gie_2> [UID 6]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_secondary_carmake.yml sucessfully
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine open error
0:00:40.229526916 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 5]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:00:40.310976031 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 5]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:00:40.310994256 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 5]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine opened error
0:01:07.867744231 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 5]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       
1   OUTPUT kFLOAT predictions/Softmax 12x1x1          

0:01:07.954794917 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<secondary_gie_1> [UID 5]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_secondary_carcolor.yml sucessfully
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine open error
0:01:12.655908462 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 4]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:01:12.737114996 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 4]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:01:12.737132134 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 4]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine opened error
0:01:39.806730692 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 4]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input_1         3x224x224       
1   OUTPUT kFLOAT predictions/Softmax 6x1x1           

0:01:39.892446484 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<secondary_gie_0> [UID 4]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_secondary_vehicletypes.yml sucessfully
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine open error
0:01:44.589804269 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed
0:01:44.670700255 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed, try rebuild
0:01:44.670718598 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine opened error
0:02:11.431901259 2545536 0x55f0bc82d860 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:02:11.516702783 2545536 0x55f0bc82d860 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_primary.yml sucessfully

Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.

Active sources : 0

**PERF:  FPS 0 (Avg)	FPS 1 (Avg)	
Fri Mar 15 12:23:02 2024
**PERF:  0.00 (0.00)	0.00 (0.00)	
** INFO: <bus_callback:239>: Pipeline ready

ERROR from src_elem0: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0
** INFO: <reset_source_pipeline:1706>: Resetting source 0
ERROR from src_elem1: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin1/GstRTSPSrc:src_elem1
** INFO: <reset_source_pipeline:1706>: Resetting source 1
ERROR from src_elem1: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin1/GstRTSPSrc:src_elem1
ERROR from src_elem0: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0
Active sources : 0
Fri Mar 15 12:23:07 2024
**PERF:  0.00 (0.00)	0.00 (0.00)	
q
Quitting
nvstreammux: Successfully handled EOS for source_id=0
nvstreammux: Successfully handled EOS for source_id=1
[NvMultiObjectTracker] De-initialized
[mosq_mqtt_log_callback] Client uniqueID sending DISCONNECT
mqtt disconnected
App run successful
  • I also tried a different config file, test5_config_file_src_infer.yml. It uses .mp4 files as input so I can focus on the MQTT side, but I am seeing the same issue:
  1. Client doesn’t even send SUBSCRIBE (it only sends CONNECT)

Again, the full terminal output is attached below:

(base) HP-Z4-G4-Workstation:/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs$ ../deepstream-test5-app -c test5_config_file_src_infer.yml 
[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine open error
0:00:05.243597994 2701625 0x559fd5c58b30 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine failed
0:00:05.321569142 2701625 0x559fd5c58b30 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine failed, try rebuild
0:00:05.321587410 2701625 0x559fd5c58b30 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine opened error
0:00:38.411060756 2701625 0x559fd5c58b30 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:38.495829200 2701625 0x559fd5c58b30 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_primary.yml sucessfully

Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.

Active sources : 0

**PERF:  FPS 0 (Avg)	FPS 1 (Avg)	FPS 2 (Avg)	FPS 3 (Avg)	
Fri Mar 15 12:40:00 2024
**PERF:  0.00 (0.00)	0.00 (0.00)	0.00 (0.00)	0.00 (0.00)	
** INFO: <bus_callback:239>: Pipeline ready

** INFO: <bus_callback:225>: Pipeline running

WARNING; playback mode used with URI [file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/sample_1080p_h264.mp4] not conforming to timestamp format; check README; using system-time
WARNING; playback mode used with URI [file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/sample_1080p_h264.mp4] not conforming to timestamp format; check README; using system-time
WARNING; playback mode used with URI [file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/sample_1080p_h264.mp4] not conforming to timestamp format; check README; using system-time
[mosq_mqtt_log_callback] Client uniqueID sending PUBLISH (d0, q0, r0, m1, 'test-topic', ... (1625 bytes))
[mosq_mqtt_log_callback] Client uniqueID sending PUBLISH (d0, q0, r0, m2, 'test-topic', ... (1624 bytes))
[mosq_mqtt_log_callback] Client uniqueID sending PUBLISH (d0, q0, r0, m3, 'test-topic', ... (1624 bytes))
...
Publish callback with reason code: Success.
Publish callback with reason code: Success.
Publish callback with reason code: Success.
[NvMultiObjectTracker] De-initialized
[mosq_mqtt_log_callback] Client uniqueID sending DISCONNECT
mqtt disconnected
App run successful

What I changed in the config files:

  • sink1: msg-broker-proto-lib, msg-broker-conn-str
  • message-consumer0: proto-lib, conn-str, config-file

Please find the modified config files attached here:
test5_config_file_src_infer.txt (6.6 KB)
test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt (8.4 KB)

I also attached the Wireshark log captured during my test with test5_config_file_src_infer.yml. It confirms that the deepstream-test5 client didn’t send SUBSCRIBE and that test-topic-sr was not received by any client.
dstest5_c_start_while_publisher_py_run.txt (835.8 KB)

For the VLC issue, please refer to: windows - Rtsp h264 missing plugin - Stack Overflow

For the C/C++ deepstream-test5, subscription is configured by the message-consumer group: DeepStream Reference Application - deepstream-app — DeepStream documentation 6.4 documentation. Please enable the message-consumer group.
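For reference, a message-consumer group in the test5 YAML config looks roughly like the sketch below (the library path, connection string, and topic are placeholders to adapt to your setup; see the linked documentation for the full key list):

message-consumer0:
  enable: 1
  proto-lib: /opt/nvidia/deepstream/deepstream/lib/libnvds_mqtt_proto.so
  conn-str: 127.0.0.1;1883
  subscribe-topic-list: test-topic-sr
  # config-file is optional; point it to an adapter config file if you use one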

And the smart recording instructions are here: Smart Video Record — DeepStream documentation 6.4 documentation

The GXF extension is nvidia::deepstream::NvDsMsgRelayReceiver

Thank you again for your reply.

Regarding the VLC issue, I ran
GST_DEBUG=3 gst-launch-1.0 rtspsrc location="rtsp://127.0.0.1:8654/push" latency=100 ! rtph264depay ! avdec_h264 ! autovideosink
and I didn’t encounter any issues. I can view the streaming video just fine, though it prints a few warnings. Here is the complete output:

HP-Z4-G4-Workstation:~$ GST_DEBUG=3 gst-launch-1.0 rtspsrc location="rtsp://127.0.0.1:8654/push" latency=100 ! rtph264depay ! avdec_h264 ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8654/push
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
0:00:00.189106549 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc0> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.189171318 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc0> have udp buffer of 212992 bytes while 524288 were requested
0:00:00.189619114 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc1> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.189668133 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc1> have udp buffer of 212992 bytes while 524288 were requested
Progress: (request) SETUP stream 0
0:00:00.214864589 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc3> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.214933911 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc3> have udp buffer of 212992 bytes while 524288 were requested
0:00:00.215300105 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc4> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.215345150 3200740 0x563f4613cde0 WARN                  udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc4> have udp buffer of 212992 bytes while 524288 were requested
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
0:00:00.254320561 3200740 0x7f79b401a4c0 FIXME                default gstutils.c:3980:gst_pad_create_stream_id_internal:<fakesrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:00.254449462 3200740 0x7f79b401a520 FIXME                default gstutils.c:3980:gst_pad_create_stream_id_internal:<fakesrc1:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
Progress: (request) Sent PLAY request
0:00:00.297008039 3200740 0x7f7998020640 FIXME               basesink gstbasesink.c:3246:gst_base_sink_default_event:<autovideosink0-actual-sink-xvimage> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:00.325694089 3200740 0x7f79ac003300 WARN                 basesrc gstbasesrc.c:3072:gst_base_src_loop:<udpsrc1> error: Internal data stream error.
0:00:00.325724480 3200740 0x7f79ac003300 WARN                 basesrc gstbasesrc.c:3072:gst_base_src_loop:<udpsrc1> error: streaming stopped, reason not-linked (-1)
Redistribute latency...
0:00:02.405143406 3200740 0x563f4613cc60 WARN             xvimagesink xvimagesink.c:554:gst_xv_image_sink_handle_xevents:<autovideosink0-actual-sink-xvimage> error: Output window was closed
ERROR: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage: Output window was closed
Additional debug info:
xvimagesink.c(554): gst_xv_image_sink_handle_xevents (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage
Execution ended after 0:00:02.151498101
Setting pipeline to NULL ...
0:00:02.425435053 3200740 0x563f4613cde0 WARN                 rtspsrc gstrtspsrc.c:6326:gst_rtsp_src_receive_response:<rtspsrc0> receive interrupted
0:00:02.425448119 3200740 0x563f4613cde0 WARN                 rtspsrc gstrtspsrc.c:6424:gst_rtspsrc_try_send:<rtspsrc0> receive interrupted
0:00:02.425454491 3200740 0x563f4613cde0 WARN                 rtspsrc gstrtspsrc.c:8672:gst_rtspsrc_pause:<rtspsrc0> PAUSE interrupted
0:00:02.426186122 3200740 0x563f4613cde0 WARN                 rtspsrc gstrtspsrc.c:6579:gst_rtspsrc_send:<rtspsrc0> got NOT IMPLEMENTED, disable method TEARDOWN
Freeing pipeline ...

I have decided not to investigate the VLC issue further. Instead, I created a DeepStream pipeline that takes a v4l2 camera as input and outputs an RTSP stream. Now I can view the stream in the deepstream-test5 C/C++ application without any problems.

Regarding the second and third points, I have made the necessary corrections. You can find my updated configuration files here:
sources_rtsp.txt (346 Bytes)
test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt (8.4 KB)

However, the issue of test-topic-sr not being received by the client still persists. This is evident in the terminal output (pasted below) and can also be seen in Wireshark. It confirms that the issue is not limited to Graph Composer; it is also present in the C/C++ application.

HP-Z4-G4-Workstation:/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt$ ./deepstream-test5-app -c configs/test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.yml -p 0
[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
[mosq_mqtt_log_callback] Client uniqueID sending SUBSCRIBE (Mid: 1, Topic: test-topic-sr, QoS: 0, Options: 0x00)
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine open error
0:00:05.302075361 588304 0x562700487360 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed
0:00:05.379853760 588304 0x562700487360 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed, try rebuild
0:00:05.380906022 588304 0x562700487360 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine opened error
0:00:33.234346868 588304 0x562700487360 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:33.318196974 588304 0x562700487360 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_primary.yml sucessfully

Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.

Active sources : 0

**PERF:  FPS 0 (Avg)	FPS 1 (Avg)	
Thu Mar 21 15:26:29 2024
**PERF:  0.00 (0.00)	0.00 (0.00)	
** INFO: <bus_callback:239>: Pipeline ready

** INFO: <bus_callback:225>: Pipeline running

Active sources : 2
Thu Mar 21 15:26:34 2024
**PERF:  18.91 (18.91)	18.91 (18.91)	
Active sources : 2
Thu Mar 21 15:26:39 2024
**PERF:  16.65 (17.56)	16.64 (17.45)	
Active sources : 2
Thu Mar 21 15:26:44 2024
**PERF:  16.62 (17.21)	16.63 (17.21)	
Active sources : 2
Thu Mar 21 15:26:49 2024

Next, I will attempt to use NvDsMsgRelayReceiver to address this issue.

Have you tried running mosquitto_sub in the same environment as the deepstream-test5-app to make sure the subscription can be done in that environment?

Thank you for your comments. I am able to subscribe to test-topic-sr using the command mosquitto_sub -t test-topic-sr (I also use MQTT Explorer to monitor the published topics).

However, the C/C++ app running at the same time doesn’t log any received MQTT message in the terminal:

HP-Z4-G4-Workstation:/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5_mqtt$ ./deepstream-test5-app -c configs/test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.yml -p 0
[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
[mosq_mqtt_log_callback] Client uniqueID sending SUBSCRIBE (Mid: 1, Topic: test-topic-sr, QoS: 0, Options: 0x00)
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine open error
0:00:05.276137687 956578 0x55c04e1ba160 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed
0:00:05.395615760 956578 0x55c04e1ba160 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed, try rebuild
0:00:05.395645090 956578 0x55c04e1ba160 INFO                 nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine opened error
0:00:33.337244982 956578 0x55c04e1ba160 WARN                 nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:33.423730379 956578 0x55c04e1ba160 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_primary.yml sucessfully

Runtime commands:
	h: Print this help
	q: Quit

	p: Pause
	r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
      To go back to the tiled display, right-click anywhere on the window.

Active sources : 0

**PERF:  FPS 0 (Avg)	FPS 1 (Avg)	
Mon Mar 25 11:32:22 2024
**PERF:  0.00 (0.00)	0.00 (0.00)	
** INFO: <bus_callback:239>: Pipeline ready

** INFO: <bus_callback:225>: Pipeline running

Active sources : 2
Mon Mar 25 11:32:27 2024
**PERF:  19.51 (19.33)	19.51 (19.33)	
Active sources : 2
Mon Mar 25 11:32:32 2024
**PERF:  16.65 (17.60)	16.65 (17.60)	
Active sources : 2
Mon Mar 25 11:32:37 2024
**PERF:  16.49 (17.21)	16.77 (17.21)

Currently, cloud message subscription is supported with Kafka only: Smart Video Record — DeepStream documentation 6.4 documentation

Can you change to the Kafka cloud service?

The MQTT protocol is currently used as a standard in our system. As a temporary solution, I can convert each MQTT topic to a Kafka topic (for example, with a small bridge like the one sketched below). I am more interested in knowing whether there are any plans to extend cloud message subscription to include MQTT in the future, given that MQTT is already supported for certain other applications. Thank you.
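For instance, a small bridge along these lines could do the conversion (a rough sketch, assuming paho-mqtt 1.x and kafka-python with both brokers on localhost; each MQTT topic name is reused as the Kafka topic name):

# Minimal MQTT-to-Kafka bridge sketch.
import paho.mqtt.client as mqtt
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")

def on_message(client, userdata, msg):
    # Forward the MQTT payload verbatim (bytes) to a Kafka topic of the same name.
    producer.send(msg.topic, msg.payload)
    producer.flush()

client = mqtt.Client()
client.on_message = on_message
client.connect("127.0.0.1", 1883)
client.subscribe("test-topic-sr")
client.loop_forever()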

We have added MQTT subscribe message support to our roadmap.