Yeah, I tried that too, but I found more issues. Here are the details:
- I created RTSP streams from files using commands like the following:
cvlc --loop sample_push.mov ":sout=#gather:rtp{sdp=rtsp://:8654/push}" :network-caching=1500 :sout-all :sout-keep
VLC media player 3.0.9.2 Vetinari (revision 3.0.9.2-0-gd4c1aefe4d)
[00005647d8c30620] dummy interface: using the dummy interface module...
- I confirmed that the URI is valid by playing the stream with VLC.
- Then I updated the URI in the sources_rtsp.csv file, attached here:
sources_rtsp.txt (127 Bytes)
- Then I ran the deepstream-test5 C/C++ app and I am seeing two issues:
- “No supported stream was found” errors for both RTSP sources (see the GStreamer cross-check after this list)
- The client doesn’t even send SUBSCRIBE (it only sends CONNECT)
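Regarding the first issue: deepstream-test5 pulls the RTSP streams through GStreamer’s rtspsrc (GstRTSPSrc src_elem0/src_elem1 in the log below) rather than through VLC, so the URI can additionally be probed with the standard GStreamer tools; the host below only mirrors the cvlc command above and should be replaced with the actual address:
gst-discoverer-1.0 rtsp://127.0.0.1:8654/push
gst-launch-1.0 playbin uri=rtsp://127.0.0.1:8654/push
If these cannot negotiate the stream either, the problem is on the RTSP/GStreamer side rather than in the DeepStream config.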
The full terminal output is attached below:
(base) HP-Z4-G4-Workstation:/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs$ ../deepstream-test5-app -c test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.yml
[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine open error
0:00:05.208506407 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 6]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:00:05.311209208 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 6]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:00:05.311238867 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 6]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine opened error
0:00:35.421360362 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_2> NvDsInferContext[UID 6]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 6]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0 INPUT kFLOAT input_1 3x224x224
1 OUTPUT kFLOAT predictions/Softmax 20x1x1
0:00:35.505874719 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<secondary_gie_2> [UID 6]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_secondary_carmake.yml sucessfully
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine open error
0:00:40.229526916 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 5]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:00:40.310976031 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 5]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:00:40.310994256 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 5]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine opened error
0:01:07.867744231 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_1> NvDsInferContext[UID 5]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 5]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0 INPUT kFLOAT input_1 3x224x224
1 OUTPUT kFLOAT predictions/Softmax 12x1x1
0:01:07.954794917 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<secondary_gie_1> [UID 5]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_secondary_carcolor.yml sucessfully
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine open error
0:01:12.655908462 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 4]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:01:12.737114996 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 4]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:01:12.737132134 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 4]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine opened error
0:01:39.806730692 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 4]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 2
0 INPUT kFLOAT input_1 3x224x224
1 OUTPUT kFLOAT predictions/Softmax 6x1x1
0:01:39.892446484 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<secondary_gie_0> [UID 4]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_secondary_vehicletypes.yml sucessfully
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine open error
0:01:44.589804269 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed
0:01:44.670700255 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine failed, try rebuild
0:01:44.670718598 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine opened error
0:02:11.431901259 2545536 0x55f0bc82d860 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:02:11.516702783 2545536 0x55f0bc82d860 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_primary.yml sucessfully
Runtime commands:
h: Print this help
q: Quit
p: Pause
r: Resume
NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.
Active sources : 0
**PERF: FPS 0 (Avg) FPS 1 (Avg)
Fri Mar 15 12:23:02 2024
**PERF: 0.00 (0.00) 0.00 (0.00)
** INFO: <bus_callback:239>: Pipeline ready
ERROR from src_elem0: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0
** INFO: <reset_source_pipeline:1706>: Resetting source 0
ERROR from src_elem1: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin1/GstRTSPSrc:src_elem1
** INFO: <reset_source_pipeline:1706>: Resetting source 1
ERROR from src_elem1: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin1/GstRTSPSrc:src_elem1
ERROR from src_elem0: No supported stream was found. You might need to allow more transport protocols or may otherwise be missing the right GStreamer RTSP extension plugin.
Debug info: gstrtspsrc.c(7474): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0
Active sources : 0
Fri Mar 15 12:23:07 2024
**PERF: 0.00 (0.00) 0.00 (0.00)
q
Quitting
nvstreammux: Successfully handled EOS for source_id=0
nvstreammux: Successfully handled EOS for source_id=1
[NvMultiObjectTracker] De-initialized
[mosq_mqtt_log_callback] Client uniqueID sending DISCONNECT
mqtt disconnected
App run successful
- I also tried a different config file, test5_config_file_src_infer.yml, which uses .mp4 files as input so that I can focus on the MQTT side. But I am seeing the same issue:
- The client doesn’t even send SUBSCRIBE (it only sends CONNECT)
Again, the full terminal output is attached below:
(base) HP-Z4-G4-Workstation:/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs$ ../deepstream-test5-app -c test5_config_file_src_infer.yml
[mosq_mqtt_log_callback] Client uniqueID sending CONNECT
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_nvmultiobjecttracker.so
[NvMultiObjectTracker] Initialized
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine open error
0:00:05.243597994 2701625 0x559fd5c58b30 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1976> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine failed
0:00:05.321569142 2701625 0x559fd5c58b30 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2081> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.3/sources/apps/sample_apps/deepstream-test5_mqtt/configs/../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine failed, try rebuild
0:00:05.321587410 2701625 0x559fd5c58b30 INFO nvinfer gstnvinfer.cpp:682:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2002> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1459 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine opened error
0:00:38.411060756 2701625 0x559fd5c58b30 WARN nvinfer gstnvinfer.cpp:679:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2029> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.3/samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:38.495829200 2701625 0x559fd5c58b30 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.3/samples/configs/deepstream-app/config_infer_primary.yml sucessfully
Runtime commands:
h: Print this help
q: Quit
p: Pause
r: Resume
NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.
Active sources : 0
**PERF: FPS 0 (Avg) FPS 1 (Avg) FPS 2 (Avg) FPS 3 (Avg)
Fri Mar 15 12:40:00 2024
**PERF: 0.00 (0.00) 0.00 (0.00) 0.00 (0.00) 0.00 (0.00)
** INFO: <bus_callback:239>: Pipeline ready
** INFO: <bus_callback:225>: Pipeline running
WARNING; playback mode used with URI [file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/sample_1080p_h264.mp4] not conforming to timestamp format; check README; using system-time
WARNING; playback mode used with URI [file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/sample_1080p_h264.mp4] not conforming to timestamp format; check README; using system-time
WARNING; playback mode used with URI [file:///opt/nvidia/deepstream/deepstream-6.3/samples/streams/sample_1080p_h264.mp4] not conforming to timestamp format; check README; using system-time
[mosq_mqtt_log_callback] Client uniqueID sending PUBLISH (d0, q0, r0, m1, 'test-topic', ... (1625 bytes))
[mosq_mqtt_log_callback] Client uniqueID sending PUBLISH (d0, q0, r0, m2, 'test-topic', ... (1624 bytes))
[mosq_mqtt_log_callback] Client uniqueID sending PUBLISH (d0, q0, r0, m3, 'test-topic', ... (1624 bytes))
...
Publish callback with reason code: Success.
Publish callback with reason code: Success.
Publish callback with reason code: Success.
[NvMultiObjectTracker] De-initialized
[mosq_mqtt_log_callback] Client uniqueID sending DISCONNECT
mqtt disconnected
App run successful
What I changed in the config files:
- sink1: msg-broker-proto-lib, msg-broker-conn-str
- message-consumer0: proto-lib, conn-str, config-file (a rough sketch of both sections follows the attachments below)
Please find the modified config files attached here:
test5_config_file_src_infer.txt (6.6 KB)
test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt (8.4 KB)
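For quick reference without opening the attachments, the changed sections look roughly like the following; the library path, broker address and the consumer’s config file name are placeholders for my actual values:
sink1:
  enable: 1
  type: 6
  msg-broker-proto-lib: /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_mqtt_proto.so
  msg-broker-conn-str: localhost;1883
  topic: test-topic
message-consumer0:
  enable: 1
  proto-lib: /opt/nvidia/deepstream/deepstream-6.3/lib/libnvds_mqtt_proto.so
  conn-str: localhost;1883
  config-file: cfg_mqtt.txt
  subscribe-topic-list: test-topic-sr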
I also attached the Wireshark capture from my test with test5_config_file_src_infer.yml. It confirms that the deepstream-test5 client never sent SUBSCRIBE and that messages published to test-topic-sr were not received by any client.
dstest5_c_start_while_publisher_py_run.txt (835.8 KB)
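Independently of Wireshark, the missing subscription can also be cross-checked on the broker side: running mosquitto in verbose mode logs every CONNECT, SUBSCRIBE and PUBLISH it receives, and a manual subscriber can listen on the consumer topic (host and port below are placeholders for my local broker):
mosquitto -v -p 1883
mosquitto_sub -h localhost -p 1883 -t test-topic-sr -v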