JenHao
March 31, 2023, 4:07am
1
• Hardware Platform (GPU)
• DeepStream Version 6.2
• Issue Type (bugs)
I am trying to add Gst-nvmsgbroker into deepstream_reference_apps/runtime_source_add_delete.
Here is the difference between runtime_source_add_delete (tag 6.2) and my code: Diff between origin code
However, this code does not work properly.
If I remove the msgbroker from gst_bin_add_many, everything works fine again: Remove msgbroker
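For context, the branch I am adding follows the deepstream-test4 pattern. Below is a minimal sketch, not the exact diff; the tee element, config path, proto-lib path, and conn-str are placeholders.
GstElement *queue_msg = gst_element_factory_make ("queue", "msg-queue");
GstElement *msgconv = gst_element_factory_make ("nvmsgconv", "nvmsg-converter");
GstElement *msgbroker = gst_element_factory_make ("nvmsgbroker", "nvmsg-broker");
/* msgconv builds the payload from metadata; msgbroker sends it over AMQP */
g_object_set (G_OBJECT (msgconv),
              "config", "dstest4_msgconv_config.txt",
              "payload-type", 0, NULL);
g_object_set (G_OBJECT (msgbroker),
              "proto-lib", "/opt/nvidia/deepstream/deepstream/lib/libnvds_amqp_proto.so",
              "conn-str", "localhost;5672;guest",   /* host;port;user for AMQP */
              "topic", "topicname", NULL);
gst_bin_add_many (GST_BIN (pipeline), queue_msg, msgconv, msgbroker, NULL);
/* the branch hangs off a tee placed after nvdsosd; gst_element_link will
 * request a src pad from the tee */
gst_element_link_many (tee, queue_msg, msgconv, msgbroker, NULL);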
JenHao
March 31, 2023, 4:16am
2
I use RTSP as my source and AMQP as my broker protocol. Below is the command I use to run my program.
./deepstream-test-rt-src-add-del rtsp://192.168.8.53/live1.sdp 0 filesink 1
I have RabbitMQ installed on my local machine and the service is running.
● rabbitmq-server.service - RabbitMQ Messaging Server
Loaded: loaded (/lib/systemd/system/rabbitmq-server.service; enabled; vendor preset: enabled)
Active: active (running) since Thu 2023-03-30 13:50:09 CST; 22h ago
Main PID: 4079924 (beam.smp)
Status: "Initialized"
Tasks: 163 (limit: 38230)
Memory: 92.9M
CGroup: /system.slice/rabbitmq-server.service
├─4079920 /bin/sh /usr/sbin/rabbitmq-server
├─4079924 /usr/lib/erlang/erts-10.6.4/bin/beam.smp -W w -A 128 -MBas ageffcbf -MHas ageffcbf -MBlmbcs 512 -MHlmbcs 512 -MMmcs 30 -P 10>
├─4080284 erl_child_setup 65536
├─4080382 inet_gethost 4
└─4080383 inet_gethost 4
Below is my config for msgconv and msgbroker. Everything else remains the same as runtime_source_add_delete.
dstest4_config.yml
dstest4_msgconv_config.txt
fanzh
March 31, 2023, 7:09am
4
Can you share the error log? nvmsgconv and nvmsgbroker are open source, so you might add logs to check whether the data was generated and sent.
JenHao
March 31, 2023, 7:22am
5
Here is the full log. When reproducing this issue, the generated video has no tiled window after the msgbroker is added via gst_bin_add_many.
log.txt
creating uridecodebin for [rtsp://192.168.8.53/live1.sdp]
Unknown key 'enable-past-frame' for group [tracker]
Unknown key 'display-tracking-id' for group [tracker]
0:00:02.298625711 2664946 0x55dfc91f1ea0 WARN nvinfer gstnvinfer.cpp:677:gst_nvinfer_logger:<secondary-nvinference-engine3> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1897> [UID = 4]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.2/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine failed
0:00:02.369860234 2664946 0x55dfc91f1ea0 WARN nvinfer gstnvinfer.cpp:677:gst_nvinfer_logger:<secondary-nvinference-engine3> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2002> [UID = 4]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.2/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine failed, try rebuild
0:00:02.370909695 2664946 0x55dfc91f1ea0 INFO nvinfer gstnvinfer.cpp:680:gst_nvinfer_logger:<secondary-nvinference-engine3> NvDsInferContext[UID 4]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1923> [UID = 4]: Trying to create engine from model files
WARNING: [TRT]: CUDA lazy loading is not enabled. Enabling it can significantly reduce device memory usage. See `CUDA_MODULE_LOADING` in https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#env-vars
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1487 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.2/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine open error
WARNING: [TRT]: CUDA lazy loading is not enabled. Enabling it can significantly reduce device memory usage. See `CUDA_MODULE_LOADING` in https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#env-vars
WARNING: [TRT]: The implicit batch dimension mode has been deprecated. Please create the network with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag whenever possible.
Warning: Flatten layer ignored. TensorRT implicitly flattens input to FullyConnected layers, but in other circumstances this will result in undefined behavior.
0:00:24.973395652 2664946 0x55dfc91f1ea0 WARN nvinfer gstnvinfer.cpp:677:gst_nvinfer_logger:<secondary-nvinference-engine3> NvDsInferContext[UID 4]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1950> [UID = 4]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-6.2/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine
(log.txt truncated)
JenHao
March 31, 2023, 7:29am
6
I have made some progress on this issue. After I added the following setup, the pipeline works again. The key point is "async", FALSE:
g_object_set (G_OBJECT (msgbroker),
              "topic", "topicname",
              "sync", TRUE, "async", FALSE,
              "new-api", 1, NULL);
However, I still want to know whether this is a bug or not. The documentation does not mention this at all.
nvmsgconv and nvmsgbroker are open source, so you might add logs to check whether the data was generated and sent.
I did try to debug. However, the program seems to hang inside the AMQP broker protocol library, so I have no way to debug further.
fanzh
March 31, 2023, 9:31am
7
Do you mean your code works fine now? nvmsgbroker is a subclass of GstBaseSink, so it has the async property; please refer to GStreamer's explanation of async.
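For example, you can confirm the property is inherited and inspect its current value; a minimal sketch, where msgbroker is your nvmsgbroker element:
/* "sync" and "async" come from GstBaseSink, so nvmsgbroker exposes them too */
gboolean sync_val = FALSE, async_val = TRUE;
g_object_get (G_OBJECT (msgbroker), "sync", &sync_val, "async", &async_val, NULL);
g_print ("nvmsgbroker sync=%d async=%d\n", sync_val, async_val);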
Can you use the nvmsgbroker sample deepstream-test4 to reproduce this issue? You only need to modify the configuration file.
JenHao
April 6, 2023, 2:54am
8
Do you mean your code works fine now?
No, when I add nvmsgbroker into gst_element_link_many, the pipeline breaks again.
Can you use the nvmsgbroker sample deepstream-test4 to reproduce this issue? You only need to modify the configuration file.
If the source is a video file, it works properly. However, this example did not originally support an RTSP source, so I need some time to modify the code.
JenHao
April 6, 2023, 6:48am
9
I modified deepstream-test4 for an RTSP source; the following is the diff against the original code.
Diff between origin deepstream-test4
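The gist of the change is to replace the filesrc / h264parse / decoder chain with a uridecodebin and link it to nvstreammux from a pad-added callback. A minimal sketch of the approach, not the exact diff; cb_newpad and the sink_0 pad name are only illustrative:
static void
cb_newpad (GstElement *decodebin, GstPad *pad, gpointer data)
{
  GstElement *streammux = (GstElement *) data;
  GstCaps *caps = gst_pad_get_current_caps (pad);
  const gchar *name = gst_structure_get_name (gst_caps_get_structure (caps, 0));
  /* RTSP sources may also expose audio pads; only link the video one */
  if (g_str_has_prefix (name, "video")) {
    GstPad *sinkpad = gst_element_get_request_pad (streammux, "sink_0");
    if (gst_pad_link (pad, sinkpad) != GST_PAD_LINK_OK)
      g_printerr ("Failed to link uridecodebin to nvstreammux\n");
    gst_object_unref (sinkpad);
  }
  gst_caps_unref (caps);
}
/* in main(), instead of filesrc + h264parse + nvv4l2decoder: */
GstElement *source = gst_element_factory_make ("uridecodebin", "uri-decode-bin");
g_object_set (G_OBJECT (source), "uri", "rtsp://192.168.8.53/live1.sdp", NULL);
g_signal_connect (source, "pad-added", G_CALLBACK (cb_newpad), streammux);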
The pipeline did not get stuck.
Here is the config file.
I use the following command to run the app:
./deepstream-test4-app dstest4_config.yml --no-display
fanzh
April 6, 2023, 8:42am
10
Thanks for your update. You can compare the media pipelines; here is the method for dumping the pipeline: dump pipeline.
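That is, set GST_DEBUG_DUMP_DOT_DIR in the environment before starting the app, then call the dump macro once the pipeline is playing; a minimal sketch, where the "pipeline" file name is arbitrary:
/* writes <GST_DEBUG_DUMP_DOT_DIR>/pipeline.dot describing the full graph */
GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "pipeline");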
system
Closed
April 20, 2023, 8:43am
11
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.