There are still errors with --conn-str=localhost;5672;guest;guest.
I set --conn-str according to cfg_amqp.txt; is that right?
How should the value of --conn-str be set?
Would you like to give me an example of --conn-str?
Below are the errors:
$ deepstream-mrcnn-app -i sample_720p.h264 -p libnvds_amqp_proto.so -c cfg_amqp.txt --conn-str=localhost;5672;gust;guest -topic=topicname
Now playing: sample_720p.h264
Using winsys: x11
Running...
ERROR from element nvmsg-broker: Could not initialize supporting library.
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.c(359): legacy_gst_nvmsgbroker_start (): /GstPipeline:dsmrcnn-pipeline/GstNvMsgBroker:nvmsg-broker:
unable to open shared library
Returned, stopping playback
Deleting pipeline
bash: 5672: command not found
bash: gust: command not found
bash: guest: command not found
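Both problems in the log trace back to the connection string. Per cfg_amqp.txt the format is hostname;port;username;password, and in bash the semicolons must be quoted, otherwise the shell splits the command at each `;` and tries to run "5672" and "guest" as separate commands — exactly the "command not found" lines above. A minimal sketch:

```shell
# Format documented in cfg_amqp.txt: hostname;port;username;password
CONN_STR="localhost;5672;guest;guest"   # quoted, and note "guest", not "gust"

# Split it on ';' just to show the four fields the adaptor expects:
IFS=';' read -r HOST PORT USER PASS <<< "$CONN_STR"
echo "host=$HOST port=$PORT user=$USER pass=$PASS"
# -> host=localhost port=5672 user=guest pass=guest
```

So the invocation should quote the value, e.g. --conn-str="localhost;5672;guest;guest", and fix the gust/guest typo in the username; also check the topic option spelling against the app's --help output.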
There is still an error with --conn-str="localhost;5672;guest;guest":
unable to open shared library.
Should the host part (before the port 5672) be an IP address like 192.168.1.2?
Would you like to give me more suggestions?
Below is the error:
$ deepstream-mrcnn-app -i sample_720p.h264 -p libnvds_amqp_proto.so -c cfg_amqp.txt --conn-str="localhost;5672;guest;guest" -t "topicname"
Now playing: sample_720p.h264
Using winsys: x11
Running...
ERROR from element nvmsg-broker: Could not initialize supporting library.
Error details: gstnvmsgbroker.c(359): legacy_gst_nvmsgbroker_start (): /GstPipeline:dsmrcnn-pipeline/GstNvMsgBroker:nvmsg-broker:
unable to open shared library
Returned, stopping playback
Deleting pipeline
Make sure the library can be found at the path you specified:
-p libnvds_amqp_proto.so
Also make sure the AMQP server service is running; check this:
sources/libs/amqp_protocol_adaptor/README
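"unable to open shared library" usually means either the adaptor .so itself or one of its runtime dependencies (librabbitmq, for AMQP) cannot be resolved. A quick way to check is ldd; `count_missing` below is a hypothetical helper, and the library path is assumed from a standard DeepStream 5.0 install:

```shell
# Print how many of the library's dependencies fail to resolve;
# any nonzero count explains "unable to open shared library".
count_missing() { ldd "$1" 2>/dev/null | grep -c "not found"; }

# Default DeepStream 5.0 install location (assumed):
count_missing /opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_amqp_proto.so
```

Run the same check from the directory where you launch deepstream-mrcnn-app, since -p can also be a relative path.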
Following the README, I have downloaded rabbitmq-c.
But when I run cmake, there is an error:
could NOT find OpenSSL.
Would you like to help me again?
Below is the error:
djm@Hartai:~/rabbitmq-c/build$ cmake ..
-- The C compiler identification is GNU 7.5.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- CMAKE_BUILD_TYPE not specified. Creating Release build
-- Found C inline keyword: inline
-- Looking for getaddrinfo
-- Looking for getaddrinfo - found
-- Looking for socket
-- Looking for socket - found
-- Looking for htonll
-- Looking for htonll - not found
-- Looking for poll
-- Looking for poll - found
-- Looking for clock_gettime in rt
-- Looking for clock_gettime in rt - found
-- Looking for posix_spawnp in rt
-- Looking for posix_spawnp in rt - found
-- Performing Test HAVE_GNU90
-- Performing Test HAVE_GNU90 - Success
-- Found POPT: /usr/include (found version "1.16")
-- Could NOT find XMLTO (missing: XMLTO_EXECUTABLE)
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE)
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
CMake Error at /usr/share/cmake-3.10/Modules/FindPackageHandleStandardArgs.cmake:137 (message):
  Could NOT find OpenSSL, try to set the path to OpenSSL root folder in the
  system variable OPENSSL_ROOT_DIR (missing: OPENSSL_CRYPTO_LIBRARY
  OPENSSL_INCLUDE_DIR) (Required is at least version "0.9.8")
Call Stack (most recent call first):
  /usr/share/cmake-3.10/Modules/FindPackageHandleStandardArgs.cmake:378 (_FPHSA_FAILURE_MESSAGE)
  /usr/share/cmake-3.10/Modules/FindOpenSSL.cmake:390 (find_package_handle_standard_args)
  CMakeLists.txt:273 (find_package)
-- Configuring incomplete, errors occurred!
See also "/home/djm/rabbitmq-c/build/CMakeFiles/CMakeOutput.log".
See also "/home/djm/rabbitmq-c/build/CMakeFiles/CMakeError.log".
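The missing pieces are OpenSSL's development headers. On the Ubuntu 18.04 toolchain the log suggests (GCC 7.5, CMake 3.10), installing libssl-dev and re-running CMake usually resolves this; OPENSSL_ROOT_DIR (named in the error itself) is only needed for non-standard installs. A sketch, assuming an apt-based system:

```shell
# Install the OpenSSL headers and libraries CMake is looking for:
sudo apt-get update
sudo apt-get install -y libssl-dev

# Re-run the configure step from the build directory:
cd ~/rabbitmq-c/build
cmake ..

# If OpenSSL lives in a non-standard prefix instead:
# cmake .. -DOPENSSL_ROOT_DIR=/path/to/openssl
```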
I get a "mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine" open error
when running deepstream-mrcnn-app.
I am using a TX2 with JetPack 4.4.
Would you like to help me?
Below is the error:
djm@Hartai:/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test$ deepstream-mrcnn-app -i sample-720p.h264 -p libnvds_amqp_proto.so -c cfg_amqp.txt --conn-str="localhost;5672;guest;guest" --topic="topicname"
Now playing: sample-720p.h264
Using winsys: x11
Opening in BLOCKING MODE
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine open error
0:00:01.244412876 10486 0x2274ad30 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine failed
0:00:01.244489804 10486 0x2274ad30 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine failed, try rebuild
0:00:01.244514828 10486 0x2274ad30 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
parseModel: Failed to open TLT encoded model file /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt
ERROR: failed to build network since parsing model errors.
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:01.244985069 10486 0x2274ad30 ERROR nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 1]: build engine file failed
Segmentation fault (core dumped)
I have downloaded the MRCNN model to this path:
/opt/nvidia/deepstream/deepstream-5.0/samples/models/tlt_pretrained_models/mrcnn
There are two files there: mask_rcnn_resnet50.etlt and cal.bin.
I do not have mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine.
After downloading, I tried to run deepstream-mrcnn-app again, and there is still the same error as before,
that is, "mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine open error".
How do I get the engine file?
Would you like to help me again?
Below is the error:
Using winsys: x11
Opening in BLOCKING MODE
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine open error
0:00:03.260159449 12232 0x2fc8db20 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine failed
0:00:03.260260248 12232 0x2fc8db20 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-mrcnn-test/../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine failed, try rebuild
0:00:03.260286936 12232 0x2fc8db20 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
ERROR: [TRT]: UffParser: Validator error: multilevel_propose_rois: Unsupported operation _MultilevelProposeROI_TRT
parseModel: Failed to parse UFF model
ERROR: failed to build network since parsing model errors.
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:04.334584996 12232 0x2fc8db20 ERROR nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 1]: build engine file failed
For the instance segmentation model MaskRCNN, use deepstream_app_source1_mrcnn.txt.
It also requires the TRT plugin built from GitHub - NVIDIA/TensorRT at release/7.0.
Follow the build instructions in that branch's README.
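The plugin build that reply points to can be sketched roughly as below. This is an assumption-laden outline, not a verified recipe: the CMake option names and GPU_ARCHS value (53 is the Nano's SM) are as I recall them from the release/7.0 README, so check that README for your exact platform before running it.

```shell
# Build the TensorRT OSS plugins (release/7.0 for DS 5.0) and replace the
# stock libnvinfer_plugin on the Jetson. Flags assumed; verify in the README.
git clone -b release/7.0 https://github.com/NVIDIA/TensorRT.git
cd TensorRT && git submodule update --init --recursive
mkdir -p build && cd build
cmake .. -DGPU_ARCHS=53 -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ -DTRT_BIN_DIR="$(pwd)/out"
make -j"$(nproc)" nvinfer_plugin
sudo cp out/libnvinfer_plugin.so.7.* /usr/lib/aarch64-linux-gnu/
```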
I followed these instructions and am still receiving the following error:
ERROR: [TRT]: UffParser: Validator error: multilevel_propose_rois: Unsupported operation _MultilevelProposeROI_TRT
parseModel: Failed to parse UFF model
ERROR: failed to build network since parsing model errors.
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
I am on a Jetson Nano using L4T 32.4.3 and DS 5.0.
Previously I had /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3 installed and that didn't work. So, per the above instructions, I ran the following commands:
I am running the "deepstream-mrcnn-test" example.
I think "dsmrcnn_pgie_config.txt" is the config used in that example.
In "dsmrcnn_pgie_config.txt",
the "model-engine-file" is set to "mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine",
but the error says "mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine" was not found.
How do I get or download "mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine"?
Should I change libnvinfer_plugin.so from 7.1.3 to 7.0.0.1?
Below is a part of "dsmrcnn_pgie_config.txt":
[property]
net-scale-factor=0.017507
offsets=123.675;116.280;103.53
model-color-format=0
labelfile-path=../../../../samples/configs/tlt_pretrained_models/mrcnn_labels.txt
tlt-encoded-model=../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt
tlt-model-key=nvidia_tlt
model-engine-file=../../../../samples/models/tlt_pretrained_models/mrcnn/mask_rcnn_resnet50.etlt_b1_gpu0_int8.engine
int8-calib-file=../../models/tlt_pretrained_models/mrcnn/cal.bin
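On "how to get the engine file": nvinfer builds the .engine from tlt-encoded-model on the first run and caches it, which is why the log says "try rebuild" and "Trying to create engine from model files". The missing-engine message alone is only a warning; the fatal part is the build failure that follows it. What must exist on disk are the .etlt and cal.bin at the paths the config points to. A quick check, with the install path assumed from the posts above:

```shell
# Only the .etlt and cal.bin need to be downloaded; the .engine is
# generated by nvinfer at first run. Verify the inputs are in place:
MODEL_DIR=/opt/nvidia/deepstream/deepstream-5.0/samples/models/tlt_pretrained_models/mrcnn
for f in mask_rcnn_resnet50.etlt cal.bin; do
  [ -f "$MODEL_DIR/$f" ] && echo "$f: ok" || echo "$f: MISSING"
done
```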
"Previously I had /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.7.1.3 installed and that didn't work. So per the above instructions, I ran the following commands:"
→ Clear the GStreamer cache and try again:
rm -rf ~/.cache/gstreamer-1.0/