Deepstream-7.1 error Secondary_VehicleMake




pipeline_config.txt (2.1 KB)

Hello, this is JetsonMom. I got Primary_Detector working and am now working on Secondary_VehicleMake, but I keep getting the error below.
pipeline_config.txt is attached.

orin@orin-desktop:~/deepstream_test2$ deepstream-app -c pipeline_config.txt

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:291>: Pipeline ready

Opening in BLOCKING MODE
NvMMLiteOpen:Block:BlockType=261
NvMMLiteBlockCreate:Block:BlockType=261
** INFO: <bus_callback:277>: Pipeline running

0:00:00.664323446 14119 0xaaaaddfa1f00 ERROR nvinfer gstnvinfer.cpp:678:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Error in NvDsInferContextImpl::parseBoundingBox() <nvdsinfer_context_impl_output_parsing.cpp:60> [UID = 2]: Could not find output coverage layer for parsing objects
0:00:00.664401369 14119 0xaaaaddfa1f00 ERROR nvinfer gstnvinfer.cpp:678:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:736> [UID = 2]: Failed to parse bboxes
Segmentation fault (core dumped)
This is the error I get. If anyone has run into this, please help.

This problem may be caused by the pgie failing to parse the bounding boxes. Can you regenerate the engine file? Also, please use GDB to capture the full stack trace:

gdb --args  deepstream-app -c pipeline_config.txt

Thank you. I’m 66 years old and just following along, so I don’t know much and this is hard for me. I ran it under GDB with (gdb) run.
===> It only recognizes cars, people, bicycles, etc., and then I get:
ERROR: [TRT]: ICudaEngine::getTensorIOMode: Error Code 3: Internal Error (Given invalid tensor name: predictions/Softmax. Get valid tensor names with getIOTensorName())

orin@orin-desktop:~/deepstream_test2$ gdb --args deepstream-app -c pipeline_config.txt
GNU gdb (Ubuntu 12.1-0ubuntu1~22.04.2) 12.1
Copyright (C) 2022 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Type “show copying” and “show warranty” for details.
This GDB was configured as “aarch64-linux-gnu”.
Type “show configuration” for configuration details.
For bug reporting instructions, please see:
https://www.gnu.org/software/gdb/bugs/.
Find the GDB manual and other documentation resources online at:
http://www.gnu.org/software/gdb/documentation/.

For help, type “help”.
Type “apropos word” to search for commands related to “word”…
Reading symbols from deepstream-app…
(No debugging symbols found in deepstream-app)
(gdb) run
Starting program: /usr/bin/deepstream-app -c pipeline_config.txt
[Thread debugging using libthread_db enabled]
Using host libthread_db library “/lib/aarch64-linux-gnu/libthread_db.so.1”.
[Detaching after fork from child process 5635]

(gst-plugin-scanner:5635): GStreamer-WARNING **: 20:54:22.041: Failed to load plugin ‘/opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_udp.so’: librivermax.so.0: cannot open shared object file: No such file or directory

(gst-plugin-scanner:5635): GStreamer-WARNING **: 20:54:22.123: Failed to load plugin ‘/opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_inferserver.so’: libtritonserver.so: cannot open shared object file: No such file or directory

(gst-plugin-scanner:5635): GStreamer-WARNING **: 20:54:22.306: Failed to load plugin ‘/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so’: librivermax.so.0: cannot open shared object file: No such file or directory

(gst-plugin-scanner:5635): GStreamer-WARNING **: 20:54:22.307: Failed to load plugin ‘/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so’: libtritonserver.so: cannot open shared object file: No such file or directory
[New Thread 0xffffc17b3840 (LWP 5637)]
** INFO: <create_encode_file_bin:364>: Could not create HW encoder. Falling back to SW encoder
[New Thread 0xffffbac78840 (LWP 5638)]
[New Thread 0xffffba22b840 (LWP 5639)]
[New Thread 0xffffb98c5840 (LWP 5640)]
[New Thread 0xffffb90b5840 (LWP 5641)]
[New Thread 0xffffb88a5840 (LWP 5642)]
[New Thread 0xffffa3ff8840 (LWP 5643)]
[New Thread 0xffffa37e8840 (LWP 5644)]
[New Thread 0xffffa2fd8840 (LWP 5645)]
[New Thread 0xffffa27c8840 (LWP 5646)]
Setting min object dimensions as 16x16 instead of 1x1 to support VIC compute mode.
0:00:02.697345201 5633 0xaaaaab6e3060 WARN nvinfer gstnvinfer.cpp:681:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Warning from NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1243> [UID = 2]: Warning, OpenCV has been deprecated. Using NMS for clustering instead of cv::groupRectangles with topK = 20 and NMS Threshold = 0.5
WARNING: [TRT]: Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
0:00:02.890028142 5633 0xaaaaab6e3060 INFO nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:2092> [UID = 2]: deserialized trt engine from :/home/orin/deepstream_test2/models/Secondary_VehicleMake/resnet18_vehiclemakenet_pruned.onnx_b1_gpu0_int8.engine
Implicit layer support has been deprecated
INFO: [Implicit Engine Info]: layers num: 0

0:00:02.890182006 5633 0xaaaaab6e3060 INFO nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2195> [UID = 2]: Use deserialized engine model: /home/orin/deepstream_test2/models/Secondary_VehicleMake/resnet18_vehiclemakenet_pruned.onnx_b1_gpu0_int8.engine
[New Thread 0xffffa1381840 (LWP 5647)]
[New Thread 0xffffa0b71840 (LWP 5648)]
[New Thread 0xffff93ff8840 (LWP 5649)]
0:00:02.951372086 5633 0xaaaaab6e3060 INFO nvinfer gstnvinfer_impl.cpp:343:notifyLoadModelStatus:<secondary_gie_0> [UID 2]: Load new model:/home/orin/deepstream_test2/config_infer_secondary_vehiclemake.txt sucessfully
[New Thread 0xffff937e8840 (LWP 5650)]
[New Thread 0xffff92fd8840 (LWP 5651)]
Setting min object dimensions as 16x16 instead of 1x1 to support VIC compute mode.
WARNING: [TRT]: Using an engine plan file across different models of devices is not recommended and is likely to affect performance or even cause errors.
0:00:02.996922660 5633 0xaaaaab6e3060 INFO nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:2092> [UID = 1]: deserialized trt engine from :/home/orin/deepstream_test2/models/Primary_Detector/resnet18_trafficcamnet_pruned.onnx_b1_gpu0_int8.engine
Implicit layer support has been deprecated
INFO: [Implicit Engine Info]: layers num: 0

0:00:02.997028330 5633 0xaaaaab6e3060 INFO nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2195> [UID = 1]: Use deserialized engine model: /home/orin/deepstream_test2/models/Primary_Detector/resnet18_trafficcamnet_pruned.onnx_b1_gpu0_int8.engine
[New Thread 0xffff927c8840 (LWP 5652)]
[New Thread 0xffff91fb8840 (LWP 5653)]
[New Thread 0xffff917a8840 (LWP 5654)]
0:00:03.004536146 5633 0xaaaaab6e3060 INFO nvinfer gstnvinfer_impl.cpp:343:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/home/orin/deepstream_test2/config_infer_primary.txt sucessfully
[New Thread 0xffff90f98840 (LWP 5655)]
[New Thread 0xffff6fff8840 (LWP 5656)]
[New Thread 0xffff6f7e8840 (LWP 5657)]
[New Thread 0xffff6efd8840 (LWP 5658)]
[New Thread 0xffff6e7c8840 (LWP 5659)]

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:291>: Pipeline ready

[New Thread 0xffff6dfb8840 (LWP 5660)]
[New Thread 0xffff6d7a8840 (LWP 5661)]
[New Thread 0xffff6cf98840 (LWP 5662)]
[New Thread 0xffff47378840 (LWP 5664)]
[New Thread 0xffff46b68840 (LWP 5665)]
[New Thread 0xffff44358840 (LWP 5666)]
[New Thread 0xffff41b48840 (LWP 5667)]
[New Thread 0xffff3d338840 (LWP 5668)]
[Detaching after vfork from child process 5669]
[Detaching after vfork from child process 5672]
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
[New Thread 0xffff39fef840 (LWP 5675)]
[New Thread 0xffff397df840 (LWP 5676)]
[New Thread 0xffff38fcf840 (LWP 5677)]
NvMMLiteBlockCreate : Block : BlockType = 261
[New Thread 0xffff387bf840 (LWP 5678)]
[New Thread 0xffff37faf840 (LWP 5679)]
** INFO: <bus_callback:277>: Pipeline running

[New Thread 0xffff3779f840 (LWP 5680)]
[New Thread 0xffff36f8f840 (LWP 5681)]
[New Thread 0xffff3677f840 (LWP 5682)]
[New Thread 0xffff35f6f840 (LWP 5683)]
[New Thread 0xffff3575f840 (LWP 5684)]
[New Thread 0xffff34f4f840 (LWP 5685)]
[New Thread 0xffff27ff8840 (LWP 5686)]
[New Thread 0xffff277e8840 (LWP 5687)]
[New Thread 0xffff26fd8840 (LWP 5688)]
[New Thread 0xffff267c8840 (LWP 5689)]
[New Thread 0xffff25fb8840 (LWP 5690)]
[New Thread 0xffff257a8840 (LWP 5691)]
[New Thread 0xffff24f98840 (LWP 5692)]
0:00:04.935915219 5633 0xaaaaab4ba300 ERROR nvinfer gstnvinfer.cpp:678:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Error in NvDsInferContextImpl::parseBoundingBox() <nvdsinfer_context_impl_output_parsing.cpp:60> [UID = 2]: Could not find output coverage layer for parsing objects
0:00:04.935998007 5633 0xaaaaab4ba300 ERROR nvinfer gstnvinfer.cpp:678:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 2]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:736> [UID = 2]: Failed to parse bboxes

Thread 12 “deepstream-app” received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0xffffa1381840 (LWP 5647)]
0x0000ffffbfc5afa0 in ?? () from /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so
(gdb) bt
#0 0x0000ffffbfc5afa0 in ()
at /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so
#1 0x0000ffffbfc580b8 in ()
at /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_infer.so
#2 0x0000fffff7c94990 in () at /lib/aarch64-linux-gnu/libglib-2.0.so.0
#3 0x0000fffff6b8d5b8 in start_thread (arg=0x0) at ./nptl/pthread_create.c:442
#4 0x0000fffff6bf5edc in thread_start ()
at …/sysdeps/unix/sysv/linux/aarch64/clone.S:79
(gdb)

Your secondary model is a classification model, so set process-mode=2 and try again. You can also refer to the configuration in deepstream-test2.
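For reference, a minimal SGIE config sketch for a classifier, modeled on the deepstream-test2 sample. The engine path is the one from the logs above; the labels path, class ID, and threshold are assumptions for illustration, so adjust them to your setup:

```ini
[property]
gpu-id=0
# Run as a secondary classifier on objects from the primary GIE,
# not as a detector on full frames.
process-mode=2
# network-type=1 marks the model as a classifier, so nvinfer uses the
# classifier output parser instead of parseBoundingBox() (the source of
# the "Could not find output coverage layer" error above).
network-type=1
gie-unique-id=2
operate-on-gie-id=1
# Assumption: classify only the "car" class (class 0) from the primary detector.
operate-on-class-ids=0
classifier-threshold=0.51
model-engine-file=models/Secondary_VehicleMake/resnet18_vehiclemakenet_pruned.onnx_b1_gpu0_int8.engine
# Assumed location of the VehicleMake labels file.
labelfile-path=models/Secondary_VehicleMake/labels.txt
```

With process-mode=1 (the default for a detector-style config), nvinfer treats the model as a full-frame detector and tries to parse bounding boxes from it, which matches the parseBoundingBox() failure in the logs.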

Yes, thanks!

--------- Original message ---------

From: Luong Quang Dung via NVIDIA Developer Forums notifications@nvidia.discoursemail.com
To: jmerrier@hanmail.net
Date: 25.02.25 01:42 GMT+0900
Subject: [NVIDIA Developer Forums] [Intelligent Video Analytics/DeepStream SDK] Deepstream-7.1 error Secondary_VehicleMake

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.