Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) x86 RTX-3060
• DeepStream Version 6.1
• JetPack Version (valid for Jetson only) N/A
• TensorRT Version 8.2.5.1
• Triton Version 2.24.0
• NVIDIA GPU Driver Version (valid for GPU only) 8.2.5.1
• Issue Type( questions, new requirements, bugs) questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
Hi,
I am trying to use multi-batch inference via the nvinferserver
plugin to handle multiple input sources with the deepstream-app, so ideally the batch size should equal the number of input sources. Per the Gst-nvinferserver documentation, the Triton Inference Server needs to be installed to handle the multi-batch scenario, so I installed Triton from source (without Docker); the other components, including DeepStream and TensorRT, were also installed without Docker. The deepstream-app works fine with the nvinfer plugin type, but when I tried the nvinferserver plugin with the app config settings below,
[primary-gie]
enable=1
#(0): nvinfer; (1): nvinferserver
plugin-type=1
gpu-id=0
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
gie-unique-id=1
nvbuf-memory-type=0
config-file=config/config_yoloV8.txt
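For context, my understanding is that for multiple sources (say four), both the streammux and primary-gie batch sizes should match the source count; a sketch of the groups I intend to use (values are placeholders for my setup, not a verified config):

```
[streammux]
batch-size=4

[primary-gie]
plugin-type=1
batch-size=4
```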
After running
gst-inspect-1.0 nvinferserver
it shows the output below:
(gst-plugin-scanner:195446): GStreamer-WARNING **: 21:58:39.579: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.0: cannot open shared object file: No such file or directory
(gst-plugin-scanner:195446): GStreamer-WARNING **: 21:58:39.708: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_ucx.so': libucs.so.0: cannot open shared object file: No such file or directory
Factory Details:
Rank primary (256)
Long-name NvInferServer plugin
Klass NvInferServer Plugin
Description Nvidia DeepStreamSDK TensorRT plugin
Author NVIDIA Corporation. Deepstream for Tesla forum: https://devtalk.nvidia.com/default/board/209
Plugin Details:
Name nvdsgst_inferserver
Description NVIDIA DeepStreamSDK TensorRT Inference Server plugin
Filename /usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so
Version 6.1.0
License Proprietary
Source module nvinferserver
Binary package NVIDIA DeepStreamSDK TensorRT Inference Server plugin
Origin URL http://nvidia.com/
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstBaseTransform
+----GstNvInferServer
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
video/x-raw(memory:NVMM)
format: { (string)NV12, (string)RGBA }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-raw(memory:NVMM)
format: { (string)NV12, (string)RGBA }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
Element has no clocking capabilities.
Element has no URI handling capabilities.
Pads:
SINK: 'sink'
Pad Template: 'sink'
SRC: 'src'
Pad Template: 'src'
Element Properties:
batch-size : Maximum batch size for inference
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 1024 Default: 0
config-file-path : Path to the configuration file for this instance of nvinferserver
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
String. Default: ""
infer-on-class-ids : Operate on objects with specified class ids
Use string with values of class ids in ClassID (int) to set the property.
e.g. 0:2:3
flags: readable, writable, changeable only in NULL or READY state
String. Default: ""
infer-on-gie-id : Infer on metadata generated by GIE with this unique ID.
Set to -1 to infer on all metadata.
flags: readable, writable, changeable only in NULL or READY state
Integer. Range: -1 - 2147483647 Default: -1
interval : Specifies number of consecutive batches to be skipped for inference
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 2147483647 Default: 0
name : The name of the object
flags: readable, writable
String. Default: "nvinferserver0"
parent : The parent of the object
flags: readable, writable
Object of type "GstObject"
process-mode : Inferserver processing mode, (0):None, (1)FullFrame, (2)ClipObject
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 2 Default: 0
qos : Handle Quality-of-Service events
flags: readable, writable
Boolean. Default: false
raw-output-generated-callback: Pointer to the raw output generated callback funtion
(type: gst_nvinfer_server_raw_output_generated_callback in 'gstnvdsinfer.h')
flags: readable, writable, changeable only in NULL or READY state
Pointer.
raw-output-generated-userdata: Pointer to the userdata to be supplied with raw output generated callback
flags: readable, writable, changeable only in NULL or READY state
Pointer.
unique-id : Unique ID for the element. Can be used to identify output of the element
flags: readable, writable, changeable only in NULL or READY state
Unsigned Integer. Range: 0 - 4294967295 Default: 0
Some relevant issues, like this one, emphasize using the Docker image from containers/deepstream, but I am not comfortable with Docker, so I installed everything manually without it.
After running the deepstream-app, it gives the error below:
(ds-app:198149): GLib-GObject-WARNING **: 22:04:55.013: g_object_set_is_valid_property: object class 'GstNvInferServer' has no property named 'input-tensor-meta'
** INFO: <create_primary_gie_bin:145>: gpu-id: 0 in primary-gie group is ignored, only accept in nvinferserver's config
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
gstnvtracker: Batch processing is ON
gstnvtracker: Past frame output is OFF
[NvMultiObjectTracker] Initialized
[libprotobuf ERROR /home/amkale/jitendrak/Triton-Github/client/build/_deps/repo-third-party-build/grpc-repo/src/grpc/third_party/protobuf/src/google/protobuf/text_format.cc:317] Error parsing text-format nvdsinferserver.config.PluginControl: 61:1: Extension "property" is not defined or is not an extension of "nvdsinferserver.config.PluginControl".
0:00:00.122810548 198149 0x7f408c002330 WARN nvinferserver gstnvinferserver_impl.cpp:441:start:<primary_gie> error: Configuration file parsing failed
0:00:00.122818756 198149 0x7f408c002330 WARN nvinferserver gstnvinferserver_impl.cpp:441:start:<primary_gie> error: Config file path: /home/swap/ultraanalytics/Video-Analytics/config/config_yoloV8.txt
0:00:00.122831738 198149 0x7f408c002330 WARN nvinferserver gstnvinferserver.cpp:459:gst_nvinfer_server_start:<primary_gie> error: gstnvinferserver_impl start failed
0:00:00.122836948 198149 0x7f408c002330 WARN GST_PADS gstpad.c:1142:gst_pad_set_active:<primary_gie:sink> Failed to activate pad
[NvMultiObjectTracker] De-initialized
** ERROR: <main:716>: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie: Configuration file parsing failed
Debug info: gstnvinferserver_impl.cpp(441): start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInferServer:primary_gie:
Config file path: /home/swap/ultraanalytics/Video-Analytics/config/config_yoloV8.txt
ERROR from primary_gie: gstnvinferserver_impl start failed
Debug info: gstnvinferserver.cpp(459): gst_nvinfer_server_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInferServer:primary_gie
App run failed
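In case it helps pinpoint the parsing error: my understanding from the Gst-nvinferserver documentation is that, unlike nvinfer, the nvinferserver config file must be written in protobuf text format (the nvdsinferserver.config.PluginControl schema) rather than INI-style [property] groups, which would explain the `Extension "property" is not defined` message. A minimal sketch of what I believe the file should look like (the model name, repo path, label file, and class count are placeholders for my YOLOv8 setup, not a verified config):

```
infer_config {
  unique_id: 1
  gpu_ids: [0]
  max_batch_size: 4
  backend {
    triton {
      model_name: "yolov8"
      version: -1
      model_repo {
        root: "./triton_model_repo"
        log_level: 2
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_LINEAR
    normalize { scale_factor: 0.0039215686 }
  }
  postprocess {
    labelfile_path: "labels.txt"
    detection {
      num_detected_classes: 80
      nms { confidence_threshold: 0.25 iou_threshold: 0.45 }
    }
  }
}
input_control {
  process_mode: PROCESS_MODE_FULL_FRAME
  interval: 0
}
```

Is converting my config_yoloV8.txt to this format the right fix, or is something else missing from my non-Docker install?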
Any help on this would be appreciated.
Thank You!