Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 6.1
• JetPack Version (valid for Jetson only)
• TensorRT Version: 8.4.1
• NVIDIA GPU Driver Version (valid for GPU only): 535.129.03
• Issue Type (questions, new requirements, bugs): questions
• How to reproduce the issue? (This is for bugs. Including which sample app is used, the configuration file contents, the command line used, and other details for reproducing)
• Requirement details (This is for new requirements. Including the module name — for which plugin or for which sample application — and the function description)
I want to run inference from DeepStream against a Triton server running in a separate Docker container.
I want to run inference across several processes using Python multiprocessing.
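For reference, a minimal sketch of the multi-process layout being described, assuming one DeepStream pipeline per child process all pointing at the same Triton gRPC endpoint. `run_pipeline` is a hypothetical stub standing in for the real pipeline construction; only the process layout is shown here.

```python
import multiprocessing as mp

def run_pipeline(worker_id: int, source_uri: str) -> str:
    """Stub for one DeepStream pipeline run.

    In a real setup this would build and run a GStreamer pipeline with
    nvinferserver targeting the Triton gRPC endpooint; here it only returns
    a marker string so the process layout itself can be exercised.
    """
    return f"worker-{worker_id} finished {source_uri}"

def launch_workers(sources):
    # 'spawn' avoids inheriting CUDA/GStreamer state from the parent via
    # fork, which is a common source of trouble when each child process
    # initializes CUDA on its own.
    ctx = mp.get_context("spawn")
    with ctx.Pool(processes=len(sources)) as pool:
        return pool.starmap(run_pipeline, enumerate(sources))

if __name__ == "__main__":
    for line in launch_workers(["file:///videos/a.mp4", "file:///videos/b.mp4"]):
        print(line)
```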
However, with enable_cuda_buffer_sharing set to true, the errors below keep occurring. What is the cause?
ERROR: infer_grpc_client.cpp:223 Failed to register CUDA shared memory.
ERROR: infer_grpc_client.cpp:311 Failed to set inference input: shared memory region 'inbuf_0x7faaf00018e0' already in manager
ERROR: infer_grpc_backend.cpp:140 gRPC backend run failed to create request for model: yolov8_16batch_dynamic
ERROR: infer_grpc_backend.cpp:195 TritonSimple failed to run inference on model yolov8_16batch_dynamic, nvinfer error:NVDSINFER_TRITON_ERROR
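For context, a minimal sketch of the relevant nvinferserver gRPC backend section where enable_cuda_buffer_sharing is set. The model name is taken from the error log above; the URL and batch size are placeholders for this setup, and unrelated fields are omitted.

```
infer_config {
  unique_id: 1
  max_batch_size: 16
  backend {
    triton {
      model_name: "yolov8_16batch_dynamic"
      version: -1
      grpc {
        url: "localhost:8001"
        enable_cuda_buffer_sharing: true
      }
    }
  }
}
```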
Please provide the complete information requested in the template above, plus:
• The pipeline being used
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.
How did you start the Docker containers? Can you share the exact command lines?
Which apps/pipelines are you running? Can you share the steps to reproduce?
Please use English
As an additional follow-up: is this the same issue as the one described in this question?