Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) RTX 4050
• DeepStream Version 7.1
• JetPack Version (valid for Jetson only)
• TensorRT Version 10.3.0.26-1+cuda12.5
• NVIDIA GPU Driver Version (valid for GPU only) 566.36 (NVCC Version 12.6)
• Issue Type( questions, new requirements, bugs) Question
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
When running on a Windows 11 system with Docker Desktop, I get the following error:
app-1 | =============================
app-1 | == Triton Inference Server ==
app-1 | =============================
app-1 |
app-1 | NVIDIA Release 24.08 (build 107631419)
app-1 | Triton Server Version 2.49.0
app-1 |
app-1 | Copyright (c) 2018-2024, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
app-1 |
app-1 | Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES. All rights reserved.
app-1 |
app-1 | This container image and its contents are governed by the NVIDIA Deep Learning Container License.
app-1 | By pulling and using the container, you accept the terms and conditions of this license:
app-1 | https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license
app-1 |
app-1 | ERROR: The NVIDIA Driver is present, but CUDA failed to initialize. GPU functionality will not be available.
app-1 | [[ Named symbol not found (error 500) ]]
app-1 |
The same Docker setup works fine on a Linux system.
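For reference, the container is launched via Docker Compose roughly as sketched below. This is an illustrative sketch only: the service name, image tag, model-repository path, and command are assumptions inferred from the log above ("NVIDIA Release 24.08", "Triton Server Version 2.49.0"), not the exact compose file.

```yaml
# docker-compose.yml -- minimal sketch, assuming the Triton 24.08 NGC image.
# Image tag, service name, command, and volume paths are assumptions.
services:
  app:
    image: nvcr.io/nvidia/tritonserver:24.08-py3      # assumed tag matching "NVIDIA Release 24.08"
    command: tritonserver --model-repository=/models  # hypothetical model repository path
    volumes:
      - ./models:/models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

The deploy.resources.reservations.devices block is the Compose equivalent of `docker run --gpus all`; the same file is used unchanged on the Linux machine where the container starts without the CUDA initialization error.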