Hi @sophwats,
So I got an error when accessing this link. I can download it from my terminal and browser on the same engine, but not from inside the container. I think it relates to an SSL certificate issue in the container. If I import the certificate manually while the container is running, then I can download the link:
https://xfiles.ngc.nvidia.com/org/nim/team/meta/models/llama-3.2-1b-instruct/versions/hf-9213176-tool_calling/files/generation_config.json?ssec-algo=AES256&versionId=B8sLN3vF7wHO5xH2V2fpsGeE_ndN3goU&ssec-key=J3HTtMzq1jyfJs2ae22sI99X3rdZsF1RGp%2Bpoy4Mqpb74SOe%2F9kYnJIRMYY24IpFq4c%2BFu1Yq3sQmvjpFejugf77oj%2Bthl%2BcPfQf6AJTshYglA0Fp1NFxB4jRuYLUiUe%2BbNCamzrwK58oh25hxxT9YX0O1O9J%2Bb7OyYRf%2BFG75k%2FKthV2%2BUdAU%2F6onM7emUzEG0laBiffmbVXwxSljXipyfZBaPurmUEmiSTniiOb9%2BY8GTzRQ2yzQmmYfAuZLB9p8xZfUsHW0YfvIATli8WEBemFpW4V38NfIXYVR3c%2FPbE4q6bsOfFttM3nJguIWHZWLyjuMrWcMFfvqfkpf7Uh04o42yt1urkc%2FlFKJMn22gY2abv4GCoPp9VkGYYH6uFfs93K9sxV%2BX%2FB%2BTV27ktuSmLeWp6k0RL2yYntX5Ad9tvns1WcHlp2hwgaHMNVvJiZkQdaYfXBklK9oXnJQUWI%2BU%2BIN4Xw%2BRfzswJVJSucOLPg%2BnWmuN9%2F8XmI0mK4FNI&kid=bXJrLWU3OGM1M2FhZjE4YzRiNmJiNjlkYmRhZjcxNjA3YWEw&ssec-enabled=true&Expires=1754040510&Signature=sIxvqzJCoYmfz6hHPb1uWo0C3XYBQ78gl7zBF~74ClOJ7NOVFJyHb7xGmA-3liZazHF3y7VJ6eNNMPtKynBnPasAj4qddR~mbLyC0wmi6N2-8ZGCfkjTT-jEjDnDNu4Xw9fvJhy0BIZbO-ngcsR6Eq36ab8sOB0QTAs85-50XIwQiJivimlE2OZjh4v6K2cRR3O4bCp6YDTbRSzufsh2U34W2g34dDjjdcVSL7hcW7tpmN5wcLwpGMP4frEpvjNUxIgP404BVZ6ef9-UORoAfA9uWW6gCG2smZb7yP3viG0H9rrlNpY4zCAzsX3mO~G3y7wfnx2qt5heZLG8uMOuIQ__&Key-Pair-Id=KCX06E8E9L60W
But how can I make this work directly with a single docker run command, without doing it manually inside the container?
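For reference, this is roughly what I'd like the single command to look like: mount the CA certificate I currently import by hand into the container read-only, and point the common TLS environment variables at it. The certificate path `/etc/ssl/certs/corp-ca.crt` is a placeholder for my actual file, and whether the NIM model downloader honors `SSL_CERT_FILE` / `REQUESTS_CA_BUNDLE` is my assumption, not something I've confirmed in the NIM docs.

```shell
# Sketch only: mount a corporate CA certificate into the container and point
# standard TLS env vars at it. The host path is a placeholder, and it is an
# assumption that the NIM downloader respects these variables.
docker run -it --gpus all --shm-size=16GB \
  -e NGC_API_KEY=$NGC_API_KEY \
  -v /etc/ssl/certs/corp-ca.crt:/etc/ssl/certs/corp-ca.crt:ro \
  -e SSL_CERT_FILE=/etc/ssl/certs/corp-ca.crt \
  -e REQUESTS_CA_BUNDLE=/etc/ssl/certs/corp-ca.crt \
  -v "$LOCAL_NIM_CACHE:/opt/nim/.cache" \
  -u 0 -p 8000:8000 \
  nvcr.io/nim/meta/llama-3.2-1b-instruct:latest
```

Is something along these lines the recommended approach, or is there a supported NIM variable for a custom CA bundle?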
Below is the log output:
datawizard@datawizard-data:~$ docker run -it --gpus all --shm-size=16GB -e NGC_API_KEY=$NGC_API_KEY -e NIM_DISABLE_SSL_VERIFY=true -v "$LOCAL_NIM_CACHE:/opt/nim/.cache" -u 0 -p 8000:8000 nvcr.io/nim/meta/llama-3.2-1b-instruct:latest
===========================================
== NVIDIA Inference Microservice LLM NIM ==
===========================================
NVIDIA Inference Microservice LLM NIM Version 1.10.1
Model: meta/llama-3.2-1b-instruct
Container image Copyright (c) 2016-2025, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
The NIM container is governed by the NVIDIA Software License Agreement (https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-software-license-agreement/) and Product-Specific Terms for AI Products (https://www.nvidia.com/en-us/agreements/enterprise-software/product-specific-terms-for-ai-products/)
A copy of this license can be found under /opt/nim/LICENSE.
The use of this model is governed by the NVIDIA Community Model License (found at https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-community-models-license/).
ADDITIONAL INFORMATION: : Llama 3.2 Community License Agreement (https://www.llama.com/llama3_2/license/). Built with Llama.
INFO 07-31 09:28:28 [__init__.py:256] Automatically detected platform cuda.
INFO 2025-07-31 09:28:29.781 ngc_profile.py:360] Running NIM without LoRA. Only looking for compatible profiles that do not support LoRA.
INFO 2025-07-31 09:28:29.781 ngc_profile.py:362] Detected 1 compatible profile(s).
INFO 2025-07-31 09:28:29.781 ngc_injector.py:158] Valid profile: 4f904d571fe60ff24695b5ee2aa42da58cb460787a968f1e8a09f5a7e862728d (vllm-bf16-tp1-pp1) on GPUs [0]
INFO 2025-07-31 09:28:29.781 ngc_injector.py:322] Selected profile: 4f904d571fe60ff24695b5ee2aa42da58cb460787a968f1e8a09f5a7e862728d (vllm-bf16-tp1-pp1)
INFO 2025-07-31 09:28:29.782 ngc_injector.py:330] Profile metadata: feat_lora: false
INFO 2025-07-31 09:28:29.782 ngc_injector.py:330] Profile metadata: llm_engine: vllm
INFO 2025-07-31 09:28:29.782 ngc_injector.py:330] Profile metadata: pp: 1
INFO 2025-07-31 09:28:29.782 ngc_injector.py:330] Profile metadata: precision: bf16
INFO 2025-07-31 09:28:29.782 ngc_injector.py:330] Profile metadata: tp: 1
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/opt/nim/llm/nim_llm_sdk/entrypoints/launch.py", line 649, in <module>
asyncio.run(main())
File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/opt/nim/llm/nim_llm_sdk/entrypoints/launch.py", line 512, in main
inference_env = prepare_environment()
^^^^^^^^^^^^^^^^^^^^^
File "/opt/nim/llm/nim_llm_sdk/entrypoints/args.py", line 214, in prepare_environment
engine_args, extracted_name = inject_ngc_hub(engine_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/nim/llm/nim_llm_sdk/hub/ngc_injector.py", line 353, in inject_ngc_hub
engine_args = prepare_workspace_from_workspace(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/nim/llm/nim_llm_sdk/hub/ngc_injector.py", line 183, in prepare_workspace_from_workspace
return prepare_workspace_from_repo(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/nim/llm/nim_llm_sdk/hub/ngc_injector.py", line 203, in prepare_workspace_from_repo
cached = repo.get_all()
^^^^^^^^^^^^^^
Exception: ConnectionError: Check your ability to access the remote source and any network/dns/firewall/proxy settings. Details: reqwest::Error { kind: Request, url: "https://xfiles.ngc.nvidia.com/org/nim/team/meta/models/llama-3.2-1b-instruct/versions/hf-9213176-tool_calling/files/generation_config.json?ssec-algo=AES256&versionId=B8sLN3vF7wHO5xH2V2fpsGeE_ndN3goU&ssec-key=J3HTtMzq1jyfJs2ae22sI99X3rdZsF1RGp%2Bpoy4Mqpb74SOe%2F9kYnJIRMYY24IpFq4c%2BFu1Yq3sQmvjpFejugf77oj%2Bthl%2BcPfQf6AJTshYglA0Fp1NFxB4jRuYLUiUe%2BbNCamzrwK58oh25hxxT9YX0O1O9J%2Bb7OyYRf%2BFG75k%2FKthV2%2BUdAU%2F6onM7emUzEG0laBiffmbVXwxSljXipyfZBaPurmUEmiSTniiOb9%2BY8GTzRQ2yzQmmYfAuZLB9p8xZfUsHW0YfvIATli8WEBemFpW4V38NfIXYVR3c%2FPbE4q6bsOfFttM3nJguIWHZWLyjuMrWcMFfvqfkpf7Uh04o42yt1urkc%2FlFKJMn22gY2abv4GCoPp9VkGYYH6uFfs93K9sxV%2BX%2FB%2BTV27ktuSmLeWp6k0RL2yYntX5Ad9tvns1WcHlp2hwgaHMNVvJiZkQdaYfXBklK9oXnJQUWI%2BU%2BIN4Xw%2BRfzswJVJSucOLPg%2BnWmuN9%2F8XmI0mK4FNI&kid=bXJrLWU3OGM1M2FhZjE4YzRiNmJiNjlkYmRhZjcxNjA3YWEw&ssec-enabled=true&Expires=1754040510&Signature=sIxvqzJCoYmfz6hHPb1uWo0C3XYBQ78gl7zBF~74ClOJ7NOVFJyHb7xGmA-3liZazHF3y7VJ6eNNMPtKynBnPasAj4qddR~mbLyC0wmi6N2-8ZGCfkjTT-jEjDnDNu4Xw9fvJhy0BIZbO-ngcsR6Eq36ab8sOB0QTAs85-50XIwQiJivimlE2OZjh4v6K2cRR3O4bCp6YDTbRSzufsh2U34W2g34dDjjdcVSL7hcW7tpmN5wcLwpGMP4frEpvjNUxIgP404BVZ6ef9-UORoAfA9uWW6gCG2smZb7yP3viG0H9rrlNpY4zCAzsX3mO~G3y7wfnx2qt5heZLG8uMOuIQ__&Key-Pair-Id=KCX06E8E9L60W", source: hyper_util::client::legacy::Error(Connect, Custom { kind: Other, error: Custom { kind: InvalidData, error: InvalidCertificate(UnknownIssuer) } }) }
sys:1: RuntimeWarning: coroutine 'main.<locals>.shutdown' was never awaited