What path is created in the NIM cache when a model is downloaded? I want to run the NIM container in an offline environment, so I'd like to download the models ahead of time. I couldn't find any documentation on the folder structure where the models are stored, e.g., something like $NIM_CACHE_DIR/models/llama3-8B or an equivalent, so that the container skips downloading the model.
Hi @mthreet ,
I think I found the answer to your question in the official documentation. Check out this link: Configuring a NIM - NVIDIA Docs
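One practical way to confirm the layout yourself is to let the container populate the cache once while online, inspect the mounted cache directory on the host, and then reuse that same directory (e.g., mounted read-only) in the offline environment. Below is a minimal sketch for inspecting the populated cache; the `LOCAL_NIM_CACHE` variable and the `~/.cache/nim` fallback path are hypothetical placeholders for wherever you mount the cache, not names mandated by the NIM docs.

```python
import os
from pathlib import Path

# Host directory that was mounted into the container as the NIM cache.
# Hypothetical example: adjust to wherever you actually mount the cache.
cache_dir = Path(os.environ.get("LOCAL_NIM_CACHE", Path.home() / ".cache/nim"))


def print_tree(root: Path, max_depth: int = 3) -> None:
    """Print the directory layout under the cache so you can see
    exactly where the container placed the downloaded model files."""
    root = root.resolve()
    for path in sorted(root.rglob("*")):
        depth = len(path.relative_to(root).parts)
        if depth > max_depth:
            continue
        indent = "  " * (depth - 1)
        suffix = "/" if path.is_dir() else ""
        print(f"{indent}{path.name}{suffix}")


if __name__ == "__main__":
    if cache_dir.exists():
        print_tree(cache_dir)
    else:
        print(f"Cache directory not found: {cache_dir}")
```

Running this after a first online start of the container shows the exact paths the NIM created, which you can then copy or mount as-is into the air-gapped host so the container finds the model files and skips the download.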