NanoLLM on Docker

I'm trying to run Python code that uses the NanoLLM library in the Jetson container (Multimodal — NanoLLM 24.7 documentation),
but I got this error:

04:52:52 | INFO | loading /data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026 with MLC
04:52:56 | INFO | NumExpr defaulting to 6 threads.
Traceback (most recent call last):
  File "/home/ailab/Desktop/llmaster/lvlm.py", line 46, in <module>
    model = NanoLLM.from_pretrained(
  File "/opt/NanoLLM/nano_llm/nano_llm.py", line 91, in from_pretrained
    model = MLCModel(model_path, **kwargs)
  File "/opt/NanoLLM/nano_llm/models/mlc.py", line 60, in __init__
    quant = MLCModel.quantize(self.model_path, self.config, method=quantization, max_context_len=max_context_len, **kwargs)
  File "/opt/NanoLLM/nano_llm/models/mlc.py", line 258, in quantize
    os.symlink(model, model_path, target_is_directory=True)
FileNotFoundError: [Errno 2] No such file or directory: '/data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026/llm' -> '/data/models/mlc/dist/models/VILA1.5-3b'
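
From the traceback, the failing call is the os.symlink() in mlc.py, which links the snapshot's llm/ directory into the MLC model directory. As far as I understand, something like this reproduces the failure (paths copied from the error above; the variable names are just for illustration):

```
import os

# the two paths from the FileNotFoundError above
src = "/data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026/llm"
dst = "/data/models/mlc/dist/models/VILA1.5-3b"

print("source exists:", os.path.exists(src))                        # the snapshot's llm/ dir
print("dest parent exists:", os.path.isdir(os.path.dirname(dst)))   # /data/models/mlc/dist/models

# this mirrors mlc.py line 258; Errno 2 from symlink() usually means a
# component of the destination path is missing (a dangling source alone
# would not raise here)
os.symlink(src, dst, target_is_directory=True)
```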

I tried deleting and re-downloading the model, restarting the container, and other things like copying the files by hand into /data/models/mlc, as below:

root@ubuntu:/data/models/mlc/dist/models/VILA1.5-3b# ls
config.json model.safetensors.index.json
generation_config.json special_tokens_map.json
model-00001-of-00002.safetensors tokenizer_config.json
model-00002-of-00002.safetensors tokenizer.model

but in the ls output only config.json is shown in white text; the other files are red (which I believe means their symlinks are broken).

How can I modify the sample code or the Docker setup to get the NanoLLM sample running?
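
For reference, lvlm.py is essentially the multimodal example from the NanoLLM docs, roughly like this (a sketch from memory; the exact arguments in my copy may differ):

```
from nano_llm import NanoLLM

# downloads the HF checkpoint to /data/models/huggingface, then
# quantizes it with MLC under /data/models/mlc/dist
model = NanoLLM.from_pretrained(
    "Efficient-Large-Model/VILA1.5-3b",
    api='mlc',
    quantization='q4f16_ft',
)
```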

Sorry for the late response.
Is this still an issue you need support with? Is there any result you can share?

It's only been 30 minutes since I posted this, so please don't be sorry about a late response (I think it's super fast, lol).

And yes, I'm still working on this issue.

Hi,

It looks like there is no VILA model in your environment:

FileNotFoundError: [Errno 2] No such file or directory: '/data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026/llm' -> '/data/models/mlc/dist/models/VILA1.5-3b'

Before running the sample, have you requested access as described below?
https://dusty-nv.github.io/NanoLLM/models.html#tested-models

Thanks.

For the VILA model, I don't think I need to request any access before using it.
So I just ran the code in the Docker environment and got the error about the missing directory.
All the model data looks like it has been downloaded, and I can see
'/data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026/llm'

Hi,

Sorry for the confusion. VILA doesn't require requesting access.

We gave it a check, and the model downloads fine in our environment.
Do you also see the same in your environment?

# ll /data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026/llm
total 20
drwxr-xr-x 2 root root 4096 Aug 15 08:21 ./
drwxr-xr-x 5 root root 4096 Aug 15 08:21 ../
lrwxrwxrwx 1 root root   55 Aug 15 08:17 config.json -> ../../../blobs/b2ec4d57442d166b846728a699180663011f0a8a
lrwxrwxrwx 1 root root   55 Aug 15 08:17 generation_config.json -> ../../../blobs/bf84ec1a28ba89feb07162d95b06633a40b4975f
lrwxrwxrwx 1 root root   79 Aug 15 08:21 model-00001-of-00002.safetensors -> ../../../blobs/4eed552fa9ca41f3d6fb14b59a481bf12137a37e964df0ec60f412b3ac2a8637
lrwxrwxrwx 1 root root   79 Aug 15 08:18 model-00002-of-00002.safetensors -> ../../../blobs/b63acc16bd9be4e7f900ba7e66ddc82400c3c12d77cd5c2cfa4bc77761c0732d
lrwxrwxrwx 1 root root   55 Aug 15 08:17 model.safetensors.index.json -> ../../../blobs/8b173c9ac8194749df58c92051618c0ff74c4c20
lrwxrwxrwx 1 root root   55 Aug 15 08:17 special_tokens_map.json -> ../../../blobs/14761dcf1466dc232bd41de9c21d4c617b15755e
lrwxrwxrwx 1 root root   55 Aug 15 08:17 tokenizer_config.json -> ../../../blobs/47ab96cd62cc374653a0ea0fb77f9457e0f53481
lrwxrwxrwx 1 root root   79 Aug 15 08:17 tokenizer.model -> ../../../blobs/7aedb3582ecda9fa99ee9242c17a9658f6744db083ee6ebdc8fb14857f84d220

Thanks.

root@ubuntu:/home/ailab/Desktop/llmaster# ls -l /data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026
total 20
lrwxrwxrwx 1 root root 52 Aug 14 08:54 config.json -> ../../blobs/a0bd45e5e5bcdf2e9aae2dc0a659ad5d9c95afb7
-rw-r--r-- 1 root root 6834 Aug 14 09:09 config.json.backup
drwxr-xr-x 2 root root 4096 Aug 14 09:09 llm
drwxr-xr-x 2 root root 4096 Aug 14 08:54 mm_projector
lrwxrwxrwx 1 root root 52 Aug 14 08:54 README.md -> ../../blobs/14a0cbded72d9b3776fad22a63629ba7bc29f41b
lrwxrwxrwx 1 root root 52 Aug 14 08:54 trainer_state.json -> ../../blobs/008d75e3a1129697fe52c34feda6ae4101496ad8
drwxr-xr-x 2 root root 4096 Aug 14 08:59 vision_tower

That's what it looks like for me.

Hi @jksim1833, sorry for the trouble - I noticed this looks like a path on the host device - do you have your user's home directory mounted into the container, or are you trying to run it from outside the container?

Inside the container, can you try manually creating the symlink like this, and see if it works?

ln -s /data/models/huggingface/models--Efficient-Large-Model--VILA1.5-3b/snapshots/42d1dda6807cc521ef27674ca2ae157539d17026/llm /data/models/mlc/dist/models/VILA1.5-3b
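
To confirm the link resolves afterwards, a quick check like this should work (just an illustrative snippet):

```
import os

dst = "/data/models/mlc/dist/models/VILA1.5-3b"
print("link target:", os.readlink(dst))   # where the symlink points
print("resolves:", os.path.exists(dst))   # False would mean a dangling link
```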

If you did not start the container with the jetson-containers run command (which automatically mounts --volume jetson-containers/data:/data), you would need to add that mount to your docker run command when starting it manually.
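
Inside the container, you can sanity-check that the data volume is mounted with something like this (illustrative paths):

```
import os

# these trees live on the mounted jetson-containers/data volume
for p in ("/data", "/data/models/huggingface"):
    print(p, "OK" if os.path.isdir(p) else "MISSING - is /data mounted?")
```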

Yes, I built a custom Docker image that inherits from the NanoLLM container and adds some libraries. lvlm.py is just a copy of the example code (Multimodal — NanoLLM 24.7 documentation), with permission granted to access local files.

After changing my docker run command, it works well:

sudo docker run -it --network host --env="DISPLAY" --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" -v /home/ailab:/home/ailab -v /home/ailab/Desktop/jetson-containers/data:/data --user root vlmcontainer:llmaster

Thanks for your help and the wonderful work!

OK great thanks, glad you got it working!
