Run from the terminal with llava.serve.cli on Nvidia Jetson AGX Orin

Hi,

I am trying to get LLaVA to work on an NVIDIA Jetson AGX Orin 64 GB with JetPack 5 (L4T r35.x), booted headless. I am following the instructions here; however, when I run the command

./run.sh $(./autotag llava) \
  python3 -m llava.serve.cli \
    --model-path liuhaotian/llava-v1.5-13b \
    --image-file /data/images/hoover.jpg

it starts downloading the required files, fills up the entire disk, and then fails.
Is there anything missing from the instructions provided?

Before running the above command, I cloned the jetson-containers repository.
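For reference, my setup steps were roughly the following (the install step may differ slightly depending on the version of the jetson-containers repo, so treat this as a sketch):

# clone jetson-containers and install its dependencies
git clone https://github.com/dusty-nv/jetson-containers
bash jetson-containers/install.sh

# run.sh and autotag live in the repo root
cd jetson-containers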

Hi,

Please check the remaining storage space in your environment.
Running llava.serve.cli with Llava-13B requires a 26GB download. You can find this in the doc below:
https://www.jetson-ai-lab.com/tutorial_llava.html#2-run-from-the-terminal-with-llavaservecli

What you need
3. Sufficient storage space (preferably with NVMe SSD).

  • 6.1GB for llava container
  • 14GB for Llava-7B (or 26GB for Llava-13B)
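If it is the built-in eMMC that fills up, one common workaround is to keep Docker's data on the NVMe SSD. A rough sketch is below; the /mnt/nvme mount point is only an example, so adjust it to your setup:

# check current free space and where Docker stores its data
df -h
sudo docker info | grep "Docker Root Dir"

# move Docker's data root to the SSD (example path)
sudo systemctl stop docker
sudo rsync -aP /var/lib/docker/ /mnt/nvme/docker/

# then set "data-root": "/mnt/nvme/docker" in /etc/docker/daemon.json
sudo systemctl start docker

The model weights themselves are typically cached under jetson-containers/data on the host (bind-mounted as /data in the container), so that directory also needs enough free space.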

Thanks.

Hi @AastaLLL, thanks for your response. I have seen that information. I started from a brand-new Jetson AGX Orin, so most of the disk was empty, which is more than what Llava-13B needs according to the document you shared. But it still runs out of storage. I thought maybe I need to make changes to the following command,

./run.sh $(./autotag llava) \
  python3 -m llava.serve.cli \
    --model-path liuhaotian/llava-v1.5-13b \
    --image-file /data/images/hoover.jpg

or to run.sh itself. If not, I will try llava-v1.5-7b to see whether that resolves the problem.
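For the 7B test I would only swap the model path, assuming the 7B checkpoint is liuhaotian/llava-v1.5-7b; something like this (not tested yet on my side):

./run.sh $(./autotag llava) \
  python3 -m llava.serve.cli \
    --model-path liuhaotian/llava-v1.5-7b \
    --image-file /data/images/hoover.jpg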

Hi,

Could you help us run the following experiments?

  1. Check how much storage remains right before the failure (see the commands after this list):
$ df -h
  2. Test whether the llava-v1.5-7b model works.
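For example, you could monitor the disk while the download is running, using standard Linux/Docker tools (nothing here is specific to the llava container):

# refresh disk usage every few seconds during the download
watch -n 5 df -h

# after the failure, check how much of the space is held by Docker images/containers
sudo docker system df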

Thanks.

Hi @AastaLLL

Thanks for your response! I will give an update shortly.

Thanks
Mahshid
