Hi,
I’m currently working with the Orin Nano dev kit, and my goal is to get a video streaming demo working with any of the VILA models.
What I’ve done so far:
- Successfully upgraded to JetPack 6
- Attempted to run the example below (it uses a different VLM, but I decided to stick to the script as-is; the tooltip said it’s still supported on my Nano dev kit):
jetson-containers run $(autotag nano_llm) \
python3 -m nano_llm --api=mlc \
--model liuhaotian/llava-v1.6-vicuna-7b \
--max-context-len 768 \
--max-new-tokens 128
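Before posting this, a diagnostic I’m considering running inside the container (just a sketch; the two package names are my assumption based on the rename) to see which package is actually importable:

```python
# Check which of the two package names is actually installed in the
# container (names are my guess based on the local_llm -> nano_llm rename).
import importlib.util

for name in ("nano_llm", "local_llm"):
    spec = importlib.util.find_spec(name)
    print(name, "->", "found" if spec is not None else "missing")
```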
I got the following output:
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 129, in _get_module_details
    spec = importlib.util.find_spec(mod_name)
  File "/usr/lib/python3.10/importlib/util.py", line 94, in find_spec
    parent = __import__(parent_name, fromlist=['__path__'])
ModuleNotFoundError: No module named 'local_llm'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/opt/NanoLLM/nano_llm/__main__.py", line 9, in <module>
    runpy.run_module('local_llm.chat', run_name='__main__')
  File "/usr/lib/python3.10/runpy.py", line 220, in run_module
    mod_name, mod_spec, code = _get_module_details(mod_name)
  File "/usr/lib/python3.10/runpy.py", line 138, in _get_module_details
    raise error(msg.format(mod_name, type(ex).__name__, ex)) from ex
ImportError: Error while finding module specification for 'local_llm.chat' (ModuleNotFoundError: No module named 'local_llm')
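For what it’s worth, the failure mode reproduces generically: asking runpy for a dotted module name that doesn’t exist yields exactly this chained ImportError, which makes me suspect the 'local_llm.chat' string in __main__.py is simply a stale reference from before the rename (the module name below is a placeholder, not a real package):

```python
import runpy

# Minimal reproduction: runpy.run_module() on a missing dotted module
# name raises an ImportError whose direct cause is ModuleNotFoundError,
# matching the traceback above. 'definitely_not_installed' is a placeholder.
try:
    runpy.run_module("definitely_not_installed.chat", run_name="__main__")
except ImportError as exc:
    print(type(exc).__name__, "caused by", type(exc.__cause__).__name__)
```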
I’m confused about the local_llm → NanoLLM transition. I thought it was a single library that was simply renamed.
- Is NanoLLM an evolution that depends on the old local_llm, which isn’t included in its install?
- Is NanoLLM only part way through the transition, so some of the examples don’t work yet?
Thanks!
