Hello NVIDIA,
As described on the Agent Studio page of the Jetson AI Lab, how can I use the UserPrompt node to add files from the terminal for the model to interpret? This is the UserPrompt node within the Agent Studio Node Editor.
Thank you in advance!
Hi,
You can find an example of how to use the plugin below:
https://github.com/dusty-nv/NanoLLM/blob/main/nano_llm/agents/chat.py
Or this can be tested with nano_llm.chat directly:
$ git clone https://github.com/dusty-nv/NanoLLM
$ jetson-containers run -v ${PWD}/NanoLLM:/opt/NanoLLM $(autotag nano_llm)
$ python3 -m nano_llm.chat --model meta-llama/Llama-2-7b-chat-hf --api=mlc --quantization q4f16_ft
...
05:51:18 | INFO | using chat template 'llama-2' for model Llama-2-7b-chat-hf
05:51:18 | INFO | model 'Llama-2-7b-chat-hf', chat template 'llama-2' stop tokens: ['</s>'] -> [2]
>> PROMPT:
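If the goal is simply to feed a file's contents to the model from the terminal, one generic approach (a sketch only, not the official UserPrompt implementation and not part of NanoLLM) is to read the file yourself and pass the resulting text as the prompt string, for example through the chat program's prompt argument:

```python
# Sketch: read a file named on the command line and wrap its contents
# into a prompt string. build_prompt() is a hypothetical helper, not a
# NanoLLM API -- adapt it to however you invoke your chat agent.
import sys
from pathlib import Path

def build_prompt(file_path, question="Summarize this file."):
    """Return a single prompt string that embeds the file's contents."""
    text = Path(file_path).read_text()
    return f"{question}\n\n```\n{text}\n```"

if __name__ == "__main__":
    prompt = build_prompt(sys.argv[1])
    # Print it so it can be captured by the shell and passed along,
    # e.g. to the interactive >> PROMPT: input shown above.
    print(prompt)
```

This keeps the file handling outside the agent graph, so it works the same whether you run nano_llm.chat interactively or drive it from a script.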
Thanks.