Why can't I switch to the Llama model in Chat with RTX?

It only displays Mistral. How can I switch to Llama?

You need 16 GB or more of VRAM to use the Llama model.

Hello, I have 64 GB, but I don't see any LLM options. Should I do an additional install?

No. You don't have 64 GB of VRAM; you have 64 GB of system RAM on your mainboard. The VRAM on your GeForce RTX card is less than 16 GB, which is why the installer skips building the Llama 2 model.
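To check this yourself, you can look at the "memory.total" value that `nvidia-smi --query-gpu=memory.total --format=csv,noheader` prints for your GPU; that figure is VRAM, not system RAM. A minimal sketch of interpreting that output (the `vram_gib` helper and the sample values are hypothetical, not part of the Chat with RTX installer):

```python
def vram_gib(nvidia_smi_line: str) -> float:
    """Convert an nvidia-smi memory report like '8192 MiB' to GiB."""
    mib = float(nvidia_smi_line.strip().split()[0])
    return mib / 1024

# Example: an 8 GB card is below the 16 GB threshold for the Llama model,
# regardless of how much system RAM the machine has.
print(vram_gib("8192 MiB"))   # 8.0
print(vram_gib("24576 MiB"))  # 24.0
```

If the number reported there is under 16 GiB, the installer will skip the Llama 2 build no matter how much mainboard RAM is installed.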

You’re absolutely right!