Chat with RTX setup issue

I’ve spoken with the Live Chat support and they suggested I post this here.
When trying to install Chat with RTX 0.2, there is no Llama option for me because my graphics card only has 12 GB of VRAM, so I changed the RAG file setting from the default 15 to 7 in order to install Llama 2 in Chat with RTX. I can see three options now, but Llama 2 fails to install.
My graphics card is an RTX 3060.
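In case it helps, this is roughly the edit I made (file name and setting name from memory, so they may not match exactly on other installs):

<!-- RAG\llama13b.nvi (assumed file in the installer's RAG folder) -->
<!-- original line: <string name="MinSupportedVRAMSize" value="15"/> -->
<string name="MinSupportedVRAMSize" value="7"/>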
Any help would be appreciated.
